¶ Intro / Opening
Can we be honest about something? This industry is a little obsessed with being data-driven. No, that's not the honest part, you already knew that. The honest part is that not all of us are actually all that great with data. And if you are the kind of person who's pretty handy with an analytics dashboard, you might not always be working with the right kinds of insights to show the whole picture.
My guest today is Mo Hallaba, CEO of Datawisp, which, if you're not familiar, is an AI-powered data visualization tool. Given his position, Mo has a unique perspective on the disconnect between the amount of data most organizations are collecting and how effectively they're actually using it. Plus, I've got to say, I rarely speak to people who are so smart they're able to take highly complex matters and make them accessible to anyone.
But lucky for you and for me, that's what this talk is all about. Let's jump in. Welcome back to The Product Manager Podcast. Mo, thank you so much for making the time to join us today.
Thanks for having me. I'm excited to be here and talk about AI and data and all these other buzzwords as well, so.
We're looking forward to it as well.
¶ Meet Mo Hallaba
Can you tell us a little bit about your background and how you got to where you are today at Datawisp?
Absolutely. So it's long, but bear with me. I started my career in finance, in equity research, just telling people what stocks to buy and stuff. That involves a lot of research and manual data work that is an absolute pain in the butt. I ended up moving on from that and doing corporate M&A for a while. And that was more of the same, but from a different perspective. Instead of telling people what to buy, we were doing acquisitions.
And so for these acquisitions, we were doing a lot of our own research as well. And again, digging through mountains of data. Eventually I got tired of that and decided, hey, I want to go out and do something on my own. I want to start a startup. I've always been really passionate about video games, so I thought I would do something in that field.
And so I started this gaming startup that failed miserably, but in the process I met some people who were working on something really cool: an esports team. They were buying players from unknown regions and stuff, but they were using software to train them with data and analytics. I found that really interesting. It was like Moneyball for video games.
And from there it kind of grew into its own thing, because some of the other teams wanted it, and some of the tournament organizers wanted to put it on screen and show stats. So that's how I went from working in finance to working at a data analytics company. Eventually we decided to grow outside of gaming, so now we're just full-blown analytics.
Awesome. We'll dive deeply into that in a moment. Actually, that's the perfect segue. Today we're going to be talking about how to really use data to inform your decision-making.
¶ Understanding Data-Driven Decision Making
And to start us off, I'm curious: what do you think a lot of orgs get wrong about so-called "data-driven" decision-making? I know that term is a bit of a hot-button one.
Yeah, I think there are a lot of folks out there who expect data to make decisions for them; that's data-driven decision-making. And sure, some fields are highly technical, and you have data scientists and decisions that are genuinely driven by data, but that's not what we're here for. That's not what Datawisp is for.
We're much more about data-informed decision-making: taking decisions that you were going to make anyway and simply giving you access to some data to look at, so that when you make that decision, you're more informed. An example of that might be: hey, where should we put this button in our user interface? Should we put it up here? Should we put it down there? Should we move it at all?
If you're seeing that people are coming into your website and then getting lost on your main page, you might want to move the button, right? But if you're finding that 90% of people make it onto your page and click the button, you don't need to worry about it. It's little things like that, where instead of guessing, you are now educated by some data. So, data-informed.
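To make that concrete, here's a minimal sketch in Python with pandas of the check Mo describes: how many people who reach the page actually click the button. The event names and columns are illustrative assumptions, not Datawisp's actual schema.

```python
import pandas as pd

# Hypothetical event log: one row per user action.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 4, 4, 5],
    "event":   ["page_view", "cta_click", "page_view", "cta_click",
                "page_view", "page_view", "cta_click", "page_view"],
})

viewers  = set(events.loc[events["event"] == "page_view", "user_id"])
clickers = set(events.loc[events["event"] == "cta_click", "user_id"])

# Share of people who reached the page and then clicked the button.
ctr = len(viewers & clickers) / len(viewers)
print(f"Button click-through rate: {ctr:.0%}")  # 60% with this toy data
```

If that rate is high, the button is probably fine where it is; if it's low, you have a data-informed reason to try moving it.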
Yeah, I also prefer data, sorry, data-informed. My Canadian is showing. Can you tell us a little bit about the ways that data visualization can help us to democratize the understanding and usage of data within organizations?
¶ The Power of Data Visualization
Absolutely. I think not everyone is a numbers person, especially when you get to non-technical people. When you look at more creative fields, or product managers, or whatever, some people just understand data better when they see it visualized in picture form in front of them. It's one thing to say this pattern is changing over time; it's another thing to show someone a graph where they can see how the KPI is trending over time.
And if you have a main branch and a new branch that are moving together, and then at some point the data forks in a direction, it's so easy to just look at the graph and go, oh, what happened in July? As opposed to looking at a table or quarterly statements or something like that. Visualizing it, especially for non-technical people, makes it so much easier to know what questions to ask next, in my opinion.
Yeah, as a self-professed non-numbers person, I very much endorse visualization of data. Speaking of which, what are some of the ways that orgs can safeguard against incorrect interpretation of data, so that decision makers have a clear understanding of the data points?
¶ Safeguarding Against Data Misinterpretation
So a lot of people look at AI and go, AI is going to replace people. We're not about that. We're all about removing some of the busy work that bottlenecks data scientists. A great way to safeguard against misinterpretation is to have a data scientist on your team, or to hire someone who actually knows what they're doing.
The difference is that you can get away with one or two data scientists instead of ten if you have good software. Basically, what you want to eliminate is the need for technical skill to be able to analyze data. I shouldn't have to write SQL to be able to get a chart, but that doesn't mean I should just make decisions off of that chart without understanding what the data means.
When you remove that technical barrier, you allow people to spend more time with data, and they can become more educated about the right ways to use it. They have more time to spend talking to a data scientist to understand what the data shows and doesn't show, just as an example. There might be variables that are correlated with each other without any causal link. But basically, keeping a person in the loop is the best way to avoid that.
So whether that's a data scientist or your best engineer. I still think that if your choices are do nothing or do something, you should still go and do something.
So what's your playbook for slicing data into contextual chunks? Can you share an anecdote of what that looks like in practice?
¶ Playbook for Slicing Data into Contextual Chunks
I can. The best way to think about it, in my opinion,
is you want to look at what your business goals are. It's not just about working with data in a vacuum. The reason we do all this stuff with data is that we're trying to understand something about our business that we didn't know before we looked at the data, right? So it usually depends on the business question. Say the question is: is our tutorial useful?
If we're trying to answer how good our tutorial is, one way to slice the data might be people who have completed our tutorial versus people who haven't. Then you look at churn rates, dividing the whole data set into people who've done the tutorial and people who haven't, and seeing how they differ. And maybe there's no difference.
Right, and then we say, okay, either the tutorial is not effective, or the app is really easy to use and they don't need the tutorial, right? But if you see that people who complete the tutorial, on average, use the product way more, are way more effective with it, and are retained for longer, then you can go, oh, maybe there's something to this tutorial thing. So it depends on what you're trying to solve for in the first place.
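For readers who want to see that slice in practice, here's a minimal pandas sketch comparing the two cohorts Mo describes; the column names and numbers are invented for illustration.

```python
import pandas as pd

# Hypothetical per-user table: did they finish the tutorial, did they churn,
# and how many sessions they logged. All values are made up.
users = pd.DataFrame({
    "completed_tutorial": [True, True, True, False, False, False, True, False],
    "churned":            [False, False, True, True, True, False, False, True],
    "sessions":           [14, 9, 3, 2, 1, 5, 11, 2],
})

# Slice the whole data set into the two cohorts and compare them.
summary = users.groupby("completed_tutorial").agg(
    churn_rate=("churned", "mean"),
    avg_sessions=("sessions", "mean"),
)
print(summary)
# If the two rows look the same, the tutorial may not matter.
# If completers churn less and use the product more, there's something to it.
```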
That's a really clear example; I really appreciate that. You've spoken before about the difference between actionable and non-actionable data. Can you talk a little bit about what you mean by each of those things? What's actionable data? What's non-actionable data? And how do we identify and leverage both of those types effectively?
¶ Actionable vs. Non-Actionable Data
Yeah, so it's more about how you are able to act based on the insights you get. It's not that the data itself is different in any way; it's more a question of what your options are. I'll give you an example. Take Datawisp: we have a funnel. We do marketing, we try to attract people, they come in, they onboard, they use the app, they like it or they don't, and then they churn or they don't.
The higher you go up the funnel, the more there is that you can do. If no one is booking a demo, you go, maybe the button's not big enough, or maybe this landing page isn't compelling, or maybe we need a new graphic. But if somebody is logging in, booking a demo, talking to us, trying the product, asking a bunch of questions, and then churning, what happened? Maybe they didn't like it, or maybe it didn't solve their need.
And that's way less actionable than the stuff towards the top. So it's: what can you do based on this information? Certain things are clearer than others. If it's a funnel where you're just going from step one to step two, and you see a 50% drop-off at step three, for example, you can make a list of the things that affect the drop-off from here to here, and then you can get after them.
Whereas when you get all the way to the end and they're just using your product and all that, it becomes a lot harder to go, okay, they churned because of X. At that point, who knows? The way to figure that out is to do more customer interviews and watch people using your product, and there are other tools for that.
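Here's a minimal sketch of the step-to-step funnel math Mo is describing, with made-up counts; a big drop between adjacent steps is the actionable signal.

```python
# Hypothetical counts of users reaching each funnel step, top to bottom.
funnel = [
    ("landing_page",  10_000),
    ("signup",         4_200),
    ("book_demo",      2_100),
    ("first_session",  1_000),
]

# Print step-to-step conversion so big drop-offs stand out.
for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:]):
    print(f"{prev_name} -> {name}: {n / prev_n:.0%} converted")
```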
What are some of the tools that product teams can use to do a better job of collecting and parsing some of their user data to inform the decisions they make?
¶ Tools for Collecting and Parsing User Data
Yeah, so it depends. There's a lot of different software you can use, and it really depends on what kind of product you're building. If you're building a game, it's very different than if you're building a website or an ecommerce platform or whatever. But you need something that collects events, that tracks when things happen and stores them in a place that makes analytics easy. Ideally, if you're building a game, there are Unity plugins that can take events that happen in game, record them, and store them in Snowflake, right? If you can get something like that, it makes it way easier for you to start actually analyzing that data.
That's the basic requirement. For ecommerce and the like, there's a ton of vertical-specific solutions that do stuff like that. So I would say it just depends on what you're building.
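As a rough sketch of the basic requirement Mo mentions, here's what a minimal event tracker might look like in Python; the event shape and field names are assumptions, and a real setup would write to a warehouse such as Snowflake rather than print.

```python
import json
import time
import uuid

def track(event_name: str, user_id: str, **properties) -> dict:
    """Build one analytics event. In production this would be sent to
    a warehouse or analytics pipeline instead of printed."""
    event = {
        "event_id": str(uuid.uuid4()),
        "event": event_name,
        "user_id": user_id,
        "timestamp": time.time(),
        "properties": properties,
    }
    print(json.dumps(event))  # stand-in for a network/warehouse write
    return event

# Example: record a level completion the way a game client might.
track("level_complete", user_id="u_42", level=3, duration_sec=81.5)
```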
As far as AI and its role in assisting with the parsing of data: you mentioned that having a data scientist on staff, or several, is your best option. Is there a place for AI in terms of helping make some of the data make more sense to folks, especially non-technical folks?
Absolutely. I mean, that's our whole business at Datawisp, right?
¶ Leveraging AI for Data Analysis
What the AI can help with is the technical skill that's required to parse data, to transform and shape it how you want to get the answers you're after. The AI is really good at that. This is something that only data scientists could do before, but now anyone like me, who's completely non-technical, can use one of these AI tools, and the AI just takes care of the code part of it.
And you can just ask questions in plain English, right? So anybody with a bit of curiosity and a basic knowledge of the business or the product can start digging into data. And I think that's cool because it creates this flywheel effect: the more people care about data, the more the company will invest resources into having nice, clean data and making it available to people, and then everyone benefits more from having that.
So let's talk a little bit about leveraging data for iteration. Once you've applied learnings from your data to a specific feature, for example, how do you create a feedback loop to inform future iterations of that feature?
¶ Iterating with Data: Real-World Examples
Absolutely. I'll give you an example of something we did at Datawisp, actually. With Wispy, our AI, we care very much about the number of people who are using it and, more importantly, how long they're willing to wait for an answer.
We had the option early on of having a really fast Wispy that would return okay answers, and a much slower Wispy that would return better answers (not always way better, but sometimes way better). And we quickly found out, by looking at the data, how long people are willing to wait, because you see that after a certain point people just drop off. They're not willing to wait more than 30 seconds to a minute, right?
Our smart version of Wispy would sometimes take two minutes, and it would return stuff that was really cool, but nobody would sit around and wait for that. So we decided to focus on the fast one, and we tried to make Datawisp as fast as possible. And you can see the metrics go up. At first it was the slow one, then a 50-50 split where we gave half of users one and half the other, and then we just went to the fast one.
And you see the numbers go up as we go through this iteration. This is the sort of thing where Moritz, our CTO, and I were on a call with each other designing these little messages that come up while Wispy is working on the answer. If it takes 30 seconds, you can't just show a spinner for 30 seconds, right? So it had these messages: hey, I'm still working on it, or, data science is hard, or whatever.
And we were trying to figure out how long the timing should be between these messages. And then we were like, wait, we could just look at the data. So we looked at how long people were willing to wait, spaced the messages out according to that, and rolled the feature out. That's a concrete example of how you iterate on this stuff using data.
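Here's a minimal sketch of how you might space those progress messages from observed wait times, as Mo describes; the numbers are invented, and the percentile choice is our assumption, not Datawisp's actual method.

```python
import statistics

# Hypothetical observed wait times (seconds) before users got an answer or gave up.
waits = [4, 7, 9, 12, 15, 18, 22, 26, 31, 44, 58, 70]

# Derive message timings from the distribution itself rather than guessing:
# quartiles give natural points to reassure users before they drop off.
q1, median, q3 = statistics.quantiles(waits, n=4)

print(f"Show progress messages at ~{q1:.0f}s, ~{median:.0f}s, and ~{q3:.0f}s")
```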
So that's really cool, and it makes me wonder: we often talk about training AI models and what that looks like, but is there an element of training users too? Because when you say there's an element of impatience when it comes to users waiting for a higher quality answer, that seems like a trainable behavior. How do you approach that decision?
You're holding it wrong. That's not how you do it. You don't remember that? Was it Steve Jobs? The famous thing: a new iPhone came out and it wasn't getting good reception, because they put the antenna where your hand goes, and they just came out and said, hey, you're holding it wrong, you should hold your phone like this. That's an example of not the right way to do it.
That's the absolute worst way to do it. In general, it's about expectation setting. It's a mix of education and expectation setting; they go hand in hand. But at the end of the day, the user does have an idea of what they want, so it ends up being a compromise. We understand that what Datawisp is doing is for non-technical people, and the existing behavior is: I'm going to email Joe, and Joe is going to get me the answers, right?
So these are already people who don't have to put in an awful lot of work to get something, right? Now, it sucks for Joe, but that's the reality for this person. So we can't ask them to do a whole lot of work. We can ask them to do a little bit of work, but not more than that. At the same time, it is some marketing and some expectation setting. What do you want? A good answer? When do you want that answer? How long does Joe take to get back to you? A week?
Is it worth waiting a minute to get an answer from Datawisp? 30 seconds would be nice, but a minute is still shorter than a week. So you drop little hints, and you need a good copywriter for this, on your website, setting expectations right from the beginning: hey, get your answer in minutes, not weeks. And you don't put seconds, because if you put seconds, they expect it instantly, right? If you say get an answer in seconds, that causes an expectation mismatch.
It's funny, I say all that and our website probably still says seconds. But in sales calls, you just have to be really upfront with the customer and say, look, how long does this take you now? And they tell you, and you go, okay, we can do it in a fraction of that time. And so you set expectations a bit.
Yeah, it's interesting. I feel like there's an existing expectation from folks who are so used to search engines that answers should come instantaneously, when the technology is very different.
Also, ChatGPT has spoiled people, because people just think AI is magic. They don't understand how it works at all, and they've seen how fast it can produce a lot of text. So if I say, hey, ChatGPT, write me a travel itinerary for a five-day trip to Milan, it'll just go, boom, done. So regular people assume this is magic and that it can do anything quickly. But there's a difference.
I mean, when you're connecting to a database, you've got a billion rows of data in there. The AI isn't even the part that takes a while; you have to run the query, and that query may take some time to process. And there's absolutely nothing we can do about that. But because someone saw ChatGPT be really fast, they go, this should take two seconds.
So one of the things that Wispy does now that it didn't used to do is, when it's spinning, it actually tells you what state it's in. It goes, I'm thinking, and then when it's done thinking and it's just waiting for your database, it says, waiting for your database. It communicates what's happening at all times to set expectations throughout the process.
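As a loose illustration of that status pattern, here's a minimal Python sketch that announces each phase while work happens; the phase labels mirror Mo's description, but the structure is our assumption, not Wispy's actual implementation.

```python
import time

def run_with_status(steps):
    """Run labeled steps, announcing each state so the user always
    knows whether the AI is thinking or the database is working."""
    for label, fn in steps:
        print(f"[status] {label}...")
        fn()
    print("[status] Done.")

# Stand-ins for the two slow phases described above.
run_with_status([
    ("Thinking", lambda: time.sleep(0.2)),                   # model works
    ("Waiting for your database", lambda: time.sleep(0.3)),  # query runs
])
```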
I can see that being really helpful. I think users are also trained to assume, when something is spinning for what they perceive to be too long, that it's broken or that they need to resend the request. Very interesting.
So we looked at a lot of data on this. First of all: track everything. As soon as you have a live product, start tracking things. We tracked everything; for every button you can press in Datawisp, we track every time anyone presses it. So we have a really good idea of how all this stuff works.
And we also did user tests, and yeah, we noticed that if something was spinning for a while, people would just hit F5 and refresh. So the product really has to communicate what it's doing if it's making people wait for more than five seconds.
That makes a lot of sense. And yeah, I think that track everything could be the tagline of this episode.
Yeah, track everything that your users do in your product because it will be useful at some point.
So I know you already shared an example of data informing a decision, and I think that was a really good, concrete one. I'm curious what the most triumphant example of data informing a decision is, whether it's a customer or your own team.
¶ Triumphant Data-Driven Success Stories
This isn't at Datawisp, but, I told you I worked in esports before, and the most triumphant is that we literally won a tournament based on this stuff, based on data. The way this software worked is actually really cool, and I can give you a Datawisp example as well if you want. It allowed you, like you said, to filter data based on different conditions. Most tools would give you heat maps: here's the whole map, and here's where the players on the other team usually hang out, right? But if you look at a heat map over an hour-long game, it just looks green everywhere, and that's not helpful. If I'm in a game and the whole map is green, that doesn't tell me where they are. What was really cool is when you started breaking that down by situation.
So on the first round of the game, where are they? When they have this much money, where are they? When they don't have money, where are they? When they have these guns, where do they stand, versus when they have different guns? You can start to create these scenarios where all of a sudden the heat map is five red dots, and now you know exactly where your opponent is.
And so you can use that stuff in competition. They were using it and winning with the lowest salary budget of any team in the competition. It was incredible.
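To show the idea, here's a minimal pandas sketch of conditioning positional data on the game situation; the columns and values are invented for illustration, not taken from the actual esports tool.

```python
import pandas as pd

# Hypothetical positional samples from past matches: where opponents stood,
# tagged with the game situation at the time.
positions = pd.DataFrame({
    "round": [1, 1, 7, 7, 7, 12, 12],
    "money": ["low", "low", "high", "high", "high", "low", "low"],
    "x":     [10, 12, 40, 42, 41, 11, 13],
    "y":     [80, 78, 55, 54, 56, 79, 77],
})

# An hour of data in one heat map is "green everywhere"; conditioning on the
# situation collapses it to a few hot spots.
scenario = positions[(positions["round"] == 7) & (positions["money"] == "high")]
print(scenario[["x", "y"]].mean())  # the cluster where they hold when rich
```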
That's awesome. I'm sure that there's some esports folks salivating over that right now.
The best Datawisp example I'll give you is that we helped a game basically triple their user base in a very short period of time. The way they did that was to segment users based on which acquisition source they came from, and then look for specific in-game behaviors that they valued more than others.
And so they were able to find that when they acquired users through a particular channel, those users behaved in game the way they liked, so they could focus their efforts on the acquisition channels they preferred. The company's called Honeyland; we did a case study with them, and it's on our website. So that was really cool.
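As a rough illustration of that segmentation, here's a minimal pandas sketch; the channel names and the valued-behavior metric are assumptions for the example, not Honeyland's actual data.

```python
import pandas as pd

# Hypothetical per-user table: where the user came from, and a behavior
# the team values (e.g. days active in the first week).
users = pd.DataFrame({
    "acquisition_channel": ["ads", "ads", "influencer", "influencer",
                            "organic", "organic", "ads", "influencer"],
    "days_active_week1":   [1, 2, 6, 5, 4, 3, 1, 7],
})

# Which channels bring in users who behave the way we like?
by_channel = users.groupby("acquisition_channel")["days_active_week1"].mean()
print(by_channel.sort_values(ascending=False))
# Focus acquisition spend on the channels at the top of this list.
```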
Yeah, that's really interesting. I'd like to check that out. Mo, thank you so much for joining us today. Where can people follow you online?
So, if you're on X, it's ElectronicMo. Otherwise, you can follow me on LinkedIn. It's just my name. Datawisp is the same, so you can follow Datawisp on X and on LinkedIn as well.
Awesome, thank you so much for joining us, and thanks for the crash course; it was very accessible for non-data folks like myself.
I try to make it simple. Thanks so much for having me, Hannah.
Thanks for listening in. For more great insights, how-to guides and tool reviews, subscribe to our newsletter at theproductmanager.com/subscribe. You can hear more conversations like this by subscribing to The Product Manager wherever you get your podcasts.