Your Digital Footprint: How Big Data Shapes Who We Are

Nov 13, 2024 · 29 min · Season 4 · Ep. 253

Episode description

✨ There are more pieces of digital data than there are stars in the universe.

Ever wonder what your digital footprint reveals about you? In today’s world, every online action builds a profile of who you are—and influences your choices more than you may realize.

👂Listen in and remember to like, comment, subscribe, & share 🧡

📌 Key Topics:

🔍 Understanding the Digital Village: how our digital lives mimic small-town dynamics, where everything we do is “seen”—but by algorithms instead of neighbors.

💡 The Influence of Big Data on Behavior: Discover how algorithms use your online behavior to shape everything from ads you see to your decision-making.

❤️ Data's Potential for Mental Health: Learn how psychological targeting could improve well-being by identifying those in need of support and tailoring mental health resources.

In this episode, we sit down with Dr. Sandra Matz, Associate Professor of Business at Columbia Business School, to explore her research on how our online activities reveal intimate aspects of our personality and behavior. Her upcoming book, MindMasters, dives into the science of predicting and influencing human behavior through data. In this episode, Dr. Matz shares insights on both the positive and nefarious aspects of big data, practical ways to manage your digital footprint, and the potential for AI to improve mental health.

🌟 LET'S STAY CONNECTED…

#DigitalFootprint #BigData #MentalHealthMatters #AnxietyAtWork #DataPrivacy #AIandEthics #WorkplaceCulture

Support the show

For a weekly dose of gratitude from Chester Elton, text GRATITUDE to 908-460-2820.

Until next week, we hope you find peace & calm in a world that often is a sea of anxiety.

If you love this podcast, please share it and leave a 5-star rating! If you feel inspired, we invite you to come on over to The Culture Works where we share resources and tools for you to build a high-performing culture where you work.

Your hosts, Adrian Gostick and Chester Elton, have spent over two decades helping clients around the world engage their employees on strategy, vision, and values. They provide real solutions for leaders looking to manage change, drive innovation, and build high-performance cultures and teams.

They are authors of award-winning Wall Street Journal & New York Times bestsellers All In, The Carrot Principle, Leading with Gratitude, & Anxiety at Work. Their books have been translated into 30 languages and have sold more than 1.5 million copies.

Visit The Culture Works for a free Chapter 1 download of Anxiety at Work.
Learn more about their Executive Coaching at The Cultur...

Transcript

There are more pieces of digital data than there are stars in the universe. Some people believe that manipulation of big data holds enormous potential to improve our lives while others believe it could be one of the greatest threats to humanity. Hello I'm Chester Elton and with me is my dear friend and co-author, Adrian Gostick. Yeah, thanks Chester for that very ominous start. Great, yeah, this is gonna be fun. Our guest today is gonna help us understand what our digital footprint reveals about us and then how to manage psychological targeting and maybe redesign the data game. As always, we hope the time you spend with us will help reduce the stigma of anxiety at work and in your personal life. Yeah, and with us is our new friend, Dr. Sandra Matz, Associate Professor of Business at Columbia Business School in New York City. As a computational social scientist, she studies human behavior and preferences using a combination of big data analytics and traditional experimental methods. Her research uncovers the hidden relationships between our digital lives and our psychology with the goal of helping businesses and individuals make better decisions. Sandra's new book, launching in January, is Mind Masters, the data-driven science of predicting and changing human behavior. Sandra, we are delighted to have you on the podcast. Thanks for finding the time. Well, thanks so much for having me. Well, this is gonna be really interesting for us. I don't think we've done a big dive on big data before like this, but what I find is interesting is you start your book not with big data, but by talking about growing up in a small village where everybody knew your business, then to getting into your research on this macro computational psychology. So first off, big picture, what do you want us to all know about how data is impacting our lives? Yeah, well, thank you so much. Yeah, it's funny, because I grew up in this tiny place, tiny village in the southwest corner of Germany. It's like 500 people. I think my parents keep reminding me, but by now it's 1,000, which I can attest doesn't really make much of a difference. And so my experience growing up in this village was really the fact that everything I did was visible. So you have 499 other people that observe everything that you do. It doesn't matter whether it's who you're dating, what you're doing on the weekend, whether you get good grades in school. So they kind of are pretty deep in your life. And what that means is that they can essentially take all of these observations and learn something about who you are. They see you running to the bus in the morning, and they probably get a sense that you're not the most organized and maybe reliable person. They know that you can't say no really, and they infer that you might be one of these nice, trustworthy guys. So it is like all of these inferences that your neighbors make. And the way that I described this in the book is essentially we all now live in a digital village, right? You don't necessarily have to have your neighbors poke around in your business, but you leave all of these data traces that the same way that my neighbors put together this puzzle of who I am. Now, algorithms can do the same thing with your data. So they can read the messages that you send, they can look at what you buy, they can look at the GPS records that you create. So getting a really detailed sense of where do you go? What do you do? Who do you meet? And that is really kind of a very intimate insight into who we are. 
And then back to the village, my neighbors weren't just in the business of poking around in my private life, right? They were also trying to change my behavior, change, interfere with the decisions that I made, as good neighbors do. And sometimes that was amazing. I think like looking back, it was the time that I felt most understood, that I felt the most supported, because they knew exactly what I wanted and they were there to offer advice and offer support. But also, as you can imagine, it felt quite annoying at times, because they were, again, poking around in my business, trying to get me to do stuff that I probably didn't want to do. Fast forward, the same thing is true for algorithms. So once I know that you might be more impulsive or you might be more neurotic or you might be more agreeable and nice, I can actually use these insights to influence the choices that you make, and not just for an individual but for really thousands and millions of people at the same time. Yeah, it's interesting you talk about that physical footprint and then your digital footprint. The way Adrian keeps me out of his life is I live 3,000 miles away from where Adrian lives at separation. That's why I'm out of your business. That's it. Yeah, yeah. What are you doing today? It's a good reference, because it used to be what happens in a village stays in the village, right? So the worst case is it kind of spreads to the next five villages. But if you did something and you wanted to get away from this, you just move to the next big city and it's forgotten. That's not true anymore. So whatever you leave in a digital footprint, that's going to be out there for anyone to see. Yeah. And it's interesting, right? The more platforms you are on, your credit card data is now part of your digital footprint, your bank accounts, and all of that, right? So talk about that digital footprint and what it reveals about us now. You talked about it being used to influence our decisions, maybe without us realizing it. That's the nefarious part, right? Can you expand on that a little bit? Very much so. So let me just take it, because I think what's really interesting is that oftentimes when people talk about like machine learning and AI and predictions, it feels very much like a black box. What I love about this research is that you can open it, and you can look at these relationships the same way that you could figure out what is it that your neighbors are using in terms of cues to make inferences. So sometimes the relationships that we find are really intuitive and obvious. If you look at what do extroverts talk about on social media, they talk about parties, weekends, doing fun stuff with friends, and then you look at introverts and they talk about reading books, browsing the internet, reading... So you can kind of literally see the people kind of standing out in the data. Sometimes it's surprising. So sometimes we find stuff that we didn't really kind of see coming. Extroverts, for example, they use all caps, all the time. So they're really kind of shouting the stuff that they're talking about. And some of the surprising relationships that I found really intriguing was mental health and people suffering from depression. They use more references to the self. So they talk more about I, me. And I remember hearing about this relationship and I was like, that sounds more like a narcissist, right? People very much focused on the self. 
But you can imagine that if you are having a hard time, it's very difficult to think about the problems of the world. What you're focused on is like, what's happening in my life? How do I get better? And so those are the surprising ones. Then we have ones that are funny, right? So you look at people who are a bit more critical and competitive, like the disagreeable personality traits. And if you look at what is predictive about them talking on social media, they just use a lot of swear words. And so they're not happy to hold back. And also, one of the interesting ones was the Facebook pages that they follow. One of the number ones was Prada. So confirmation of The Devil Wears Prada. But then again, you have these funny ones where you show word clouds and people are like, isn't that a fun way of poking around in people's private lives? Some of them are very uncomfortable. So we studied, for example, relationships between what people talk about on social media and income. And what you see is that low-income people, not only do they not talk about vacations and luxury products the way high-income people do, as you can imagine, but they're also, again, much more focused on the self and the present. So I think that's a pretty condemning picture that we're painting about society, because it's just really damn hard to be constantly struggling with your finances. So you see this being borne out in the data. And what makes algorithms, in a way, so powerful is that they can take all of these data points, like these individual data points about you, and they put them together. I think of each of these relationships as like a puzzle piece. And an algorithm is amazing at taking all of the puzzle pieces together and putting up this picture of who you are. Now, for me, in a way, this is the intriguing part, right, as a psychologist. It's amazing what we learn about human behavior. The part that I think is both an opportunity but also a pretty big risk is the second step, and that's what you were referring to just earlier: once I know, again, that you're more extroverted, more introverted, more impulsive, more neurotic, I can actually use it to influence your behavior. So we started investigating this ability of algorithms on relatively neutral ground, I would say, in the context of just selling products, right? So we teamed up initially with a beauty retailer, and the goal was just to get people, in this case women, to click on an ad and then go to the website and buy something. And the idea was that we could frame the same beauty product in extroverted terms, so it was always like people dancing and there's a lot happening, and the copy would say something like, dance like no one's watching, but they totally are, so playing with the need of extroverts to be the center of attention, right? And then the introverted ones would be something like, beauty doesn't have to shout, so it would be one person in front of the mirror, much more quiet. And what we showed there, again in this relatively neutral setting, is that we could increase, in this case, purchases by about 50%. And again, it's exactly the same product. We're just talking to people's needs, right? We're kind of tapping into their psychology and we're trying to figure out what might make this beauty product relevant to people.
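(To make the linguistic cues Sandra describes a bit more concrete, here is a minimal, hypothetical sketch, not her actual research pipeline: it turns a single post into a few interpretable features, first-person pronoun rate, all-caps words, swear words, the kind of signals that could then feed any standard classifier. The word lists are illustrative assumptions only.)

```python
import re
from collections import Counter

# Illustrative word lists only; real studies use validated lexicons.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
SWEAR_WORDS = {"damn", "hell", "crap"}

def text_features(post: str) -> dict:
    """Turn one social-media post into a few interpretable linguistic cues."""
    tokens = re.findall(r"[A-Za-z']+", post)
    n = max(len(tokens), 1)
    counts = Counter(t.lower() for t in tokens)
    return {
        # Depression cue mentioned in the episode: more "I"/"me" references.
        "first_person_rate": sum(counts[w] for w in FIRST_PERSON) / n,
        # Extroversion cue: words written in ALL CAPS ("shouting").
        "all_caps_rate": sum(1 for t in tokens if len(t) > 1 and t.isupper()) / n,
        # Disagreeableness cue: swearing.
        "swear_rate": sum(counts[w] for w in SWEAR_WORDS) / n,
    }

if __name__ == "__main__":
    print(text_features("DANCE like no one's watching!! I love my weekends with friends."))
```

An algorithm of the kind described above would simply combine many such puzzle pieces across many posts and many people.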
Now, do I mind if people sell me toothpaste that's targeted to my personality? Maybe not so much, but getting me to vote in a different way, probably harder. Yeah, it's funny because Adrian dances like everybody's watching. So true. So true. Yeah, many people have complimented my dancing. Yeah, it's interesting. I was in Arizona on Tuesday, two days ago, for those who aren't following this in real time, and it was still a day before the election, so there were election signs everywhere. My car driver, just out of the blue, he says, ah, these election signs, at least they're going down today. He says, I'm not influenced by any of this. And it was interesting. And I knew that we were going to be talking in a couple of days. And I thought, you probably are more than you know. Not by, you know, a placard out on the highway, but by all that's going on with big tech. I mean, there's just a lot that we don't realize is happening, like you say, with the algorithms behind the scenes. And for me, that's the critical part. I think we oftentimes talk about invasion of privacy. For me, the much bigger problem is the fact that we're losing control over our lives, right? So the choices that we make are no longer just our own. We're losing agency. We're losing self-determination. And I think that kind of gets people to listen up a little bit more than saying, well, you're losing your privacy. Well, that kind of brings us to the question then, okay, do I just go into a bunker? Giving away our data creates a lot of anxiety, and this is Anxiety at Work, the podcast. Is giving away my data even worth it anymore? Because almost every month, probably, I get some letter or some notice that, oh, your data has been breached by this company or that company, and it's out there. So is giving away our data worth it? What can we do to bring that anxiety down? Yeah, it's such an interesting question, because first of all, I think top of mind are usually data breaches. Right, and I understand, because that makes you feel incredibly helpless. That's like someone breaking into your home. There's no control that you have. For me, the bigger problem is that most of the time we're actually voluntarily signing away our data. That's more like inviting someone to your home and then just giving away the stuff for free, right? So we kind of all use products like social media that capture a lot of the data that we generate. But then also, more often than not, we are completely mindless about, for example, downloading apps on our phones. Most of the time, the apps ask to tap into your microphone, to access your gallery, to collect your GPS records continuously. Those are incredibly intimate data points. And for me, that's the even scarier part. Now, the question of whether it's still worth it, I think, is a good one, because we gain a lot from using these technologies. We get better products, better services. It's more convenient. Now, the question that I often put to people is like, OK, it's worth it, right? Sometimes that might be true, not always. But if you could have it all, let's say you could have better service, convenience and better products, but you could also preserve your privacy and have at least some control over the decisions that you make, and you have some understanding of what companies are doing with your data and what they're using it for, wouldn't you actually want to have both? And I think that's the question that we should be asking. It shouldn't be, is it worth it?
Am I willing to give up my privacy and the ability to make my own choices for better services? You should be demanding both. Well, you know, it comes back to this mental health issue, right? You say that AI and psychological targeting might actually be good, right? It's going to improve mental health, help you make better financial decisions, break us out of the divisive echo chamber. Walk us through those ideas, the good side, right? Yeah, and I think it's so important to mention those good sides, right? It's all too easy, because what we see in the media oftentimes, for good reason, is focused on some of the risks that we have when it comes to data, when it comes to algorithms and predictions. But there's a lot of potential. So I've been thinking about this really in these what-if questions. And so the first example, how do we help people make better financial decisions, was born out of my initial research trying to sell people products. Right? If I tap into your psychology, I can get you to buy something. The what-if question is, could we use the same mechanisms to get you to save more, which presumably is in your best interest if you look at how many people want to save more in the new year and then fail to do so. So we essentially show that if, again, I understand you're extroverted, introverted, agreeable, disagreeable, I can just make saving more appealing. So for extroverts, I could say, put some money to the side right now so you can then spend it on these really fun activities with your friends. For an introvert, I might say something like, well, if you put some money to the side right now, you might actually be able to make your home more comfortable. And again, what we show there is that the same way that we can get you to spend more, we can also get you to save more. The two topics that I'm personally really excited about are, first of all, mental health. You mentioned that, and I think there are incredible opportunities for data, both in terms of tracking mental health and treating mental health. The tracking part is, for example, some of the work that we've done is looking at, can we predict whether someone might be suffering from, say, depression, by looking at the data that they generate with their smartphone? So you can imagine, for example, that if I tap into your GPS records, I see that you're not leaving the house as much as you typically do. There's much less physical activity. If I look at the call logs, you might not be making as many calls. Now, that might be nothing, right? Maybe you're just on vacation, but it could be that you're entering this depressive episode, and now would be a good time to actually go and look for help. Because once you're in the valley, it's really difficult to get out. Ideally, we catch it as early as possible. You could even think about having a system where I nominate people, so that if my smartphone sees that I might be entering a depressive episode, maybe it's gonna call my spouse, maybe it's gonna call my parents to see if they can be there to support me. And then there's also a lot of opportunity for treatment. So the easiest thing would be to have an Amazon algorithm for mental health interventions, because we know that the same treatments don't work equally well for everybody. So the same way that Amazon can recommend products that are tailored to your preferences, or Netflix recommends movies, we could find the best interventions for each and every single individual.
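(The screening idea Sandra sketches, mobility and call activity dropping well below a person's own baseline, can be illustrated with a small hypothetical example. The field names and the 50% threshold below are assumptions for illustration, not her actual model and certainly not a clinical tool.)

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class DaySummary:
    km_travelled: float   # daily distance, e.g. derived from GPS records
    calls_made: int       # daily outgoing calls, from call logs

def check_in_needed(baseline_days, recent_days, drop_ratio=0.5):
    """Flag when recent mobility AND calling both fall well below the personal baseline."""
    base_km = mean(d.km_travelled for d in baseline_days)
    base_calls = mean(d.calls_made for d in baseline_days)
    recent_km = mean(d.km_travelled for d in recent_days)
    recent_calls = mean(d.calls_made for d in recent_days)
    return recent_km < drop_ratio * base_km and recent_calls < drop_ratio * base_calls

if __name__ == "__main__":
    baseline = [DaySummary(6.0, 4), DaySummary(8.5, 3), DaySummary(5.0, 5)]
    this_week = [DaySummary(0.5, 0), DaySummary(1.0, 1), DaySummary(0.2, 0)]
    # True here -> could prompt a gentle check-in or notify a nominated contact.
    print(check_in_needed(baseline, this_week))
```

The key design point is that the comparison is to each person's own history, since "a quiet week" means very different things for different people.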
And then the most interesting current development, I think, which also has challenges, is essentially using generative AI as a complement to existing therapists. If you look at the gap between how many people are looking for therapy and how many therapists there are, it's enormous. For every 100,000 people looking for therapy, there are exactly 13 professional therapists. And that's across the globe, so you can imagine that there's a pretty wide variance. On the Upper East Side in New York, you're probably going to find a therapist. If you're somewhere more remote, much, much harder. So there's a way in which at least the people that are currently underserved could benefit from having these chatbots that at least check in and give you some recommendations. So the mental health space, I think, is just a really interesting one. And then the last idea, which is one of my favorites, just because I haven't heard that many people talk about it, is this idea of echo chamber swaps. So typically, when we think of algorithms and psychological targeting, right, it's like, oh, we're going to be put into our echo chamber. It's like a filter bubble, because the algorithm is going to pick up on what we like and what we believe in already. And now it's going to recommend more of what we want to hear and see. And that's true, because it's usually profitable, right? It's very comfortable in our own echo chamber. And it's nice if we don't have to go to page two of Google searches. But what you could do with exactly the same algorithms is say, I want to hop into the echo chamber of someone else, right? So I don't know what the life of a 50-year-old farmer in Ohio looks like. No idea. It's really difficult for me to do this, right? I would have to go there, talk to a couple of people. Now, Google knows exactly what they see when they search for, say, immigration or anything else. They show that to that person, but they could also show it to me, right? So Google could have an explorer mode, Facebook could have an explorer mode, where they let me say, I want to hop into the echo chamber of someone else. And maybe I don't like it as much there, right? But at least it gives me a way of seeing what other people are seeing, and maybe broadening my perspective on the world, rather than making it narrower and narrower the way that we're doing it right now. Wow, that brings up so many questions. Yeah, yeah, this is so great, Sandra. How do people learn more about your work? Where would you send them to learn more about you? I was going to say, this is what happens when you invite academics. You give them the stage and they ramble on and on, and you finally found someone to listen to this stuff that they care so much about. And yeah, well, hopefully through your podcast, that's probably number one. And then in the book, so I really tried my best in Mindmasters to just make the topic relatable, to tell the human side of data, to explain machine learning, AI, psychological targeting in a way that people can relate to. Because it's funny, initially I didn't want to write a book. That was not how it started. I started writing these small stories for my parents and friends because I wanted to just tell them what I was doing and maybe get them to be a little bit more mindful about their data, make my research a little bit more engaging. I think that the book is a good one.
I don't use social media as much, partially because that's the topic of my research, but also just because it takes so much time. I don't know how people manage their... This is like a full-time job, so I just can't do it. But I do give a lot of talks, some of them on YouTube. I have a website. This work that you're doing is so important. By the way, we have great respect for academics like you, because we get to go in and lecture to classrooms now and then at the university level, and we know how hard it is to keep people for an hour a day, three times a week, for four months in a row. We only have to do that for an hour or two once in a company, and then we get to fly away. So great respect for what you do. But then you've also written 14 books and I've written this one, so it is like an up and down side. Not to correct you, but it's 15. It's 15 by now. And so we're getting, well, the time has flown by. This has been so fascinating. You know, I bumped into somebody at the grocery store a few weeks ago and they said, you know, with data, I don't even know what to believe anymore. And you probably hear things like that. There's so much out there. People get stuck in their own echo chambers and then there's so much floating out there. We don't know what's real and what's lies anymore. How do you respond to that? I guess my final question, a tough one for you. So it's a big one. Generally speaking, I think the one thing that I would love to change about the way that people think about data is just breaking down this false dichotomy that I think is propagated by Silicon Valley, which is like, well, either you can totally buy into all of this technology and benefit from the service and convenience and the products, but for you to be able to do this, you need to sign away all of this data. And that, I think we know, is just no longer true. So there's all of this new technology, for example, that allows you to locally process stuff on your phone. Apple, for example, is using it. So instead of sending all of your speech data to Apple to train Siri and get better at speech recognition, they can actually send the model to your phone, update it there, learn to recognize your voice, and then just send the intelligence back. So there are all of these ways in which we can now give you the service and convenience, while also to some extent protecting your data. So to me, it really comes down to control. It's making sure that people, first of all, understand what they're signing up for, and then also have control over the purposes that the data is being used for. Being supported by technology and regulation.
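(The on-device approach Sandra points to with the Apple/Siri example is often called federated learning. Below is a toy sketch of that idea in pure Python, not Apple's implementation: each phone takes gradient steps on its own data, only the updated parameters ever leave the device, and a server averages them into the shared model.)

```python
from typing import List, Tuple

def local_update(weights: List[float],
                 data: List[Tuple[List[float], float]],
                 lr: float = 0.01) -> List[float]:
    """One pass of gradient descent on a phone's private data (simple linear model)."""
    w = weights[:]
    for x, y in data:
        pred = sum(wi * xi for wi, xi in zip(w, x))
        err = pred - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

def federated_average(client_weights: List[List[float]]) -> List[float]:
    """The server only ever sees model parameters, never anyone's raw data."""
    n = len(client_weights)
    return [sum(ws[i] for ws in client_weights) / n
            for i in range(len(client_weights[0]))]

if __name__ == "__main__":
    global_w = [0.0, 0.0]
    # Each phone keeps its own (features, target) pairs locally.
    phones = [
        [([1.0, 2.0], 5.0), ([2.0, 1.0], 4.0)],
        [([1.0, 1.0], 3.0), ([3.0, 2.0], 8.0)],
    ]
    for _ in range(50):  # a few communication rounds
        updates = [local_update(global_w, data) for data in phones]
        global_w = federated_average(updates)
    print(global_w)  # parameters learned without pooling anyone's raw data
```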
Excellent. Two things as we wrap up. How do you take care of yourself? You already said you don't have a full-time social media job. So what are some of the things you do to manage your anxiety and digital footprint? And then, if there are one or two things you want people to take away from the podcast, what might those be? Yeah, it's a great question. I think the one thing that helps me the most is probably just going on walks. I have a dog, which is a great way of getting out first thing in the morning, taking a lunch break, and then at night, right before you go to sleep, there's a very nice break from work. So that I absolutely love. I used to take naps, used to love it. Now I have a baby, so the naps have just been absolutely killed. But if you have the chance to take naps during the day, for me that would just work wonders. It's this very quick break, some time for yourself, and then I was just always so much more energized after. What do I want people to take away? So for me, it's really this notion, again coming back to the village, that the data that we leave is so much more intimate. It's not just here's something that you bought, here's something that you've liked; it really offers this window into your psychology, right? And it's not just peeking into your psychology, it's also potentially writing into your minds, right? It's changing the choices that you make. So I think what I would love people to do is just demand more, like demand more of the companies that you buy from, demand more from the politicians, because we can't do it alone. But this idea that we just tell people what's happening with their data and then they manage it all by themselves, because they fully understand technology and have nothing better to do than reading all of the terms and conditions 24 hours a day, it's just not going to happen. So I think we just need to all demand more of them, the people using data and the people with the power to regulate it. Excellent, excellent. Listen, it's been a delight to have you on the podcast. You've gotten so much great information. New book coming out in January, Mindmasters, the data-driven science of predicting and changing human behavior. We're delighted to have Dr. Sandra Matz on the podcast today. Thank you so much. We hope you sell a million. We do. Thanks, Sandra. Thank you so much. So Chess, another fascinating podcast. You and I, I think we just do these because they're fun and we get to learn so much. And what an interesting way of thinking about data. You know, you come in with so much fear and so much trepidation around it, and what Dr. Sandra Matz is telling us is, look, there are some interesting ways to think about it too. But I love how she started this with, it's like living in a little village where everybody knows you. If you run to the bus, it's not that, oh, he's running late. No, he's disorganized. He's not; we're making inferences with the data that we get. I thought that was fascinating. Yeah. It was also interesting that she said, in a small village, even though they know everything about you, for the most part, it's positive. They're trying to help you. Exactly. Yeah. Be better organized. Get to the bus on time. I'll help you with your grades and all that kind of stuff. Now, everything is digital. I mean, you're sending it out to not just the 500 people in the village, although I thought it was up to 1,000. They've almost doubled their population. It's going out to millions of people. And how they can parse it so perfectly. They can know if you're an introvert or an extrovert, your income level, where you travel, where you don't travel, and all these things, again, to influence you, whether it's buying cosmetics or how you vote or how you see the world. Of course, we're post-election now. We saw that just in massive quantities, the texts that you got every day, the emails. You couldn't watch a football game. Every commercial was a political ad, you know. And it is extraordinary. The thing that really impressed me is when we talked about data breaches, she said, you know, you feel like people are stealing your data. They're not; you're giving it to them. You've invited them in.
You said, take my photos, take my microphone, take my, you know, my tracking. It really is extraordinary, the amount of information we freely give. Yeah. And not really realizing how it's being used. Younger people are a lot more aware of this than people in our generation. But still, as you were mentioning, it's everything demographically about us, but also, as she was saying, it's our proclivities. It's the way our mind works. Maybe we're neurotic. Maybe we're self-absorbed. Maybe we're angry elves. And what they're doing is they're playing on that. They're selling things specifically to me based on how I am. But also, on the good side, I thought this could be really helpful: look, you haven't been out of the house in a few days. You haven't called anybody in a few days. And it's a prompt to say, you know, take a shower, go for a walk. But it could also be used as a complement to therapy. As she was saying, we have nowhere near enough therapists to help all of us who may struggle. So I think that's really powerful. Yeah, always the good side and the scary side. We're very quick to look for the negative. Well, you know, I think somebody who always brightens my day and is the positive side is our extraordinary producer, Brent Klein. I don't know if he has that effect on you, Adrian, but it's like a ray of sunshine every day to me. Yeah, whenever I'm around Brent, yeah. And of course, Christy Lawrence too, you know, our booker who helps us find amazing guests. And to all of you who listen, we know you've got a lot of things on your plate, and you give us your time. We really appreciate that. If you like the podcast, please share it. You know, it's all about mental health and things you can do to manage your anxiety. I love that Dr. Matz said, I take my dog for a walk. You know, these little things that you can do to lower your anxiety. We'd love to have you visit thecultureworks.com, our website, for some free resources to help you and your team and your culture thrive. What else, Adrian? If you haven't picked up a copy of our best-selling book, Anxiety at Work, please do so. We also love speaking to audiences around the world, virtually or in person, on the topics of culture, teamwork, resilience, ideas like this. So give us a call, we'd love to talk to you about your event. So Chess, I'm gonna give you the last word because that's just how I roll today. Yeah, well listen, hope you have a great week. Hope that you're managing your mental health and that you help the people around you and your families. We wish you the best of mental health and a great week. Cheers.

Transcript source: Provided by creator in RSS feed.