The Busyness Paradox
Episode #6: Margin of Error - When Good Polling Goes Bad Transcript
[Music Intro]
Frank Butler 0:00
Hello busybodies, this episode is a little different from our others. It's focused on the presidential election polling and the issues that happened with it. We're doing this episode mainly because we think the process of asking the right questions, and trying to find the answers through a variety of approaches, is important. What we're trying to do in this episode is point out how you can ask a question and get what you think is the right answer, when you might not actually be getting the right answer for your needs. So that's what we're trying to bring up here: a variety of methods to make you think a little differently about how to collect the right data that you might need, or at least triangulate your information. We hope you enjoy this episode.
Hello, busybodies. Welcome to another episode of the Busyness Paradox. I'm Frank Butler, here with my co-host, Paul Harvey.
Paul Harvey 1:14
Good day.
Frank Butler 1:15
And on today's episode, we are going to discuss what the heck happened with the polls. Now, this sort of came up because Paul and I have both done survey research; we have written work that uses stats around surveys and such. So, you know, we have, I guess, an understanding of survey research. Polling is a little different, but at the same time, it's not; a lot of similar techniques go into it, statistical analyses and such. And this is the second presidential election in a row where the polls have been so dramatically off. The majority of the polls, not all, but the majority, and the ones considered to be these top-ranked polls, kept saying, oh yeah, it's going to be this huge blue wave, and we're going to see the Democrats picking up more seats everywhere, and these races are going to be decided by much bigger gaps, and blah, blah, blah. And we didn't see that happen.
Paul Harvey 2:17
Yeah, and it's really not... you're right that it's the second time in a row this has happened, but it's not the second time in history. It's become probably more common than not since the 2000 election really kicked off the “something's wrong with polling” craze.
Frank Butler 2:34
I think what's happened is that they keep getting further off. I mean, look at the divide between what Biden was projected to get and what we're currently dealing with right now. And by the time you guys are listening to this episode, the decision and outcomes will have been made, but we're still in the midst of waiting for the results in a few states. What's so interesting is that this was supposed to be a very clear-cut decision. They even showed polls that corrected for how badly they missed in the 2016 election, and even the 2012 election, and those still showed a blue wave and a comfortable lead in many states for Joe Biden, and even for Democratic senators and House candidates. And it turns out we didn't get that. And me being the stats geek that I am, trying to understand the why, I started reading a bunch of different things, you know, not really in any depth.
But the first thing I saw that really stuck out to me is that there's a 3% response rate. For the average listener who's not privy to how stats for surveys work, or to the research journals: if Paul and I did a survey, got a 3% response rate, and then tried to submit a peer-reviewed journal article based on it, and I'm not talking about a good journal that's, you know, super awesome in our field, just a basic mid-grade journal, we would get blasted. Primarily because, one, the response rate is too low, and it probably indicates that there's some level of bias, some reason people are not responding to the survey. There's something about it that might be either confrontational in some way or disagreeable; there are a lot of different elements to it. So that 3% that is responding is probably not a good, true representation of the population you need to capture to get a good assessment of trends. And what was even more frustrating for me, and in general I take most stats visuals I see on TV with a grain of salt, oh, the polls are saying this with a margin of error of three to five percent, blah, blah, blah. Most of it I take with a grain of salt because I don't know how they actually went about assessing, and who they asked; they don't do a very good job of explaining that. Pollsters are even worse, because they make money off of this, and so they are very, very tight-fisted about the types of questions they ask, the analyses they're using to come up with these results, and even how they're correcting for some of this biasing that goes on. And they're missing the boat by a country frickin' mile.
Paul Harvey 5:19
Yeah. And, you know, there are so many angles to this, I'm not even sure which ones to dive into first. Back to your point about the 3% response rate: my understanding, at least, is that it's kind of an evolving science in its own right, within the polling community. How do you accurately make such large-scale predictions from so few responses? Most of the poll modeling and analysis is really designed with that very thing in mind: we're going to take 1,000 people's responses and predict an entire state's or an entire country's voting based on that. How do we do it? So there's some pretty fancy mathematics going on behind the scenes to make that happen.
The problem is that every so often, especially in a large-scale election like this one, they get it right. And apologies to those listening outside of America, you're probably already tired of hearing about our election. Although I feel like they've been following it closer than some Americans. But oftentimes, when they get it right, you'll see a flood of reporting about, you know, the marvel that is these innovations in polling strategies and analysis, until they get it wrong again. Every time they think they've figured out how to really, accurately do this, they find out they're wrong. So I think it goes without saying that the pollsters themselves, and everyone involved, are aware of the challenges they're up against. But the interesting thing is that they choose to take on those challenges, and often seem to convince themselves that they've figured it out. And then we see this and say, nope, back to the drawing board again. In my mind, that raises a larger question: do we really need all this much polling? If we can't do it right, why do we do it at all? What do we really gain from it?
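The "fancy mathematics" Paul alludes to is largely survey weighting. One common ingredient is post-stratification: re-weight respondents so the sample's demographic mix matches the known population mix. Here's a minimal sketch in Python; the group shares and support rates below are invented purely for illustration, not real polling data:

```python
# Post-stratification sketch: reweight respondents so the sample's
# demographic mix matches the population's (all numbers are made up).

# Hypothetical share of each age group in the raw sample vs. the population
sample_share = {"18-29": 0.10, "30-64": 0.55, "65+": 0.35}
population_share = {"18-29": 0.20, "30-64": 0.55, "65+": 0.25}

# A respondent's weight = population share / sample share for their group,
# so under-represented groups count for more
weights = {g: population_share[g] / sample_share[g] for g in sample_share}

# Hypothetical candidate support within each group
support = {"18-29": 0.60, "30-64": 0.50, "65+": 0.45}

# Unweighted vs. weighted estimate of overall support
raw = sum(support[g] * sample_share[g] for g in support)
weighted = sum(support[g] * sample_share[g] * weights[g] for g in support)

print(f"raw estimate: {raw:.1%}, weighted estimate: {weighted:.1%}")
```

The skew here is deliberate: the made-up raw sample over-represents older respondents, and weighting shifts the estimate by about a point and a half. Real polls weight on many variables at once (age, education, region, past vote), and the deeper problem Frank raises, that the 3% who answer may differ from the 97% who don't in ways no demographic weight captures, is exactly what weighting cannot fix.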
Frank Butler 7:29
That also makes me think about the dark side of it too, right? How are they trying to use the polling, to an extent, to maybe manipulate voters into thinking, oh, it's not worth me going out, despite the record-breaking turnout in this election? You know, are they using it to maybe drive a certain type of voter turnout? What was really interesting, though, is that some of the lesser pollsters are saying that the major polling organizations are getting it wrong because they're asking the wrong questions. One of the things that came up was the social desirability aspect of responding, right, so you create a bias. If you think about it, in a lot of cases the mainstream media has really latched on to the way President Trump talks about certain demographics. And there's a lot of stuff that goes on, right?
I mean, there's the adultery side, there's the perceived racial aspect of things. And, you know, we don't want to get into politics itself directly. But what they're saying is that, basically, the mainstream media has been vilifying the president to the point that it probably creates its own sort of social desirability bias. If you ask me who I voted for, and I'm concerned about perceptions, I might give you the answer you most likely want to hear. For example, I'm in Tennessee, a very, very red state. So let's say I did happen to vote for Biden, but somebody comes to me, and I know that they're part of the Trump side of the equation, the Republican side, and they go, hey, who'd you vote for? I might actually go, well, I...
Paul Harvey 9:02
It was gonna be awkward.
Frank Butler 9:02
Yeah. "For Trump," just because I want to avoid any conflict.
Paul Harvey 9:06
It's funny, because it's the opposite for me up here in, you know, purple-ish-turning-blue New Hampshire. The secret Trump voter thing is huge up here. Nobody, well, a few people, but very few people, openly supported Trump in either of the last two elections. But you look at the statistics, you look at the votes, and close to half of New Hampshire voters voted for the guy. So there's a particularly unusual dynamic to this, I think. There's always going to be a little bit of what you're describing, but I think it's been very magnified with Trump, that you're just not getting honest polling out of some people. Many people.
Frank Butler 9:42
You know, I think that's a big, big part of it. Now, what's interesting is that some of these less reputable pollsters, and I don't mean that negatively, were actually closer to getting it right than some of the more prestigious ones. They said that one of the keys is to not ask the person directly, because of the social desirability bias, but rather to simply ask how they think their neighbors are going to vote. And I was like, man, that is a great way of passing the buck, moving from "hey, how are you going to vote?" to "how are they going to vote?", right? You're moving that social desirability bias away.
On the other end of it, though, you don't necessarily know how others may or may not vote. One of the big things they were saying is that the suburban housewife vote was going to flip, and it did a little bit, but I don't think to the numbers they quite expected. And, I mean, who knows why; there's going to be lots of dialogue out there about these kinds of things. And of course, there were surprise groups: the Latino community was not as strong for Biden as it had been in the past, for Hillary Clinton, for example. There are going to be a lot of these discussions. And I think it brings up a bigger question, and you've sort of already said it: do we need to have all this kind of polling going on? But I also started to think about how they go about polling, right? Most of the time, they're trying to call you. And I don't know about most people, but I know, for example, and Paul and I were talking about this, we both have our phones set to silence any unknown callers. And I never pick up a number I don't know. I just don't. So it could very well be that part of it is that the people who are picking up might not be a good representation of some of the characteristics or demographics. Heck, they could even give bad information. Oh, how old are you? Oh, I'm 85. But the reality is, I'm 40. Or maybe they say I'm 35, and they're really 80, or whatever, you know?
Paul Harvey 11:34
It's a known issue, but I don't know if the general populace is aware of just how big a deal it is. You know, folks out there who are somewhat close to Frank and me in age, for example, both in our early 40s-ish, can remember a time when you had a phone in your house, a wired phone with no caller ID. And you remember when that phone rang, it was like a bomb was going to go off if someone didn't catch it before the fifth or sixth ring, having no idea who was going to be on the other end. In that era, not that long ago, you could do polling this way and get a reasonable number of responses. You know, you might get someone who'd dodge a pollster, but at that point most people are going to go ahead and do the poll and get you off the phone. Nowadays, there's no way. Like we were talking about before we started the show: who actually answers phone calls from a pollster nowadays?
Frank Butler 12:33
I can't even tell you how many times I look at my phone and see "missed call" or "silenced call" because it was from an unknown number, and I have it set up that way. And of course they don't leave a voicemail, so I'm not going to answer. I mean, I give my personal phone number out to my students, for example, because I'm not typically in the office. So if they have an important emergency or something, you know, because we have deadlines with the simulation and what have you, they can reach out and get a hold of me. And I tell them, I'm very clear: you've got to leave a message. I won't answer right away, but I will answer as soon as I get your message, because then I know it's somebody in my class. I don't answer unknown numbers, but if you call and leave a message, it comes right through, and I'll return that call as soon as I -
Paul Harvey 13:17
So listeners we’ll put Frank's phone number in the show notes. And just make sure you leave a message when you call.
Frank Butler 13:28
Yeah, and I'll call you back if it's something that warrants me calling you back. The reality is that there's that element of it. Now, the other thing that was very interesting, that I saw mentioned somewhere, was the difference between automated questioning, so, like, having a Siri or an Alexa talking to you, versus an actual person. They said the social desirability problem becomes worse when you're actually talking to a live person on the other end. And I thought that was fascinating. That person-to-person aspect, versus having a robot do the conversation with you, kicks in the social desirability bias: now you're talking to somebody whose feelings you might actually influence, or who might perceive you more negatively because, oh, you're supporting one person over another, or whatever.
Paul Harvey 14:24
You know, I did answer a poll phone call once, I have to come clean. Maybe eight years ago, whatever election was happening then, because it was a University of New Hampshire number, and I figured it was something work-related. But it was the UNH polling institute, and at that point I figured, all right, you got me, but don't make this mistake again. I definitely remember it was a long poll; it was getting on towards, like, the 10- or 12-minute mark. The pollster and I were getting along for most of it, but at one point he asked my opinion on something or other, and whatever my opinion on that issue was, it was not the majority opinion at the time, at least in the state. And I remember almost being apologetic as I said it, and he said, yeah, no, it's cool, it's cool. Like,
he wasn't judging me. But, you know, we were both kind of acknowledging that, you know, we were probably on different sides of whatever that issue was. So, yeah, I mean,
Frank Butler 15:29
You feel like you're being judged.
Paul Harvey 15:30
Yeah. So I see the robot angle, but that brings its own issues with it. Who the hell wants to talk to a robot?
Frank Butler 15:36
Well, yeah, exactly. But at the same time, you know, I know that if I'm talking to somebody who's not, for example, my wife or one of my close friends, I'm less willing to be myself, I guess, in a way. And for those of you who don't know me, which is probably most of you: I have no filter, typically, and I'm very opinionated. I have strong opinions about certain things. But that's only when I'm sort of in my little cocoon, right? When I'm exposed to a broader sort of situation, or people, I tend to take a much more neutral stance in general. I guess I'm inherently non-confrontational.
Paul Harvey 16:19
So you have a filter, you just leave it in the off position.
Frank Butler 16:25
So contextually, if I'm at work, and we're having a faculty meeting, it's not on, yeah, and I have no issues with saying what's on my mind, and, you know, probably, to a problematic level. You know.
Paul Harvey 16:39
You should see how much editing we have to do for these shows, guys. Every minute of Frank talking comes out of like, an hour of ranting.
Frank Butler 16:52
I have to redo all of what I say. We should hire a voice actor, too, and start scripting my content.
Paul Harvey 16:57
I actually do want to not completely lose track of the polling thing. Because, fair's fair, let's not just beat up the pollsters; there are also those of us on the other side of the equation who consume this data and create a market demand for it. And I'd like to separate that into two categories: just us average people out there, and the media outlets, who I think play an interesting role here. I don't know about you, but I've noticed that when the polls get things right, the media outlets really identify with those polls: "our New York Times poll predicted..." Or, even if it's not them, another publication touting the poll that they sponsored. But when they get something really wrong, it's "this poll run by blankety-blank firm that you've never heard of," or Gallup, or 538, never with the media partner's name attached to it. Or less frequently, anyway. So that's probably its own thing.
But then there's those of us who consume that media and create this market demand for the polls. And, you know, I think we bear some of the responsibility, in asking polls to do things that they really aren't designed to do. Often the only people who realize that are the actual statisticians and methodology folks who create these polls. If you talk to people at that level, they'll often say, well, that's not what we're trying to do. We're not creating a crystal ball here; we're just trying to get a read on a specific set of criteria. And you'll often see news reports that say, you know, Biden is currently leading by 2% in blabbity-blah place. Anything like that is by definition nonsense, because any poll worth anything is going to measure and acknowledge its margin of error, which is going to be a minimum of 3% in the sort of polling situation you're talking about: very small response rate, very small sample, trying to predict something very large. 3% is about as good as you can hope to get in terms of a confidence interval. So to say someone is leading by two points, that could mean they're actually down by a point. If it's a legit poll, it will have at least a 3% margin of error, so a 2% lead is a tie, if anything. But they keep reporting things that way, and I think there's a reason for that: we click on things that say so-and-so is leading by 2% or trailing by 2%. We click on those articles, we tune in to those newscasts. And I think a lot of it is that we want these polls to do things that they really can't do.
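Paul's "minimum 3%" matches the textbook sampling formula: for a simple random sample of n respondents, the 95% margin of error on a proportion is about 1.96 times the square root of p(1-p)/n, worst case at p = 0.5. A quick sketch:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical 1,000-person poll: about +/- 3.1 points
print(f"n=1000: +/- {margin_of_error(1000):.1%}")

# Quadrupling the sample only halves the margin of error
print(f"n=4000: +/- {margin_of_error(4000):.1%}")
```

So a "2-point lead" in a 1,000-person poll is inside the noise even under ideal assumptions. And note what the formula does not cover: it assumes a true random sample, so it says nothing about non-response bias or shy voters, which is why real-world errors can run well past 3 points.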
Frank Butler 19:38
Well, it's interesting, because I know that with a lot of polls I look at, I always check the confidence interval, or the margin of error. And I've noticed that most of them are like three to five, or four to six percent; a lot of them I've seen at four to six percent. And what's really funny is that they're off by double digits right now, right? To me, that's really interesting, because that's telling you something: it's telling you there are a lot of confounding issues. And I think a big one still comes back to the 3% response rate. You've got a sample size of 1,000 people, and if that's 3%, think about how many people you had to call to get those 1,000 responses. I'd have to do the math in my head, but I know it's a huge number, 50,000-ish, 60,000, somewhere in there. The point is, talking to 1,000 people is not what we mean when we talk about a sample properly representing a population. And I don't know where they're getting their margin of error calculations from, but I can guarantee you those are way off given that 3%. Which is another thing I think they might be missing: properly calculating those confidence intervals based on their response rates.
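For the arithmetic Frank is doing in his head: at a 3% response rate, a 1,000-person sample implies roughly 33,000 dial attempts (a bit under his 50-60,000 off-the-cuff guess, but the same order of magnitude):

```python
responses_needed = 1_000
response_rate = 0.03  # 3% of people contacted actually respond

# Calls required to collect the target number of responses
calls_needed = responses_needed / response_rate
print(f"calls needed: {calls_needed:,.0f}")  # roughly 33,333
```

And those 33,000-odd numbers dialed are themselves drawn from whatever contact list the pollster uses, so the 1,000 who answer are a self-selected sliver of a sliver, which is exactly Frank's representativeness worry.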
Paul Harvey 21:12
Well, those margins are accurate for a given set of assumptions, for a given set of questions. When those many, many layers of assumptions fall apart, statisticians will tell you, well, then you can throw out the margin of error, because that wasn't the set of assumptions we were working with at the time. Which is true. So we've got to keep adjusting: now we have new data that we'll incorporate into our statistical models. And that's all well and good, if you're not trying to use polls to legitimately predict future outcomes, which is what we keep trying to do with them.
Frank Butler 21:35
And you know, it's even more interesting to me that, in general, there have been some other things that just don't seem to align, right? I think we're in a unique time. The state of California tends to be very blue, and has been historically; they tend to be, especially in Southern California, more left-leaning, and that's their thing, and that's totally cool. But what I found very interesting is that they voted against allowing Uber and Lyft drivers to be considered employees, which sort of goes against some of those ideals. And again, I think that's one of the problems now with the assumptions: you have to look at more fine-grained data and say, look, people are looking at this differently. And something you mentioned before we started recording was the way they're asking some of these questions, right? Here's a checklist: is the economy important to you? Now rank-order these. They're generic; they're not really getting into how people might actually be feeling about things. So I was really surprised, for example, that California rejected the proposition that would have allowed Lyft and Uber drivers to be declared employees there, which would have meant they'd get the benefits of that. But you know...
Paul Harvey 22:48
It's funny the way you say that, because when I think of that same proposition, 22 or whatever it is, in my mind the wording is that they voted to allow Uber and Lyft and everything to categorize their employees as, whatever they call it, subcontractors -
Frank Butler 23:09
Subcontractors, contract workers, yeah,
Paul Harvey 23:11
- gig economy workers, whatever you want to call it. So even just that: it's two different wordings of the same thing. Your way kind of makes it seem less consistent with blue-state norms, whereas the way I was thinking about it makes it seem, maybe not super consistent with blue-state politics, but not inconsistent either. So imagine you're a pollster: just that small little change in wording might have a pretty big effect. If I went to someone in California and said, are you going to vote for this proposition that will prevent people from being categorized as employees of Uber and Lyft? A lot of people would say, no, no, I don't like that. But if you went to those same people and said, are you going to vote to allow Uber and Lyft to categorize their workers however they want? The same person might say, yeah, sure, it's a free world, let them do whatever they want to do.
Frank Butler 24:03
And I think that's the trick, right? People can interpret things differently if things are not worded clearly. That's one of my big things that I try to tell my students: don't obfuscate things. Especially when I'm working with companies, I'm like, don't obfuscate your mission because you want it to be unique. You don't have to do that. Be very clear, very precise, choose your words carefully, so people understand and there's no confusion, no way of misinterpreting things or interpreting them in slightly nuanced ways. And I think that's what also happened with the affirmative action proposition they had as well, which was rejected. California wanted to allow affirmative action again, which had been banned some 20 years ago or something like that. My wife Endia and I were having a conversation about this, and we were both trying to figure out, what is this trying to say? We had seen some sort of message that the police in LA were against the proposition, saying vote against it, blah, blah, blah, and we were trying to figure out, are they being racist or not? Because we couldn't quite figure out what it was truly trying to say. I felt like it would be a good thing, because it would help those who are disadvantaged get opportunities, disadvantaged groups in general. But at the same time, the way it was worded, man, I could see why people would be like, I don't know about this one, right? And I think that's something we have to be so, so careful about.
Paul Harvey 25:27
I used to live in Massachusetts when I was really young, and that state was notorious for doing that: having these propositions worded in a way where, I mean, you'd have to be Hitler to say no, or yes, to them, but there's always something snuck in there along the way. Like, proposition 22: to outlaw the beating of stray dogs, and also make the drinking age 72, with that second part just buried in the middle. So this is not an unknown thing we're talking about, and people can use it to their advantage. And honestly, a legit pollster would know better, and I think most legit polls do have very fixed, precise wording for these very reasons. But we're back to the phone calls that nobody answers. So I think a lot of this ends up being supplemented by less structured, less scientific polls, still structured and scientific, just with a little less concern for those things, in order to get a bigger sample size and so on. And the public doesn't really know the difference. It's just, "okay, this poll is being reported, that poll is being reported..." They don't know which one uses a different methodology, what the differences are, and who cares?
Frank Butler 26:38
And are they taking it as gospel? I mean, that's the problem, taking it as gospel. And I think, in general, that's sort of the bigger issue right now, another overarching issue we're going to have to deal with, like disinformation on social media: how do we truly figure out what's correct or not? We've put so much faith in these polls to tell us information. Really, it's the media that's been doing it, and we've been consuming that and saying, okay, these are the media outlets I like to go to, they're continually telling me this is the reputable source, and so I feel like I can trust it more. And the reality is that I can't. If you're thinking about it logically, you go, well, I need to be looking at a broader array of information to see whether they're really accurate. If even the less reputable sources are showing different numbers, but they've been more accurate at predicting in the past, when do we start adjusting who we go to to really find the source of the truth? And I think that leads to a whole different set of stuff. And I don't...
Paul Harvey 27:42
I don't think we need to, and there's no good answer to that. Right.
Frank Butler 27:44
There's not. And I think what's important out of that is just simply saying: look, question things. It's okay to do that. It's better to triangulate and make sure you're also getting different sides of the same situation. At the end of the day, just because you don't like it doesn't mean it's wrong. And you've got to understand the other sides to get the best understanding of the situation. That's really the challenge a lot of people have. I saw a study the other day that said 80% of people only really engage with people who are like them in terms of their politics, right? So, like, 80% of people are really only talking to people who support their party.
Paul Harvey 28:25
You know, we hear this stuff about living in an echo chamber, but I hadn't really thought about it in terms of the actual people in our lives that we talk with. And yet it's true. I started thinking, okay, who do I talk to that's completely on the other end of whatever political spectrum from me? And there are a few, but they're, like, work friends, right? The people that I probably wouldn't know or talk with nearly as much if we didn't work together -
Frank Butler 28:48
Right, and you don't get into the same detail that you might with a friend, because you don't want to, like, anger them. Although I think it's maybe a little easier for somebody like you and me, who tend to be a little bit more middle of the road. You know, we can find common ground with more people.
Paul Harvey 28:59
Yeah. Because we don't have the team allegiance where it's like, “you violated sacred doctrine of the whatever party!”
Frank Butler 29:11
We also tend to, I think, be a little bit more like, okay, we can recognize what they're saying even if we disagree with it, and not necessarily be confrontational about it. I think that's another -
Paul Harvey 29:24
Yeah, we just know that, you know, they're idiots. They're smart, but ya know,
Frank Butler 29:30
Like one of my colleagues, you know, he's very much more liberal, for example, and that's totally cool. He's sort of that person I do get into it with on occasion, because with some of the stuff, I'm like, you know, you're putting yourself in that echo chamber, in a sense, right? You're saying, you don't like it, and so therefore it should be illegal. And the reality is that there's a whole contingent of people who do like it, who are supportive of it. And I think that's something we all have to understand: just because you don't like it doesn't mean it's wrong. We're seeing a lot of issues with that right now, and I think it happens in the polling too. The way the polling is being done is missing that nuance in some way. And, you know, I'm not the expert to figure it out. If I really put my mind to it, I'm sure Paul and I could sit down and figure out a way of doing it and make lots of money, but my plan involves a functional MRI machine.
Paul Harvey 30:20
Oh, I like it. Yeah, I could probably get one of the candidates. And you can get one of those fMRIs for under a million bucks now, so it's something worth looking into: show a picture of the candidate, see which part of the brain lights up.
Frank Butler 30:33
I think we actually have the opportunity to do these cross-university grants here in Tennessee. We have the Oak Ridge Research Center, and I think they even have an MRI or something like that. So I bet we could do something.
Paul Harvey 30:47
I'm actually on a grant - something like the 27,000th author - to acquire an fMRI machine. That kind of research is coming, I think, and it's exciting. Not that I'm saying this is ever actually going to happen for political polling, but it does give us ways to get around these inherent weaknesses of the polling process.
Frank Butler 31:10
That's it - biometric data, right? I mean, we've seen that they're able to read people's moods; you could probably do the same exact thing. Especially as we get more and more augmented-reality kind of things, like augmented reality glasses. You could be looking at someone - they might not necessarily know, but you're face to face - or just simply say "Joe Biden," "Donald Trump," and based on how they react to the name, you could probably guess who they voted for.
Paul Harvey 31:36
Just listen to these words, and do nothing. And we'll read your mind.
Frank Butler 31:40
There's probably a way to get to that, which would be pretty cool, right? Or using, you know, your fingerprint sensor on an iPad or something. I don't know. At the end of the day, I think what's interesting is that there's going to be a path forward -
Paul Harvey 31:52
It might be a terrifying one.
Frank Butler 31:53
How do we get better at this? And how do we cut through the 3%?
Paul Harvey 31:58
It raises an interesting question: are these polls so important to us that we want computers reading our brains?
Frank Butler 32:06
Well, no, I think here's the problem - here's why it's important. I generally tend to think most politicians are out of touch with the average person, right? And I think this election was a perfect example. The Cuban population down in Miami, for example, overwhelmingly voted for Trump, when they supported Hillary in the last election. What we're missing is that nuance - what's really important, policy-wise. So I think it's important for both sides of the political process to get a better understanding of their constituencies, so that they can really understand what's important. But you have to get to asking the right questions and getting the right answers. Otherwise, you might be putting up policies that are really bad. I think one of the things that hurt Joe Biden, for example, was saying, I want to increase taxes for people who earn more than $400,000. To me, it's like, you know, I will never be close to that number, but I still like to think I could get there, right? If I made a certain decision and pursued something. And then I'd be really ticked off that I'm getting taxed. As average Americans, we're brought up to think, hey, we can make anything of ourselves. And by putting out that number, $400,000 - even if you make $100,000, or $40,000, I think $400,000 still seems attainable.
Paul Harvey 33:27
And it kind of stifles the dream a little bit, if you say, no, that's for someone else - that's not you, listeners, that's for the people who make big money. It's saying, I'm never going to be that person who makes it big, right? That sucks.
Frank Butler 33:40
That is so spot on, right? And I think that's the thing: $400,000 is a number people can conceptualize; $10 million is not, right? I think that's something that has to be understood. Like, when I heard $400,000, I thought, well, Joe Biden's tax policies don't affect me at all. But then I'm thinking, what if? What if I got a great offer from corporate? Like, hey, will you come and help us turn around our business, and we'll pay you $500,000 a year?
Paul Harvey 34:13
Or start some side business that takes off. A guy can dream, you know?
Frank Butler 34:16
And that got me thinking about other things - say, the Democratic platform. I'm not picking on Democrats solely; I'm just using this as an example of something I think had an impact. One of the things the Democratic Party has done too much, I think, is rely on these outlier examples - people who've been bankrupted because they didn't have health insurance - using the outlier as the example.
Paul Harvey 34:39
I think Bill Clinton was one of the first to really capitalize on that technique, and he's very good at it. But I think it's been so done to death now that it's like, okay, everyone's got their one-in-a-billion sob story, and if it's true, that sucks for that person, but come on, we can't make national policy on that. So I think you're right - that tactic is being overplayed.
Frank Butler 35:01
And I think what's interesting about that - and this is something I teach executives in a session on strategic change - is that people don't care about the facts. Facts honestly don't change people's minds, right? You can say 20 million people don't have health insurance, but that number - if I don't know somebody personally, I'm not feeling it. I just hear a number, and I'm like, okay, 20 million. How about that. I don't mean to sound crass - I'm just using this as an example. It's just human nature, though.
Paul Harvey 35:25
Same reason why war footage from Sudan or somewhere doesn't have the impact on a typical person living in America or Europe that a terrorist bombing somewhere near where they live does. You can empathize, and you want to relate, but it's human nature - it doesn't hit home until it's your own.
Frank Butler 35:44
It’s “thoughts and prayers,” right? Thoughts and prayers. But that, to me, is so interesting, and it started making me think more about things like clusters, right? The people who don't have health insurance are probably in these clusters - they probably know each other. But again, it goes back to that 80% of people who only really affiliate with people of their own political party. I think that's probably true in a lot of other ways too. I know people who are, for example, on Medicare or Medicaid - either being poor or being old - but I don't think I've ever associated with anybody in the middle there who doesn't have health insurance. Or at least not knowingly. And I've certainly never heard anybody say, I've not been able to pay my medical bills. So I'm sitting there going, how big of a deal is it?
Paul Harvey 36:30
You almost did - you and I were officemates, remember, back when I didn't have health insurance, which was common back then. And yeah, I had a bit of a brain scare.
Frank Butler 36:42
They didn't require you to have health insurance, but they required me to have it, and I had to pay for it out of our measly friggin' income.
Paul Harvey 36:48
Yeah, I got grandfathered into the previous system, where they didn't want to pay us -
Frank Butler 36:53
- $1,000 more so that we could pay, like, $2,000 for our health insurance.
Paul Harvey 36:57
I went all five years of grad school without health insurance. I was young and invincible, so it didn't affect me maybe as much as it should have - until that moment where they were talking about the possibility of brain surgery, without insurance.
Frank Butler 37:11
Because you end up having to pay most of the bill? Or did they write off a lot of it because of your income?
Paul Harvey 37:14
A lot of it was written off. A headache, a CAT scan, an MRI - which, you know, together was probably -
Frank Butler 37:20
- that’s like $2,500, $3,000, at least.
Paul Harvey 37:23
Basically, I think they gave me the rate that insurance companies pay - you know, the negotiated rate, or whatever they call it. Yeah, they charged me that.
Frank Butler 37:36
It stung, but the thing is, they knew you can't bleed a turnip, right? And I think this is what most people don't know. I had this exact conversation with somebody - they have a public payer system where they are, a single payer system, whatever - and I was like, this is the thing most people don't understand: if you're poor, or you don't have assets, they can't come after you and make you pay. And they know that. So what they'll do is send in a patient advocate and write a lot of it off. Which, of course, affects those of us who do have health insurance - it means we pay more.
Paul Harvey 38:04
I mean, that's kind of our version of health insurance - at least historically it has been - that those who can pay subsidize those who can’t.
Frank Butler 38:11
Well, that's why we have the Hippocratic oath too - they're saying they can't turn somebody away. Anyway, the next level of thought I was getting to: think of the people who do reject health insurance at their work, or who have their own business and don't get it, and they're making money. Guess what - they're going to be paying out the wazoo, because of exactly what they did. And I have less sympathy for those people, because they're making money; they could certainly afford health insurance, but they chose not to, for whatever reason - they're young, they feel invincible.
Paul Harvey 38:40
Which is fine. I mean, insurance is largely a form of gambling - anyone in the insurance industry will tell you that. How much risk are you comfortable with? You take on that much risk, and you pay someone else to take the rest.
Frank Butler 38:52
Exactly. And, you know, for me, I totally believe that health insurance should be something people have, right? I think there should be ways of getting there. But I'm not on the side of a single payer system. But, you know, anyway, that's it.
Paul Harvey 39:07
I'm not defending the status quo either.
Frank Butler 39:10
Yeah, exactly. But these outliers - that's the nuance that they're missing, right?
Paul Harvey 39:15
You had said something just a few minutes ago that brought a paradox to mind. When it comes to politics and polling and data, we have these outlier examples that you're talking about, and then we have these big data sources that are inherently flawed in their own ways, as we've been discussing. So we've got lots of data, in the form of polls, that's very imperfect. And at the complete other end, we've got very unfortunate outlier situations - anecdotes used to influence public opinion and so on. So there's a complete void in the middle. We're relying on two extremes that are so far out that they're not really relevant, while all the data in between is passing us by.
Frank Butler 40:04
No, I agree. I think that's actually a great point. And that's what we might be capturing a lot of in these polls lately - the outlier data. And, uh, I guess, folks, that's probably going to be the end of this one. You can hear that Paul's daughter is home, and I think we've pretty much covered everything on this. Again, thank you for listening.
Paul Harvey 40:26
The Busyness Paradox is distributed by Paul Harvey and Frank Butler. Our theme music is adapted from It's Business Time by Jemaine Clement and Bret McKenzie. Our production manager is Justin Wuntaek. We hope you enjoyed this episode, and we'd love to hear from you. Please send questions, comments, or ideas for future episode topics to input@busynessparadox.com or find us on Twitter.
Also be sure to visit our website, busynessparadox.com, to read our blog posts and for links to the articles and other resources mentioned in today's show. Finally, please take a moment to rate and subscribe to our show on Apple Podcasts, Spotify, iHeartRadio, Google Podcasts, or wherever you found this episode.
[Music Outro]
Transcribed by https://otter.ai