Hello the Internet, and welcome to this episode of the Weekly Zeitgeist. These are some of our favorite segments from this week, all edited together into one nonstop infotainment laughstravaganza. Yeah. So, without further ado, here is the Weekly Zeitgeist. Miles, it is a full... it is... we needed, we needed this for our system. We've been having all these damn experts on lately. We needed intelligent guests, thoughtful guests, intelligent, thoughtful, intelligent guests. We needed a pure chaos episode. We are thrilled to be joined in our third seat by a comedian, a writer, an actor, a stand-up. His albums Stuffed Boy and Live from the Pandemic all debuted number one on iTunes.
Samson.
His album Twelve Years of Voicemails from Todd Glass to Blake Wexler charted on Billboard. Please welcome the hilarious, the chaotic. He's riding a recumbent bike in short shorts and his plumpers are on full display.
It's Blake. This is Blake Wexler.
A k A.
I have a special, a stand up comedy special, but geist my peak, I have two plumpers.
What the hell am I doing here? I am a wexpert. I'm a wexpert, baby. This is iHeartMedia at its finest, Blake Wexler joined by Jack and Miles.
Thank you so much for having me.
Show now too.
Yes, I have data. We're gonna dive into the data. We're gonna talk to people on the ground and we are gonna tell them that they're wrong.
On that.
We have data that we're diving into. Today.
We're gonna bring up the electoral map here and I'm going to show you which, you know, groups are reporting as Blake.
How are you doing?
Man?
This is the Electoral Community College. That's right, it's not... it's a two-year associate's. I'm doing great, guys, thank you so much for.
Electoral Online College.
Phoenix Phoenix exactly. This phoenix will not rise from the ashes. It has burnt itself to death. But yeah, no, I'm so psyched to be here as usual.
Great. We got started a little late.
We're busy integrating the Damian Lillard trade into our into our beings, into our personhood.
Didn't see that coming.
Is he even gonna play for them? Didn't he, he only wanted to go to Miami, right? And now he's going to the Miami of the North, Milwaukee. Milwaukee. Yeah.
They got a body of water up there, right, one of those, one of those Great Lakes.
It's like, I'll be yeah that.
Yeah. He's like, you're not going to Miami, but the team does have M and I as the first two letters of its name, yeah, so you.
Got to read it. You got to read the whole word.
They paid attention to the opening sound and the ending sound of the word, the city that he was asking to be traded to, nothing in between. Me, I think this trade is really fun. I know this isn't our NBA podcast, but I think it's fun. It brings Damian Lillard, puts him on a team that gives him a real chance of winning a title. It makes Giannis relevant again. Giannis is probably our most fun and lovable celebrity basketball player at the moment. So I'm, I'm.
here for James Harden. I think James Harden is the most fun, you know, lovable one.
I think we can take our self-hating Sixers fandom off the, off the mic, but Miles suggested a good new, new line for, for us when people ask if we're Sixers fans: that we, we only recognize one team of Sixers. Miles, you wanna tell him what it is?
The January Sixers.
Yeah, that's the only, that's the only Sixers I recognize, and I'm not a fan of them, and I don't even know about another, another group. Carmen, we do like to ask our guests, yeah, what is something from your search history?
Well, I was.
gonna say your first topic is a good one, but we'll go back to the no-fault divorce. But I'm gonna tell you, you know what, and this is not even a plug or anything, but Kerry Washington's new memoir. She's just doing interviews and it just came out. And here's the thing. I had to google it because I'm reading everything about it. I haven't gotten the book yet. But she also, like me, found out that her biological father was not who she grew up with.
Oh oh well, and that.
Was kept from her whole life until she got booked on you know, Henry Lewis Kate's Junior skipped Junior's show on PBS.
Oh for real.
Yeah, and then, I'm not telling you anything that's not out there, but then they contacted her, or she was all excited and her parents were excited, until they said, well, you need to do a DNA test. And then her parents were like, oh, excuse me, Kerry, we need to talk to you, take a second, we have something to tell you. That is the premise of the show. So yeah, yeah, so.
And it's funny because she, I mean, this also happened to me as well, which is what I write about, but what she describes really, really well, which I so appreciate because it's not necessarily something people would think about, is she, like me, grew up her whole life knowing something was wrong, something was off, almost like feeling like something was wrong with our bodies, like things didn't match up, and it was really uncomfortable.
It's very, like, a subconscious thing, and it's lots of anxiety and perfectionism, all this sort of stuff just happens and you don't even know why, right? And then all this stuff comes out and you're like, oh, my parents have been keeping the truth from me my whole life. And you imagine living with them, and they know, and every time they look at you... So yeah, all to say, people, please talk to your children, tell them the truth. Please tell them the truth. But that was my last, that was my last Google search.
Are you like, how radical is the radical honesty with the kids? Are you, like, a no-Santa-Claus thing? Or are you like, how, how far are we going? Are we saying, like, straight up, look, there's no Santa Claus, because their parents are liars?
First of all, my child figured it out before I even had to say anything. She's like, she's that kind of kid. She's like, she's my kid. She was like, so I know who you are. I know who you are as a whole, so you can stop it, but please keep writing it on all my presents because it's cute. But I think that, you know, like she, she mentions as well, Kerry is like,
when the secret is another human being, right? You gotta talk, right? You're creating... Basically, I say to people, it's like, look, and I get asked this a lot when I give talks, it's like, well, well, what is it? Is it entitlement? Like, what is it that makes you feel entitled? Look, a secret is yours as long as it's yours. But when you create another human being, that is no longer yours, that is actually a person, so it's no longer a secret. It's a human. And we can say as parents, like, well, you're my kid. Listen, kids, please, everyone, get this through your head: kids do not belong to you. They are not belongings. They are people, so you must treat them as such. And they're separate humans and you need to treat them that way. And part of that is being very honest about how they came about and what's going on in your life. But I will tell you a funny thing: my daughter and I, we talk a lot, but one of my nieces said something about my first husband.
I can't remember how old my daughter was, but she was like maybe ten, and she was like, you've been married before.
Like she was like, you kept this from me, And.
I was like, I was like, girl, I just didn't think to tell you, like, it was a long time ago, and so that's not a secret. It was just, I just didn't think about it.
But it was, it was hilarious, right, right. I think, like, to your point, like, if a secret alters someone's total understanding of themselves and the world they live in, it's like, no, then that's not a secret. That's a potential, like, psychic bomb that you have to defuse as quickly as possible.
Yeah, tremendous bomb, and it affects your whole life. Like, it literally affected me from the time I was a kid, like, I remember it. It's definitely there. And you're subconsciously carrying a secret like that and then living with your secret, right? Like, what does that do to you and your relationship with that person? And my thing is, like, look, if you keep secrets and lies from people that you love, that creates such a big distance that that person can't be your real friend or your real, like, love, or your real whatever, like, where it's just a huge distance that separates you from people you love.
So, right, gosh, come clean. I have to have the courage to come clean. Doctor McInerney, what is something you think is overrated?
I feel like I'm not gonna win myself any friends here, it'll be like How to Lose Friends and Stop Influencing People. But all the Disney live-action movies. I've hit a stage where I'm like, no more live actions. Like, The Little Mermaid is beautiful, I love the original Cinderella, but you know, I don't, I don't want to see more and more of, like, the same movies again. Like, it makes me a bit sad, and, like, what other stories could be told if it weren't, like, always making
these live actions. But you know, maybe I'll be proven like totally wrong and the next one will be amazing.
But I'm ready for something else.
Just, yeah. Based on everyone's first, like, sort of reaction, like, oh, you saw it, how was it? And you're like, yeah, it... No one was like, it was so good, I'm gonna see it nine times. Everyone just kind of... I think they're, they have to reconcile, like, their love for the original source material with that and not fully be like, I didn't like it. They're like, yeah, I mean, it's, it's, it's interesting, you know. It's like their voice just goes up.
Yeah.
I was, like, glad I spent my afternoon going to that? I think that's right.
I've never been more confident in a prediction about the future of a, like, sub-genre of movies than in saying that they're not going to suddenly figure something out about the Disney live-action reskins of the animations. Like, we, we know what the original cartoons look like and what happens in them.
We know what is possible here, right.
I can't, I can't imagine a version, like, what, what they're gonna pull out where we're like, oh no, I did not see that one coming. Whoa, that bear is talking and singing? Hold on, oh, they already did that. The best one I think was Jungle Book, and that was the first one they did, and ever since, I feel like, yeah, it's been diminishing returns, man. Although I didn't see The Little Mermaid, so I cannot speak to that one.
What's your favorite genre of film, though, Doctor Kerry? Ooh.
I mean, okay, so I really have a soft spot for, like, old-school superhero movies. But my most recent most amazing film I saw was the latest Into the Spider-Verse film. What was it, the Spider-Verse? Yeah. And I just, I love animation, and that film, it's so witty and the visuals are
Extraordinary, just great films.
So I go on about how much I love these films at work and everyone has to deal with me being like, go
Watch it, So anyone listening, go watch it.
Amazing. Favorite, favorite Spider character, Spider person, in the universe?
Oh my goodness, I can't remember the name of it. But the, like, really, like, dark noir one from the first one. He's, like, from, like, all those, like, nineteen thirties kind of, like, mystery films, they're all in black and white.
I kind of love that.
Yeah, yeah, was it just Spider-Man Noir? Yeah, I think Noir Spider-Man.
Maybe, I don't know. Or, I'm sorry, Spider-Man Noir. Noir. What is... when you say old-school superhero movies, do you mean, like, Christopher Reeve's Superman, or are you talking about
Like the first Iron Man? The First Iron Man.
I think it's almost like less like specific films.
I love, like almost films that have that like really cheesy like origin coming of age story, Like you know, for me, it's just really just about like they're discovering themselves and it's such a simple narrative, and like I feel like I should crave more complexity than that, and I should want it to be more nuanced.
stories, but sometimes it's just really satisfying, and you want this clean-cut narrative of this, like, good guy beating all these bad guys. When I want to relax, I really enjoy that. In my day job, I think about lots of, like, complexity and nuance and, like, what we do with the future of society, and so, you know, right, sometimes I find it very relaxing just to be, like, soulless.
Yeah, you're like, yeah, I love a Joseph Campbell-type flick. Just give me that hero's journey in everything.
Okay, Yeah, yeah, it's gonna.
Go wrong, challenge your old self to become your new self.
Yes, but Doc Ock is an example of what we're facing with AI, like, in the immediate future.
I think we can all agree on that, right? Spider-Man 2. In a way, you're like, it's giving me the equivalent of, like, eight arms. In a way, you're like, oh boy. What is something, Doctor Higgins, you think is underrated? Underrated? Asking people for help.
And I know that, like, I know that some... I listen to the show daily, and I know people have such good responses. But that was the only thing that came to mind for me in terms of underrated. And so that's why I'm on the show. Come to my live show on October eleventh. Yeah, well, the saints, come listen to us babble about absolutely nothing. But no, seriously, asking for help. And I think, the older I'm getting, you know, I think that there's this, you know, and
again this could be the social justice in me. But I think, as you know, a marginalized person, we're always expected to have the answers to everything. We're always supposed to be the person like the world teaches you to be the smartest person in the room all the time, and I've just gotten to the point now where it's like, yes, I have a doctorate, but I'm still not the smartest
person in the room. Like, more than anything, I'd say, like, support me, bitch. Like, that is literally how I feel all the time now. Like, I'm like, if I need help, I'm gonna ask. I'm not that girl that's like, I've got it together, like, I'm good, like, my life is great. No, my life is literally three seconds and a string from falling apart. So, like, please help me. Like, if you can help me in any way, whether it be, like, you know, giving me, you know, a network or an email or a connect to something, like, I am never above saying help me, right? Because this life is hard, like, just, it's, it's really, really hard.
So, oh yeah, and I think there's also... yeah, people don't want to, want to struggle in public. That's a huge reason why people don't ask for help, or act like they don't, like they got their shit all figured out. But it is, like, the most liberating and sort of, like, life-affirming thing when you ask for... like, you'll be surprised, when you ask for help, the help that you will receive from people that are, like, in your circle. And I think that's also one of the underrated aspects of asking for it, is, like, you will also realize, like, damn, I got people in my corner. Yeah, I'd say this is probably the most underrated thing of my adult life. This is, this is a great underrated. It's something I've had to do for, like, addiction issues, I've had to do in, like... it's made my marriage better, and just in terms of it being good advice. We did this thing, like, there's this hack, yeah, one hack
that Ben Franklin wants you to know. But there's this thing that Ben Franklin talks about that he realizes it's this like counterintuitive thing that if you ask someone for a favor.
and they do the favor for you, they will like you more. And it's weird, because you're showing vulnerability, and there's, like, something, like, subconscious of them, like, doing something for you, like, their brain is like, well, they must be cool then, because I just did this thing for them. So it's, like, it's all around. Like, just being willing to be vulnerable enough to ask for help opens this world of possibilities.
And increase, you know, just everything.
It makes your life better and richer in so many ways, you know, decreases loneliness, which is such a huge problem, like right now, Yeah, and all that being said, I can speak about it for a long time and like I still don't do it enough.
Because it's hard.
It is hard, and I think, I was gonna say real quick, I think a really important thing is, you know, masculinity as a, as a, like, a thing, right, whether you talk about toxic masculinity or masculinity as a whole, like, we're all three men. And I think, so, even, even for me being non-binary, there's still this presentation of, I'm male-presenting to people in some way, and so I'm supposed to never be emotional enough to
say I'm struggling. And I actually had to, like, acknowledge that a couple of, like, days ago, like two weeks ago. I was literally sitting in my office going, like, here come the ideations, here comes the, like, no one loves you, no one cares, like, you're, you're feeling very alone. So I had to hit my therapist back up and I had to say, like, hey, I'm, I'm literally slipping into this mindset again of being lonely and feeling like the world doesn't need me here, like, I need help.
And she was like, you know, let me get you back, and you know, let me get you back on the roster. And so I think that that's something that like I try to emulate and everything that I do. That like just because you see you know the shows, you see me on the network, and you see me on TV, and you see me with you know, celebrities. I mean I went to dinner with the celeb friend of mine just this weekend. People are like, oh, that's so cool.
I'm like, yeah, but we were sitting at the table literally almost crying with each other, because we're all struggling, like, we're all going through it. So yeah, it's just, you know, ask for help, literally. Like, people are not going to judge you for
asking for help. So yeah. And that it's okay to be struggling, I think that's the other hard part. It's like, whenever I start getting, like, you know, I start having... I start ruminating on shit or whatever, it's like, man, you're fine, get your shit together, whatever. It's like, no, yeah, that's not, that's not how you get out of it. You get out of it by being like, okay, okay,
that's where we're at right now. Let's try and find a way to sort of pivot to something that feels a little bit better and then incrementally get out of there. But the whole, like, brute force of, like, I don't need to feel like this, damn it... like, it's, yeah, guess what, it only makes shit worse. Yeah. What if you just said no to sadness, though? If you just said no, like, to drugs? Yeah. Like, I don't know, that seems... yeah, it's just going to make me do more, more sadness. That's right, more sadness. All right, let's take a quick break and we'll come back, and speaking of sadness, we'll talk about the Republican Party and how things are going for them. We'll be right back. And we're back.
And Doctor McInerney, as mentioned, I am an idiot on this AI stuff.
I think I generally have like my version of AI up.
Until last week. I guess like researching for our last Expert episode was what I had read in you know, mainstream articles that went viral and films move Hollywood films and then like messing around with open AI and or
chat GPT. So I had this kind of disconnect in my mind where it was like, from an outsider's perspective, we have this C plus level like copywriter thing with like in chat GPT, GPT for and then like the Godfather of AI, who I'm just trusting people is the Godfather of AI, but that's what everyone uses, that same phrase, like the Godfather of AI just quit Google and says we're all fucked in the next.
Couple of years. And I think it's confusing to.
Me because I don't know exactly, like I can't even like picture the way how he thinks we're fucked. And there was this letter that was like we need to pause development on AI for in the near future, and I guess I'm just curious to hear your perspective on that pause letter and what the kind of dangers of AI are in the near future.
Yeah, because to Jack's point too, also, we were talking with João Sedoc last week at NYU about it, and, like, at the end of it, we're like, okay, so it's not Skynet, right, from Terminator, and they're like, oh, great. But then we realize there's a raft of other things that come along with just not being the Terminator. So yeah, that's... I'm also from a similar perspective where I always assume Skynet.
Yeah, I mean, I think you're totally not alone in the dominance of ideas like Skynet, the Terminator, because so much of our cultural framework for understanding what AI is comes from a very narrow set of movies. It's like The Terminator, like The Matrix, which always frame this as, AI is something that's going to dominate us, it's going to take over the world, and it's going to control us. And it's important to, I think, highlight that that's definitely not the only ideas that we have about AI.
We've got thousands of years of thinking about our relationship with intelligent machines, and there's a lot of different cultural traditions that have really different ways of thinking about our relationships with AI and intelligent machines that could be much more positive, much more harmonious. And so I do think our immediacy of jumping to this idea of Skynet is reflective of very much where we are right now. Right,
I'm in the UK, you're in the US. These are countries that have a really long history of thinking about AI in very binary terms. So yes, I think it's important that we think about these long-term risks of AI. And you mentioned the pause letter calling for a halt to generating more large language models like ChatGPT until there had been a bit more of a moment to think about the long-term consequences of these models. But I think it's really important not just to think of the long-term risks, but to think about which long-term risks we prioritize, because I think the Skynet Terminator fantasy eats up a lot of oxygen in how we talk about AI's risks. But there's a lot of different risks that AI poses. So another long-term risk that we don't talk about very much at all is the climate cost of AI, right, because AI is
hugely energy intensive. Data centers require a huge amount of water to function, and we have this massive problem of e-waste produced by a lot of electronic systems. But that long-term problem of climate crisis is much less exciting. It's really scary, it's really grounded in a lot of our experiences, and so it just doesn't seem to get
as much airtime. So that's something that I think is really important, is changing the conversation a bit to say, Okay, it's sometimes interesting, sometimes scary to think about the terminator option, but what are some of the other long term options that could really shape our lives.
Yeah. Like, the degree to which the deck is being stacked towards the Terminator option was surprising to me. Like, we dug in last week a little bit to the two stories I had always heard that are, like, kind of put into the Terminator-version-of-AI-taking-over category. There's the AI that, like, killed a person in a military exercise, that decided to, like, eliminate its controller. And then there was the AI that hired a TaskRabbit to pass the CAPTCHA test. And, like, in both cases... like, the AI that killed a person in the military exercise, like, that was somebody claiming that, and then when they went back they were like, oh, I was just saying it could hypothetically do that.
It was a thought experiment.
Yeah, it was a thought experiment of what an AI could do in the right circumstances. And the TaskRabbit one was more similar to the self-driving-car Elon Musk thing, where it was just... there was a human prompting it to do that thing that seems creepy to us when we start thinking about, oh, it's, like, scheming to get loose and overcome the things that we're using to keep it hemmed in. So it does feel like there is an incentive structure set up for the people in charge of some of these major AI companies to get us to believe that shit, like, to think, to only focus on the AI-versus-humanity, like, AI gets loose of our controls for it and takes over and starts, like, killing people version of it.
I'm just curious, like what are your thoughts on, Like why why why are they incentivized to do that when it would seem like, well, you don't want to make it, you don't want it to seem like this self driving car will take over and.
start killing your family.
Yeah, start killing your family. It's so powerful, But it seems like with AI, they're more willing to buy into that fantasy and like have that fantasy projected to people who are are not as closely tied to the ins and outs of the industry.
Yeah, I mean, I think that's such a good point, because it's a weird thing about the AI industry, right? Like, you would never have this kind of hype wave around something like broccoli, where you say, oh, the broccoli, if you eat it, it could kill you, or it could, like, transform the world. And then you wouldn't expect that to somehow get people to buy a lot more broccoli, and just be like, I don't want to eat broccoli
now. That's so fucking good and powerful it'll make you explode, like, maybe, like.
Even worse, broccoli that you can then also eat, you don't know what that broccoli would be doing. But I do think that we see this real cultivation of hype around AI, and that a lot of firms explicitly use that to sell their products. It gets people interested in it, because on the one hand, people are really scared about the long-term and short-term impacts of AI. On the other hand, they're also scared, then, of getting left behind. So you see more and more firms saying, well, now I've got to buy the latest generative AI tool so that I look really tech-savvy and I look forward-thinking and I look futuristic. And so it's part of the bigger hype cycle, I think, to draw a lot of attention towards said products, but also to make
them seem like this really edgy, desirable thing. But I think what's also interesting about both the stories that you raise is when you looked under the hood, there was human labor involved, right, There were people who were really central.
To something that was attributed to an AI.
And I think that's a really common story we see with a lot of the hype around AI, is often the way we tell those stories erases the very real human labor that drives those products, whether it's the artists who originally made the images that train the generative AI, through to the data labelers, all sorts of people who are really central to those processes.
Right. And I know, like, in your episode of The Good Robot when you're discussing the pause letter, you know, I think that the... I think the version that we see as, like, sort of these short-term threats, at least in the most immediate way, is, like, for me, working in and around entertainment and people who work in advertising, and seeing, like, an uptick in that sector, I go, oh, okay, that's easy. Like, I can see how a company immediately goes, yeah, it's a tool, and then suddenly it's like, and now you're on your ass, because we'll just use the tool now, and we don't even need a person to prompt it, or we don't need many, we just need fewer people to operate it. So to me, I'm like, okay, that's an obvious sort of thing I can see, like, on the horizon.
And you did talk about... well, there was a lot of talk of these sort of long-term existential or quote-unquote existential threats, but there were a lot of things in the short term that we're actually ignoring. What are those sort of things that we need to bring a little bit more awareness to? Like, I know you mentioned the climate, and I look at it from my perspective, I see, like, the just massive job loss that could happen.
But what are sort of like the more short term things that kind of maybe are less sexy or interesting to the people who just want to write about killer terminators and things.
Yeah, I mean, I think less sexy is exactly the right phrase for this, which is, a lot of the short-term issues are very much about entrenching existing forms of inequality and making them worse. That's often something people don't really want to hear about, because they don't want to acknowledge those inequalities, or because it takes away from the shininess of AI. It makes it very much like a lot of other technological revolutions that we've already seen, and that's super boring. Like, you don't want to hear about how the wheel somehow brought about some kind of inequality.
Our big hot take for today. But yeah, I mean, something that I look at, for example, are, like, very mundane but important technologies, like technologies used in recruitment and hiring. So I look at AI-powered video interview tools and look at how that affects people's, you know, particular likelihood of being employed and how they go through the workforce. And, yep, it's
less exciting seeming than the terminator. But again, when you look under the hood and dig into them, you're like, oh wow, this could actually really really compound inequalities that we see in the workforce under the guise of the idea that these tools are going to make hiring more fair, and that's a massive problem.
Right. So, because, like, the idea of those hiring tools is that they will actually take away these sort of, like, demographic cues that someone might, you know, apply their own biases to, so in fact it is the most equitable way to hire. But is it because of just the kinds of people that are creating these sorts of systems, because they tend to be a bit one-note, that that's inherently where, like, sort of, like, it begins to wobble a bit?
It's a mixture.
So, of course, yes, the lack of diversity in the AI industry is, like, very stark. It's also, sadly, in the UK, an industry where, for example, women's representation is actually getting worse, not better.
So that's a sad.
slap in the face for a lot of the progress narrative that we want to see. But sometimes it's not even necessarily that the people creating these tools have bad intentions, maybe not even that they're using faulty data sets or biased data sets. These are two of the really big problems that are flagged. But sometimes the underlying idea behind their product is just bogus, like, it's just a really bad concept, and yet somehow it gets brought to market
because of all this hype around AI. So with the video interview tools that we look at, for example, they basically claim that they can discern a candidate's personality from their face and from their words. So, how they talk, how they move, they can decide how extroverted you are, or how, you know, open you are, how neurotic you are, how conscientious you are, all these different markers of personality. To which I would say, firstly, no, there's absolutely no way an AI can do that. This is just a very old kind of racial pseudoscience making its way back into the mainstream, saying, okay, we can totally guess your personality from your face. Like, it's like your friends looking at someone's profile picture on, like, Tinder or whatever and being like, they look like they'd be really fun at a party. Like, it's about that level of
Accuracy, you know.
And then second, is that even a good way of judging if someone's gonna be good for a job? Like, how extroverted do you want a person in the job to be? Maybe in your job that's really, really helpful. In my job, I don't know how helpful it would be. So there's just kind of a lot of flaws at the very, you know, bottom of these products that we should be worried about.
Just like a C minus level job hiring process. Like that's what I feel like so many of the things, like when you get down to them and see them in action, they're like not that good. Like it does feel like the whole thing is being hyped to a large degree. And like that's something I heard from somebody I know who like works in you know, like all of my friends who work in finance or like any of those things. Like my brain shuts off when he
starts talking about what he does. But he was saying, like, he pays attention to the market, and he was saying there's, like, a big thing propping up the stock market right now, is AI. And it really is, like, that... that's where so much of people's wealth is tied up, is in, like, the stock market, and it's just tied to, like, what you can get people excited about in a lot of cases. So really, like, from that perspective, the incentive structure makes sense.
Like you want people talking.
About how your AI technology can do all these amazing things because that literally makes you have more money than you would have if they knew the truth.
about your, your product. Without being like, yeah, wait, how many seven-fingered Trump pictures can we create? And, right, be like, yeah, man, fucking pump millions into this.
Yeah, yeah, I mean, that's kind of something that's really come out of the last few years, is how many firms just use the label of AI to get funding. Like, I think there was a study a couple of years ago about how many European AI startups didn't use AI, at which point you're like, well, what are you doing?
Well, we could, though. This podcast is actually an AI podcast, because eventually it could use AI. And before we were recording, Miles was actually putting in... he asked an AI to pitch him an episode of Friends in which the cast, and, you know, the people on Friends, on the show, deal with the fallout from the events of nine eleven. And it wouldn't do it. So we can't quite claim that we are an AI podcast yet, but.
But it did, it did do it when I said, do it, pitch me an episode where Joey and Monica drive Uber, right? From... yeah. And it did, so clearly... because you can see where these guardrails are. They're like, don't do nine eleven stuff, though, that's right out.
But I mean, so yeah, I think there's two things we're talking about here. Like, from, from one perspective, like, yes, you could put it in the category of, like, well, yes, the wheel makes racism or colonialism more... frictionless is a word that gets used a lot, but, like, literally in the case of the wheel, frictionless. But AI and, like, a lot of technology is designed to, you know, make groups of people and, like, our interactions and the things that
make people money more frictionless. And that's something that you guys have talked about on recent episodes of Good Robot, Like there's this one example that really jumped out to me that I think was from your most recent episode, or at least the one that's up most recently right now as we're recording this, where you guys were talking about a company that asked a regulating body to make an exception to a law around like a high risk use of AI, and the law said that people had
to supervise the use of AI like just because it seemed dangerous, and the company appealed to the regulating body by saying, well, we just like that would cost too much and we would never be able to like scale this and make a profit. And it feels to me, like our answer as a civilization to that complaint needs to be that's not our problem, Like that's then you
shouldn't be doing it. But instead, it seems like the answer too often, not just in AI, but just across the board, especially in the US, is like, Okay, well we have to make an exception so that they can make a profit around this technology, or else the technology won't get developed because the only thing that drives technological
progress is like the profit motive. But that's you know, as I think you guys talked about in that episode, that's never been the best way to develop technology, Like it's it's been a good way sometimes to democratize existing technology, but like that's I don't know.
I feel like that.
idea of, you have to make it profitable, you have to make it easy on these companies to keep trying different things for AI to become profitable, is baked in at a, like, cellular level at this point in how a lot of, you know, Western colonial civilization works.
Yeah, I mean, I think too often a lot of the technologies that shape our daily lives are made by a very narrow set of people who ultimately aren't beholden to us. They're beholden to their shareholders or to their boss and right, so they don't really have our best interests at heart, right, Like, for example, take this whole rebranding of like Twitter to X by Musk. I remember waking up and finding my little Twitter bird replaced with
this huge X and just being like, ugh. Firstly because it was part of, you know, Twitter's slow decline, but secondly it made me feel pretty disappointed, or really aware of the fact that one guy could have such a huge impact on how literally millions of
People use a social networking.
platform that's actually super important, you know, to their daily lives and has played a huge role in activist movements and fostering different communities. And I think that's a story we see time and time again with some of these big tech companies, which is, not only do they have their own profit motive at heart, they also, they're not beholden in any way to the public, and they're not being compelled by regulation to make good decisions that
Necessarily benefit the public.
So I think a really important question going forward is how do we support kinds of technology development that are very much based in the communities that the technology is for. I think one really big part of that is recognizing that so many AI models, as you mentioned right, they're designed to be scalable and that's how they make money. This idea that you can apply them globally and universally, and I think that's a big problem, partly because it
often is really homogenizing. It can involve, like, exporting a model from the US, usually, out to the rest of the world.
It's probably not actually appropriate.
to use in those contexts. But also, a lot of really exciting and good uses of technology, I think, come from these really localized, specific, community-based levels. So sometimes I think it can be about thinking smaller rather than bigger.
Yeah. Yeah, that was, like, another thing that struck me about, like, just all the warnings, and even in that pause letter, is sort of, like, the presumption that it's like, well, all you motherfuckers are going to use this, so we got to talk about it. Where it's like, I don't know, I don't even fucking know what it is. Like, a second ago I thought it was Skynet, and now, like, you know, you have your company being like, yeah, we now have enterprise AI tools, like, welcome. You're like, but what am I... huh? Like? And I think that's what's a really interesting thing about this as, like, a sort of technological advancement, is, before people really understand what it is, there is, like, from the higher-ups, the powers that be, sort of going into it being like, well, this is it, like, everyone's using it. But I'm still not sure how.
And I guess that probably feeds into this whole model of generating as much, you know, excitement, market excitement, about AI, by taking the angle of, like, everything's different because everyone is going to be using AI. Most of y'all don't know what that is, but get ready. And I think that's what also makes it very confusing for me, as just, like, a layperson outside of the tech sphere, to just be like, wait, so are we all using it? And even now, I really, I still can't see what that is and how that benefits me. And I think that's a big part of, I'm sure, your work too, or even, like, any ethicist's, is to understand, like, well, who does it benefit? Like, first, we're making this because it benefits who, and how?
Yeah?
I think, yeah, I mean, right now it benefits the companies that are making it. And it sort of feels like that's the way it's being presented, or slyly being like, yeah, you guys are gonna love this, but really it's, we're going to benefit from the adoption of this technology.
Yeah, I mean, I think that's the crucial question, is this stepping back and saying, actually, is this inevitable? And do we even want this in the first place? And I think that's what really frustrated me about the pause letter, and about a number of kind of big tech figures signing on to it, is that they're very much pushing this narrative of, like, oh, this is, like, unstoppable and it's inevitable and it's happening, we've got to find a way to deal with it. And it's like, you're making it. Like, you're the people literally making these technologies in a lot of cases. So do you really think it's an existential risk to humanity?
Is it the funding? Could it even be that simple?
But that's, you know, what makes me really then question their motives in sort of coming forward with a lot of this kind of very doom-and-gloom language. I think it's also interesting if you look at, for example, countries' national AI strategies. So if you look at, say, like, China and the UK and the US, and these countries that are now thinking about what their national long-term AI strategy is going to be, they also very much frame it around the idea that AI is completely inevitable, that this is going to be the transformative technology for imagining the future, for geopolitical dominance, for economic supremacy.
And again, I think, as an ethicist, what I really want people to do is step back and say, I think we're actually at a crossroads, so we can decide whether or not we think these technologies are good for us, and whether they are sustainable, whether they are a useful long-term thing for our societies, or actually whether the benefits of these technologies are going to be experienced by very few people and the costs are going to be borne by many.
Right. We talked last week about the scientific application that, you know, used deep learning to figure out the shapes of proteins, the structures of proteins, and that could have some beneficial uses, will probably have some beneficial uses, for, you know, how we understand disease and medicine and how we treat that. But there are ways to probably differentiate and think about these things. Like, you don't just have to be either a Luddite or, like, AI pedal to the floor, you know, let's just get out of the way of the big companies. You know, it feels like... but it is such a complicated technology that I think there's going to be inherent cloudiness around how people understand it, and also manufactured cloudiness, because it is in the overall system's benefit, the overall system being, like, capitalism. It's in their benefit to generate, like, market excitement where there shouldn't be any.
Basically, yeah, I mean, I think it's easy to generate this kind of nebulousness around AI because to some extent, we still don't really know.
What it is.
It's still more of a concept than anything else, because the term AI is, like, stretched and used to describe so many applications. Like, I spent two years interviewing engineers and data scientists at a big tech firm, and they would sort of grumble, well, fifteen or twenty years ago, you know, we didn't even call this AI, and we were already doing it, it's just a decision tree, you know. Again, it's kind of part of that branding. But also, we have these, again, thousands of years of stories and thinking about what an intelligent machine is, and that means we can get super invested and super cloudy very, very quickly. And yeah, I don't want people to feel bad for being scared or being cloudy about these technologies, like, it is dense and confusing. But at the same time, I do think that it is really important to come back to that question of what does this do for us. So, you know, the question of, like, Luddism, or being a Luddite, I think is really interesting, because, you know, personally, I do use AI applications. There's certain things about these technologies that really excite me. But I'm really sympathetic to some of the kind of old-school Luddites, who weren't necessarily anti-technology, but were really against the kind of impacts that technology was having on their society.
So the way that new technologies, like, I think it would be things like spinning and weaving, were causing mass unemployment, and the kind of broader ramifications that was having for people in the UK socially, and
That kind of has quite a scary parallel to.
Today in terms of thinking about maybe what AI will bring about for the rest of us who maybe aren't researchers in a lab but who maybe might be replaced by some of these algorithms in terms of our work and our output.
Yeah. Can you talk at all about open source, like, models? Because, you know, when we talk about this idea that corporations have all this power and are incentivized to do whatever is going to make the most money, which in a lot of cases is going to be the thing that removes the friction from consumption decisions, and, you know, just how people interact and do these things. Which, well, as you guys talked about in your episode, like, removing the friction... Like, friction can be really good sometimes. Sometimes your system needs friction to stop and correct itself and recognize when bad shit, when things are going wrong.
But you know, there's also a history in even in the US, where corporations are racing to get to a development and ultimately are beat by open source models of technological organizing around like getting a specific solution. Do you have any hope for like open source in the future of AI.
Yeah, I mean, I think I'm really interested in community forms of development, and I think open source is a really interesting example. I think we've seen other interesting examples around things like collective data labeling, and I think that these kinds of collective movements on the one hand, seem like a really exciting, community based alternative to the concentration of power in a very very narrow segment of tech companies.
On the other hand, though, I think community work is really hard work.
We had Doctor David Adelani on our podcast, who's a very important figure in Masakhane, which is a grassroots organization that aims to bring some of the over four thousand African languages into natural language processing, or NLP, systems. And he talks a lot about how the work he does with Masakhane is so valuable and so important, but it's also really, really hard, because when you're working in that kind of collective, decentralized environment, it can be much slower, and
as you said, there can be a lot more friction in that process. But counter to this move fast break things kind of culture, sometimes that friction can be really productive and it can help us slow down and think about, you know, the decisions that we're making very intentionally, rather than just kind of racing as fast as we can to make the next newest, shiniest product.
It's almost like, in your work too, you know, you talk about how, you know, like, looking at these technologies, especially through a lens of, like, feminism and intersectionality and, you know, BIPOC communities and things like that. And I know, like, broadly in science there's, like, you know, there's an issue of, like, language hegemony in scientific research, where if things aren't written in English, a lot of times studies just get fucking ignored, because, like, I don't speak Spanish, or I can't read Chinese, therefore I don't know if this research is being done, and therefore it just doesn't exist, because the larger community is like, we all just think in English. Like, so how do you, like, you know, specifically, because, you know, when hearing the description of your work... help me understand, like, and the listeners too, like, how we should be looking at these things from that, also from that perspective too. Because I think
right now we're all caught up in, like, is it talking Skynet, and it's not, you know. Hold on, like, there are other subtleties that actually we should really think deeply about, because, to your point, I feel like those are the dimensions of an emerging technology or trend or something that gets ignored, because, to your point, it's like the thing, well, of course it's unequal, of course it's racist or whatever. But what are those, like, what are those ways that people need to really be thinking about this technology?
Yeah, I mean, I think English language hegemony is a really good example of this broader problem of the more subtle kinds of exclusions that get built into these technologies. Because I think we've all probably seen the cases of AI systems that have been really horrifically and explicitly racist or really horrifically sexist, from, you know, Tay, the chatbot that started spouting horrific right-wing racist propaganda and had to get taken down, through to Amazon's hiring tool that systemically discriminated against female candidates. These are really, I think, overt depictions of the kinds of harms AI can do. But I think things like English language hegemony are also incredibly important for showing how existing kinds of exclusions and patterns of power get replicated in these tools. Because to an English language speaker, very crucially, they might use ChatGPT and think, this is great, this is what my whole world looks like, if they only speak English. Obviously, to anyone who is not a native English speaker or who doesn't only speak English, it's going to be an incredibly different experience. And that's where I think we see the benefits of these tools being really unequally distributed. I think it's also important because there are such exclusions in which kinds of languages and forms of communication can get translated
into these systems. So, for example, I work with a linguist at the University of Newcastle, and she talks about the fact that there's so many languages, like signed languages and languages that don't have a written form, that are never going to be translated into these tools and never are going to benefit from them. You might think, okay, well, do these communities want those languages translated into an AI tool?
Maybe maybe not. I'd argue, of course that's up to them.
But those communities are still going to experience the negative effects of AI, like the climate cost of these tools, and so I think it's just really important, like you said, to think about what kinds of hegemony are getting further entrenched by AI powered technologies.
All right, great, let's take a quick break and we'll come back and.
Finish up with a few questions.
We'll be right back. And we're back. And DARE is somehow still a thing. Having been born in nineteen eighty and, like, lived... like, all my early memories are in a decade where it was just, your brain on drugs, like, just this weird, straightforward, just say no to drugs.
I thought drugs were a.
Thing people were going to make me do, like force into my blood stream when I.
Got to high school. Well that's only Seattle, Jack, If you drive through Seattle, that will happen to you. That will happen. They will force it actually through your car. They have through your car window.
Yeah, well, you can't... you have to have your windows up. Come on, they're, they're gonna blow heroin into you, and you, you have to have the air recirculating in your car.
Heroin gets in. But so, I, I think we assumed that the DARE program went away, most of us, right? Until I saw, I saw some people outside of a Best Buy. Like, Best Buy. Yeah, did you see that? What, you saw their people outside of the Best Buy?
Yeah, it was the Best Buy. Yeah, their people stand outside of the Best Buy. I don't know if they have, like, like, there's a contract, right? Yeah, but that's where I saw their people too.
I've seen their people outside two Best Buys before, like, trying to get you to sign something, and all I said was, like, y'all are... what the fuck? I was like, no, I'm here, I'm here for a mouse pad. Now leave me the fuck alone.
So yeah, yeah, I mean, there, there's obviously an epidemic of overdose deaths. Like, the thing that Pee-wee Herman was saying in that PSA that we listened to up top, that was not true of crack. Like, crack didn't have, like, an epidemic of ODs, I don't think, but it did... like, that, that is happening with fentanyl. Like, fentanyl is causing a massive spike in overdoses, like, and you don't know if there's too much, and people... like, that does seem to be, like, the thing that they told us about drugs in the eighties has absolutely come true.
It's... yeah, it might be, but one will happen and we will go fucking hog wild over this shit.
Yes. So they've decided, okay, this is real, no more fucking around, let's go back to that idea that was proven to not work and was just a way to funnel money to cops, right, essentially. Yeah. So the DARE program is being brought back to several school districts following a prolonged absence, this time with a specific focus on fentanyl. Doesn't seem like they've learned anything
from the first time. Like, the thing that people say was wrong with the DARE program in the first place: first of all, that it was cops, cops going into classrooms. Yep, yeah, that part. Second of all, they were doing it too early. They were targeting kids who weren't really... they're trying to, like, get it in kids' brains before, like, drugs were a real thing to them.
It's just like it. That's why it didn't work for me.
My drug was Power Rangers at that point, right? Like, I have no idea what a crack pipe is, you know.
Yeah, totally or like I knew it from a movie. But like other than that, I was like, yeah, I mean I've seen seen New Jack City, but that's not that's not how I live. So yeah, because I think what I think, I was in fifth grade.
Yeah, same, I was probably in, like, fourth or fifth grade when they started pushing DARE down my throat.
And I just remember being like, this is, if anything, you're educating me on, like, how drugs are... Yeah, that's all that was. That was, like, my first real conversation about drugs, where, like, the cop brought drugs into the classroom and we passed it around. And, smell that pipe, that's the worst smell you're ever gonna smell. I remember saying that. I was like, that smells good, actually. A recipe for failure. But yeah, yeah, it was.
I didn't realize it was created by the LAPD Chief of Police, Daryl Gates, baby, whose thoughts on recreational drug users were that they should be taken out and shot.
Mmm. So, well, okay, a very, like, humane approach to, yeah, drug use. Way before Duterte. Yeah.
Duterte got his shit from Daryl, the Johnny Appleseed of the DARE program. But at its height, DARE was being taught at a whopping seventy-five percent of schools. By nineteen ninety-eight, it was reportedly costing taxpayers six hundred million dollars a year. Yeah. And the presentations were just full of racist dog whistles, or just not dog whistles, just racism, like pointing to the broken Black family as the source of the drug crisis.
Yeah, it's going to, at my white school, have them be like, Miles, you know about that? Yeah?
Yeah, do your parents, your family? Yeah, you're like, I have no idea what this is.
I would see your mom picking you up.
Why, yeah, with your dad. You're like, man, get out of my line. But the thing that always bugged me about DARE was, like, it's like the gum theory. I don't know if you all are familiar with what I call the gum theory. It may not be real, but I think about it. It's like, when you tell some... Look, when you tell kids to not chew gum in school, they're gonna chew gum because you're
Telling them underground gum racket.
Yeah, there was a whole like I was the girl that was bringing the gum to school to give to everybody.
Like what are you so like?
DARE was always, like, it was so dumb to me, because I'm going, you're telling kids not to, quote unquote, do something, and all you're doing is you're just piquing their interest. But yes, it was very much riddled with anti-Blackness, and that is something that I've always had an issue with when it comes to this program.
Yeah, you're paying the least cool, least trustworthy people in your community to come into school and talk to the, the part of your community that most despises them, right. And
they put drugs on them. They put drugs on people. So how are you going to tell me... Yeah, I don't know. I've always had issues with DARE. I've always had issues with the police. But I will say very openly that I think DARE is just a way to make Black people look worse than what the world already does.
So yeah, just reinforce, like, the police perspective of what drug use even is. Yeah, it's, it's Black people, mostly. And then also, like, getting kids to snitch on their parents and shit like that. It's like, they had... I remember them having a box that you put anonymous notes into, the DARE box, and then at the end of the class, the DARE officer would read the questions. And this was just a thing that smartasses used to, like, make them read stupid shit for fun. But yeah, apparently it ended, like... that DARE box resulted in children, like, narcing on their parents and being removed from their own homes by social services, because they revealed that their parents were taking drugs, aka smoking a little bit of pot. So really cool problem solving there, creating fucking... just, situations.
But like to doctor John's point, like it it's like verifiable that DARE did fuck all to prevent kids from using drugs, right.
Yeah, yeah, yeah, the stats are officially in and it either had no effect or made children slightly more likely to use drugs and alcohol at an early age.
I'm, I'm telling you it did, because I legitimately wanted to smoke PCP after what he told us. Yeah, he said, I'm not joking, I'm here with my best friend, I know, but this is my fucking, like, my eleven-year-old brain, right? First of all, the first thing we always asked him at the top of every DARE class was, can we hold your gun? Yeah? He never... and he was like, no, there's no way. But we asked it every time. He's like, you guys have to stop asking. Well, can you pull it out and show us the bullets? No. He's like, we have to get going. He's like, what is that? And then, like, we had kids who, like, knew guns, and he would always get annoyed. But so he talked about this, like, so we're gonna talk about PCP, angel dust, like that. I got this exact same speech, it's crazy. He said he went to a call at a Jack in the Box where a guy had just beamed up, smoked some PCP, and was fighting the police there, and they had... they needed backup, he said. When he got there, this guy who smoked PCP lifted a fucking dumpster above his head, I remember that story, and threw it at a cop car.
And me and my friends were all like, oh. We were like, yes, it gives you powers? Yeah. We're like, how does it give you... how does it give you strength? He's like, I'm not sure, it's like the drugs, the drugs do it. It's like... but then, like, we're like, what is it, like, your muscles? Like, could you work... like, I remember asking, like, if you worked out on PCP, would you keep that strength? And he was, again, so irritated by my curiosity around PCP-induced superhuman strength. But yeah, he just got mad. He's like, God, this is not a cool thing. And, and it's funny that everybody, I guess somehow everybody, heard this version of PCP gives you superhuman strength, to the point where, when YouTube came out, I was, I was scouring YouTube for footage of somebody on PCP with that kind of strength. Never found it, and I was like, was this cop lying to me? And he was. Oh, oh, mm hmm, you believed that shit? Yes. Cops lie.
And so as the stats are coming out, the DARE program is doing their own research, you know, and they're like, all right, well, we're gonna get the DOJ to fund this study to see what the like, what actually happened, what's actually going on here?
Right?
And a month later, for the first time in memory, the DOJ refused to publish a study it had funded. For the first time ever. Wow. Because it found that the DARE program was useless and was taking money away from programs that actually worked. Behind the scenes, DARE went ballistic, and they just kept saying this one line every time they, they get attacked. The DARE program is like, criticizing DARE is like kicking your mother or saying that apple pie doesn't taste good. And then, like, a decade later, they were criticized, or, like, proven to be fraudulent again, and their response was, it's like kicking Santa Claus to me. We're as pure as the driven snow. So, like, we're as pure as the Colombian cocaine we put in front of children. Your weekend Colombian flake.
I just want to know where the money went. Like, there was a lot of money that was being moved in the nineties around there, and I'm just... well, you know where it went, right? Right to the right place?
To corrupt cops. I'm wondering about where, where... like, did it go into some, like, benevolent police fund, I wonder? Or it was just, like, if you got that DARE gig, they're like, oh shit, man, you're off the street, the street beat. Oh, you're just doing fucking DARE classes. Oh, that's right. Right, there they were.
They were confiscating luxury cars, like Porsches and BMWs, and turning them into DARE vehicles, like, with a DARE decal on it, that they got to drive around. They were like... they're like, yeah, well, it becomes pretty apparent who's winning here, though, you know what I'm saying, right? And then the kid's like, wow, the police department buys you guys cars like this? No, we can't afford that.
Drug dealers get rich enough to buy cars like this, And then you're like, oh, Okay, that's the wave. Yeah.
In reality, they were doing the same exact thing that drug dealers were doing. They were selling you something. It's yeah, it's ridiculous.
But amazingly, DARE still uses the DARE box. Like, they still really have the write-down questions for folks. It's anonymous, purely anonymous.
Psych! We're actually going to investigate your parents if you say that you know what any of these things are.
Right that, uh, that's smell familiar to you? Oh yeah, and where to Like. I could totally see being like, oh, yeah, I know, I know what that smells. That's weed, and then being like, well, yeah, how do you know that? Uh? I my parents? Yeah, just looking panicking and throwing your parents under the bus. Yeah, I don't know. I don't know. I was at a reggae festival recently. That's why I know. Next question, no questions. I need a lawyer. Offer for
the questions. You're honored. Yeah, all right, that's gonna do it.
for this week's Weekly Zeitgeist. Please like and review the show if you like the show, uh, it means the world to Miles. He, he needs your validation.
Folks. I hope you're having a great weekend.
And we will talk to you all Monday. Bye.