Hello the Internet, and welcome to this episode of the Weekly Zeitgeist. These are some of our favorite segments from this week, all edited together into one nonstop infotainment laser-stravaganza. Yeah. So,
without further ado, here is the Weekly Zeitgeist. Miles. Yes, we are thrilled, fortunate, blessed to be joined in our third seat by an assistant professor of Technology, Operations and Statistics at NYU, where his research focuses on the intersection of machine learning and natural language processing, which, of course, means, you know, his interests include conversational agents, hierarchical models, yes,
deep learning, spectral clustering... you guys are all probably finishing this sentence with spectral estimation of hidden Markov models, yes, obviously, and of course time series analysis. Please welcome Doctor João Sedoc. Doc... Doctor João.
Hi, thank you. Yeah, thank you, Jack and Miles, for having me. It's a pleasure to be here.
It's a pleasure to have you.
We're glad you're here, because we got... we got a lot of questions about a lot of stuff, some... mostly to do with what your area of expertise is. Some things I just need a second opinion on, just around...
The career of Allen Iverson. Is it... is it a bad joke if it's been used in, like, a cable... I believe it's a DirecTV commercial that's using it. They're like, and we have our very own AI that helps you select shows, and then Allen Iverson's sitting on the couch with the person. Wait, does that make it off limits? Yeah? Yeah, that's... that's a real commercial. And I'm going to be running with that joke all day today, because I'm not any better than the comedy
writers at DirecTV. I can never know how many it is. Yeah, but what about Doctor Sedoc? Is that good? Doctor Sedoc? Is that a good way to address you?
Yeah?
Sure, I'm very easy about things. Okay, okay, how about Doctor...? Well, I've never been called that. I had some students in my class call me J. Wow.
Wow. Yes, yeah. Is it João?
So the pronunciation is João. It's Brazilian Portuguese, that's what I'm named after, but I go by... I go by João. Joking about it, Doctor J even. Doctor J.
Amazing. All right. Well, yeah, I mean, the main thing we've brought you here today is to ask the big questions around AI, such as: what is that? What is AI? And we'll follow from there. But, you know, questions that reveal that we're smart people, obviously. Hey, Molly, what's something from your search history that's revealing about who you are?
Well? Right now, it's, uh, Crossroads by Bone...
Thugs. Bone Thugs-n-Harmony, yeah. You did point out that actually the first lyrics of the song are bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone.
Bone, bone, bone, bone bone. I'm also just reading about remembering about the video that there's a part where there's like a newborn baby that's died.
Yeah, and I don't know, people love when newborn babies die in their music videos, and...
The Reaper takes them all to heaven.
Yeah.
Wait, isn't there one part where, like, someone touches the dude's eyes and... no, they touch the forehead and the eyes go black, and... oh, that's right, and that's how you know God got them?
Yeah?
That to me, I was like, I don't know if I want God to get me like that? That seems pretty.
Look, I'll be... I'll be real. I won't pretend to have looked up something cool. The most recent thing I searched for was the pop star Tate McRae, who has a viral TikTok song, because I was
like, who is Tate McRae, please help?
She has a viral TikTok song. Her name is Tate McRae. Okay, and she's a Canadian pop star. And so the video is set in a hockey rink, because it's about... oh hell yeah, she's getting back at her hockey player boyfriend who just cheated on her.
Oh no, wow, I've never seen someone look more like their name.
Yeah, look more like a Canadian pop star named Tate McRae.
Yeah.
Like, I was like, if you just showed me, like, who's this person? I'm like, that's Tate McRae, Canadian pop star.
She's b Olivia Rodrigo.
Oh okay, but she just had a song.
That went really viral on TikTok, and so I was like, who is this person? The song is called Greedy, and it's like she's...
The video looks like it is a Canadian spoof of Hit Me Baby One More Time.
It does look like that. It's really funny.
It's like, what if hit Me Baby One More Time but Canada?
Yeah, kind of, it's like if the Grimes video wasn't fascist. It's like, I like when there's like a like a sports.
Oh, like that's that first video she had like at the Monster Truck rally or whatever.
Yeah, where she was like I love Arenas.
Yo, wait, where did they shoot this video? Is she based in LA?
Tate McRae? Yeah. Why, do you recognize it?
Yeah, yeah, this is where... this is where we used to play hockey, in Burbank. I think they shoot everything there.
Like Pickwick?
Yeah, yeah, the Pickwick, sure... the Pickwick ice skating place.
I'd have to see... I just went
there for the first time. Dang. Yeah, there was a big hockey boom in Los Angeles because of the Kings. Yeah, yeah, you know this, but children of the Valley loved to go play hockey in a climate where hockey has never been played.
Yeah, it's true. It's it's so true.
And, uh, who else... I remember the other thing. Like, I think I was, like, honestly, I think, one of the first Black kids to ever play hockey, because this was, like, in the fucking eighties.
I was getting on the ice, and then Kurt Russell and Goldie Hawn
were like, you just improved the game one thousand percent.
It was... we were such a ragtag group. It was, like, us, like, some Armenian kids, some Canadian kids whose, like, parents are, like, trying to...
The plot of The Mighty Ducks, by the way. Yeah, exactly, exactly. The blazing group... this coach came back, he had just got... one of them was a girl. We had two Mexican kids, José and Joel. You know, we were the team. The photo looked fucking lit. I don't know, it was like this was a movie. Yeah, it was, and it was the eighties. Literally The Mighty Ducks.
But also it should be a movie about you and your ragtag group of valley friends.
Going to New England and getting real good at hockey.
Playing against the rich kids who have all the hockey gear.
Yeah, right, exactly. Right, Minnesota. They're just, like, mean Canadians... like, just... people want to root against Canadians.
This video is funny because it's like, I like, I've never seen anyone do the aesthetics of hockey in a pop music video before, and it's like so Canadian obviously, but there's a part where she's wearing like a goalie glove. It's like she's dressed like sexy, but she's.
Wearing like pieces of hockey gear.
It's really funny. Yeah, it's like she's wearing like like a crop top and basketball shorts.
Kind of, yeah. There's, like, this lone old, like, Gordie Howe hockey glove. She's, like, holding that. I'm like, yo, this is so old. It's like, okay, okay, like, Jason Voorhees's hockey gear is a little... Yeah, it's really funny.
It looks like, like, the Thanos glove or whatever, too. It was just, like... it's, like, not sexy. And then she's driving a Zamboni.
Jamie Jamie, Jamie Loftus's influence is felt across much of the pop music landscape.
Yeah, Jamie's... it's very Jamie Loftus-coded of Tate McRae.
It's kind of funny, like, the last shot being her driving a Zamboni, like, very... like, in a very weird, like...
Yeah, it was like it's about being confident and sassy taking control.
It's pretty funny. It's funny, and it's like a good.
It's a pretty good song. I heard it on TikTok a million times and was like what is this And now I know.
It's Tate McRae. Now our listeners know, and that is the sort of shit that they were not going to hear from us, I can tell you that much. So this is a great search history you are bringing. I'm bringing the pop corner. Well, it's just, like, usually the things... Okay, can I tell you the other thing I looked up? This is more the type of thing I normally look up. But I found out that... well, maybe... whatever, never mind. I'll tell you guys later. Oh, I like that, I do. I don't want to make this
part go too long. I'll bring it up in the jet fuel part.
Okay, jet fuel doesn't melt steel beams. Nine eleven was last week. Anyway... what is... what's something you think is overrated?
You know what I think is overrated is celebrity worship culture in America.
It is terrible. What is that? But we gave up on God, so we need new gods.
Live your own life though, right, It's like really, yeah, I don't know, I don't get it.
What aspect? I mean, what spurred this in you? What did you observe, and you're like, we're doing this again, huh? We're bowing, prostrate at the feet of the Kardashians.
It's a tiny bit personal too. Well, we'll get into it later. But yeah, I mean, I don't... I don't want to get into specifics, because I don't want to name names.
But okay, it's for instance.
It's like, I'm an artist, right, and I'm very much an independent artist, and I do sometimes work in mainstream things, but a lot of the art that I create is very much independently done. I have friends who are like, oh my god, I can't come see your show, because this weekend I'm going to see Adele and I paid a thousand dollars for my ticket. I love... I love Adele, but I'm like, bitch, Adele does not
need your thousand dollars. I need you to put in twenty dollars and bring five of your friends and come see my show. So it's like that kind of thing. And, like, right now I'm dealing with this again. I have a feature film coming out, and we just got told by the people we're working with on the marketing end, they're like, oh, well, there's, like, two really big films coming out that weekend, guys.
So it's like, who's gonna come see yours? And I'm like, great. Like, we need, like, more of this, like, you know, glitzy, like, whatever.
Yeah, yeah. Wait, does The American Nation come out October sixth? Is that what I read? Yeah. And we talked about... we talked about the congestion of the calendar, the film calendar, because Taylor Swift's movie comes out the thirteenth, and that caused, like, a ripple effect, with every other studio being like, all right, that scary movie we had, we can't do it on Friday the thirteenth anymore, because Taylor Swift is entering.
That... that's the second scariest date: ten, six, twenty-three.
So yeah. So that's another thing that I'm dealing with in my film right now. I lost theaters and screens because of Taylor Swift, for one. And then there's another really huge Bollywood film that's coming out.
I'm like, God, damn it.
That's also an audience South Asian, right, So yeah.
Just god, is it unique to America? Would you say the like celebrity worship? Is it more in America than other places that you've been.
Dude, it's so weird in America too. Like, why are we obsessed with the Queen and, like, the, you know, the royalty in the UK? It's so weird. Like, people are having these parties to, like, watch marriages overseas, and I'm like, what is happening? Yeah, yeah, it's, like, weird in America, but, you know, I think... everybody does it everywhere, right? The whole world.
Yeah. Or people just getting so caught up because they share the same birthday as one of the royals. I think that's really just weird behavior, for sure. You know, Miles has the same birthday as... has the same birthday as Miles. I thank you, thank you so much, thank you so much.
But yeah, no. And I think, like... but it is everywhere too. Like, look, you know, I'm Japanese too, and I look at Japan: we have, like... it's called idol culture. Like, we literally call these celebrities fucking idols, you know. And it's just this, you know... I think, why...
So much responsibility on these poor little people too?
I don't know.
Maybe some of them also just want to make art, guys, they just want to They just let them, you know what I mean.
Yeah, I feel like they don't mind it, though. They like... they don't... It's like, I think there's a trap where they, like, want it and pursue it with the single-mindedness of Captain Ahab. But then, like, once they get it, they're like, oh, my life is over, I don't have, like, a life anymore, right? Or like...
I can't go to the store or every person I interact with I have to worry that they're trying to get something from me because of the status. But hey, it comes with a lot of money to rest your sad little head on.
Usually when you ask them, they're like, yeah, but I would not trade it for anything. Yeah, go back to being like you? No, no, no, no, no, no, no. What is something probably that you think is underrated?
Okay, like I said, I'm very bad at self-care. I just started using... I have two dogs and they shed a lot. I just started using a Roomba. It's like, let the robots take over, all right? You're doing it better. Okay, this place is so much cleaner than I could ever sweep it up.
Just take my job.
I don't care.
I don't care. You can do it robots.
Is it a new Roomba? Like, how long ago did you get the Roomba?
So my brother gave it to me, so it's an old Roomba, but I just started using it. So I'm, like, so excited.
I'm not going to say anything because it will disappoint you. So I'll just let you learn that on your own.
Wait, what's disappointing about it?
There's just a lot that it can't do that it promises it can, just like my father. No, really, it'll get tangled up, and, like, if you have long hair at all, that shit, especially curly hair, will get tangled up in it. It'll knot up. You'll have to keep cutting it out.
Uh.
There's just a lot of upkeep. And then sometimes it's like... I don't understand, there's a carpeted step, and it will keep knocking its little Roomba head against it.
Yeah, I got it for free and I had no expectations, and.
Oh, see that you're gonna it's awesome.
I believe.
I believe I want everyone to have that first week that you have the roomba where you're like, it's yeah, possible.
Everything is possible.
That's how I feel.
Yeah. And sometimes it will, like, start, like, listening in on your conversations. Like, you'll be like, wait, why is the Roomba in the corner of the room while we're, like, talking about our financial information? And why is it...
Why, why does it have those Boston Dynamics dog legs, and why is it climbing up the walls?
Where'd it get that gun? Yeah, my experience with the Roomba... so, like, I had the same question in my heart the second you brought up the Roomba, but I was like, oh no, she doesn't know yet.
I don't know yet.
Yeah I saw, I saw the light leave.
Both of our eyes dragged too.
I thought you'd be so excited and I was like, Oh, this is so fun.
And then... but every time someone talks about the Roomba, I'm like, is it a new purchase? Because I think they will solve it. Like, I think eventually it won't be bad like that, because, like, there are things it does that are pretty cool, and we've...
We've made artificial hearts and shit. Like, you're telling me we can't find the fucking Roomba solution? Like, I believe in us.
I love I love that we're talking about it like we did it like sports teams. Yeah, of course we made artificial hearts. Like, let's get on it, guys.
Go on. I mean, maybe the most advanced technology in robotics is not going to be purchasable at a Bed Bath & Beyond. Like, maybe I was looking for it in the wrong place. But how...
dare you say that about the Bed Bath & Beyond? What the fuck do you think the Beyond is about?
Exactly That's what the beyond was.
The singularity.
Also feel like I don't care how far technology advances until we make a printer that works, you know what I mean, Like just print, Just print when I want you to print, and that I'll be excited about.
And that I also don't have to be like, oh, I now need to get like new ink or some shit.
Toner or primer or highlighter or whatever the fuck the printer needs that my face
also needs, you know. Like, I know I spend that at Fenty. I don't... yeah, it's things for you.
Zeitgang, for anybody who's, like, ever worked at, like, a... you know, in the government or something like that: do the powers that be secretly have printers that work? Like, do they just have... Like, I feel like there is a lot of, like, you know, designed obsolescence and, like, designed, like, fuck-ups with printers, so that, like, the ink costs more than the printer itself, and, like, that's by design. Big Ink, man. Big Ink. I
feel like they've been saying it. Somebody's probably figured out the printer, and they just, like... it's not profitable for them to sell a printer that's like, yeah, this lasts ten years and just works.
You know, I will say, speaking of like the artificial hearts and stuff. I have been in like academic settings with professors who are like Nobel Prize winners, and they too, who are creating these amazing technologies can only be foiled by printers and PowerPoint presentations. That is the only thing that can defeat a professor successfully that tracks, that tracks, honestly.
Yeah, all right, let's take a quick break and we'll come back and catch up on some news heading into the weekend. We'll be right back and we're back. Shall we talk about what do you want to talk about? You want to talk about private equity? This was this is interesting. I saw a headline that was allow me to pull up the exact headline because I was like, what is this? It said, can private equity be ellipses nice? And I was like, what the fuck is this?
And it's an interview in Slate by Megan Greenwell, where she interviews one of the heads of KKR, which is this, like, private equity firm that just bought the publisher Simon & Schuster. And KKR is doing something that is kind of unheard of in the private equity world: they're offering employees of the companies they purchase an ownership
Stake, and you're like, what the what, what do you mean? What the fuck is this?
They basically built in a deal where part of like the total equity of the company gets put aside for employees, and higher earning employees have the option to buy like additional equity if they want, but everyone gets a piece for free. And the idea here is that when the company sells or goes public, everybody gets a little taste of something, not just the C suite, which is like normal.
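(For anyone who wants the payout mechanics being described here spelled out, below is a rough sketch of the math. The percentages, prices, and headcount are hypothetical placeholders for illustration only, not figures from the KKR, Simon & Schuster, or CHI Overhead Doors deals, and the function name is just invented for this sketch.)

```python
# Hypothetical sketch of the employee-equity payout described above.
# All numbers are made up for illustration.

def employee_payout(sale_price, purchase_price, employee_pool_pct,
                    employee_shares, total_employee_shares):
    """One worker's cut when the private equity firm exits.

    employee_pool_pct: fraction of the equity gain set aside for workers.
    employee_shares / total_employee_shares: that worker's slice of the pool.
    """
    equity_gain = max(sale_price - purchase_price, 0)   # value created during ownership
    worker_pool = equity_gain * employee_pool_pct       # slice reserved for employees
    return worker_pool * (employee_shares / total_employee_shares)

# e.g. a hypothetical $3B exit on a $2B purchase, 10% set aside,
# split evenly across 1,000 workers -> $100,000 each
print(employee_payout(3_000_000_000, 2_000_000_000, 0.10, 1, 1_000))
```

The point the interview keeps circling back to is that every variable here is outside the workers' control: if the exit price is low, the set-aside is small, or the next owner scraps the program, the payout shrinks toward nothing.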
I mean, I feel like tech companies have been doing that with, like, stock... like, allowing employees to, like, have stock and stuff like that, for a while. But then when it comes time to, like, actually distribute it, they always, like, find a way to... Like, these things are tied to, like, forty-page contracts the employees aren't gonna have time to, like, read, or, like, you know, have their legal teams, like, pore over. Like, some of the examples... like, a trucking company. Like,
these truckers don't have fucking legal teams to, like, pore over it and make sure they're not getting fucked over.
But, like, so this one... their golden goose of an example is this company called CHI Overhead Doors, where the private equity firm bought them, everyone had their stake, and when they sold, an average of one hundred and seventy-five thousand dollars was, like, distributed to the... to the, like, rank-and-file employees. Some longtime truck drivers, they say, made as much as eight hundred thousand. And, like, even in speaking to them, like, these truck drivers are like, yeah,
it's cool, like, I made the money. And also, when you have a stake in it, you actually begin to see the inefficiencies in your own business. He's like, I used to get forty cents a mile no matter where I drove, and I didn't care, because more miles meant more money. But when I realized that that could affect the sale price of the company, then we started actually
optimizing things. So it's, like, a very fucking, like, double-edged sword here, because overall it makes no sense given what private equity is about, and, like, their bloodlust for just chopping down the workforce in the name of lowering costs. But the private equity people say it's win-win: workers get the chance for a big payout and a voice in company decisions (not really), and investors get increased staff engagement and retention, which in
turn creates higher profits. So the optimistic read on this is that it's an attempt to make capitalism a little more worker-friendly. The cynical version is that this is just corporate... this is, like, a corporate whitewashing scheme, and it's meant for people to be like, oh, this
is, like... this could actually benefit people. In the interview, the guy from the firm, he's using a lot of maybes and might-bes when it comes to this becoming, like, a huge payout for people, and it's all nice in, like, a hypothetical context. But the outcomes begin to differ based on what happens, even, like, when the equity firm exits the company: like, do they go private? Do they sell it to another firm? And then what happens there?
Does that next firm even give a fuck? So I think that, you know, it seems like, in their one example of, like, the truck drivers making eight hundred K, like, the ownership stake incentivized workers to just start cutting costs down on their own, rather than the private equity firm coming in and doing it. And I think the other thing is that it's huge for retention, which seems to be a huge cost when, like, a new ownership, you know, team comes in. You're just fucking... they're like, man,
fuck this place, I'm out of here. But now it's like, what if we give you a piece? And now they don't have to cycle through thousands of employees to retain and train them. So to me, it seems like a very elegant way of, like, cost savings while appearing to be doing the right thing, because not everyone's gonna get paid out eight hundred thousand dollars,
like the one example they sort of parade around. Yeah, I mean, there are probably versions of capitalism that work. Right now there are none. No, there's not. Like, there's probably, like, a handful of times, out of, like, the millions, that they stumble upon opportunities for it to work. It's called accidental socialism.
Right, yeah, socialism works. It doesn't work because of how you have to exploit somebody for it to exist, right? So somebody... somebody has to get fucked for...
Oh yeah, for it to function. The other thing is that there's
no version of it existing without somebody getting fucked.
Like, in this interview too, you know, very pointedly, this journalist is asking, like, why don't you just raise wages, like, when you come in? Like, how about that, to keep employees?
Like, what... would that work? And then they use this... I'm like, we wouldn't be writing this article if we did that, probably, is what they're...
I know, Megan Greenwell shout out Megan Greenwell.
Yeah is that who wrote this?
Yeah, and she understood, she's got she knows.
She's writing a book right now about how private equity fucks workers. Like... so she didn't go into this interview being like, this is going to be great. Like, there's a lot of really pointed questions.
But yeah, Kim Stanley Robinson like talks about how the like profit inherently, like profit is the only goal of any of this shit, and profit inherently is exploitive cause it's like getting more than your fair share. Yeah, that's the point of profit. That's what profit means.
So in this, they asked, like, why don't you just, like, raise the salaries? Like, that's the easy way to go. And he says, like, look, you know, we manage all kinds of people's money, including teachers, teachers' retirement funds. We also manage wealthy people's money. So I don't want to cherry-pick, but I'm just giving
you a flavor for why this is so tricky. If you're managing teachers' pension money and you want to just raise everyone's salary, that is on the backs of the teachers, which is not ethical, and it's not our money. And that's, like, the fucking rationality they use to just hide behind that. They're like, the teachers... that's the tea... it's the teachers.
We're actually doing it for the teachers of America.
Yeah, yeah, Yeah, Like like you come away from this interview is still kind of being like okay, like this seems not like a win win. It seems like a win maybe win if the environmental factors are precisely correct.
It's like what they're giving you a cut of a company that they're about to run into the ground.
Yeah, but they say, if you guys can figure out how to save money, even better, because then we don't have to be... we don't have to just start doing layoffs.
Like what's happening in Hollywood right now too. It's like the thing that was happening in publishing where they run all the big companies, like your old school publishers like Simon and Schuster into the ground.
And then the company that just bought Simon... yeah, okay, you're talking to the people who just bought them. I think, like, it was, like, a carve-out from Paramount or something. Yeah. So yeah, they know how to plan.
Their plan is to extract profits and get the fuck out of there, and run it all the way down.
Oh yeah, and it's wild too because when this guy says like we're not just here just purely for profits, I'm like, this is then you wouldn't be running a fucking private equity firm, right, Like so I love that, like the words come out this way. But yeah, it is a very it's an interesting, I think scheme from private equity to try and you know, save money in their own ways on employee retention while trying to incentivize people to like stick around as long as possible.
Before the inevitable.
Because then they say stuff like we wouldn't just sell it to like another company who like doesn't have the same ethos. It's like, really, if the fucking money is right, then then what right. I don't think that's the case.
This is funny that like the global corporate media will like find these examples where like if you look at it from a certain angle, it looks like, hey, this thing's working and it's really like helping out the world.
Yeah, anytime they're like, actually, check out this cool.
Thing, it's like, really, no, they're doing something different.
So I'm making Marge Simpson noises.
My stomach starts making Marge Simpson noises. All right, let's cheer ourselves up by talking about how it continues to just keep raining shit on Rudy Giuliani. We've been hearing for a while that he's broke, can't pay his legal bills, and now his lawyers, the people who are supposed to be representing him, are suing him for one point three
six million dollars in unpaid legal fees. He racked up fees and expenses totaling more than one point five million dollars, but he only managed to pay two hundred and fourteen thousand dollars. And his defense is just like, it's, like, really expensive, man. What the fuck costs so much? Are you serious?
Dude?
No way, no way, it's this high. Yeah, no, it is. By the way, so, you know, he himself is a lawyer, and while he was representing Trump, he reportedly charged twenty thousand dollars a day. Okay, so you know he's... I like that, he really is... Anyway, Rudy Giuliani says it's a real shame when lawyers do things like this, and all I will say is that their bill is way in excess of anything approaching legitimate fees. Sure, sure, sure. The lawyer that he stiffed, who is now suing him,
is one of his, like, oldest best friends. His friendship with Giuliani dates back to the seventies, when they were both prosecutors in the US Attorney's Office in the Southern District of New York. Wow. I wonder how that friendship started. Like, where... like, if his friend, like, Robert Costello, now is like, look at you now, Rudy. Look at you, look at you. He got nothing, you got nothing.
I mean we've seen how recently it's been all hands on deck to try and get him more money, because it felt like he was going to Trump and be like I got bills, dude, and like you need to help me because like I know a lot of shit.
Uh.
And then recently there was a fucking one-hundred-thousand-dollar-a-plate fundraising dinner for Rudy Giuliani that
Trump hosted specifically. So yeah... I have a feeling, I wonder... he probably was able to pay a lot of that off. But anyway, who knows, who knows. He's too busy
Being Yeah, I like when the scammers get scammed, get yeah, taken in for their.
Scam their own, by their own, the scammers.
Scam each other, yeah. Or you're just like, none of these people have any loyalty to anyone but themselves. Of course they're all gonna, like, fuck each other over at the end and turn on each other. That to me seemed like the linchpin of the whole Trump thing: it was like, everyone he works with, he fucks over, but it's like everyone he works with is also somebody who would sell him out in a minute.
Yeah, and vice versa.
Yeah, like they're all that's the best part when they all turn on each other at the end.
Yeah, I'm waiting to see how the dominoes fall, because it's every time we hear about this stuff happening in like Georgia or like Florida. There's more people who are like, actually, they do not want to have the same defense team as Trump now and realize that like they were being like nudged to do what was the best for Trump and not for them to be free.
You know, you know how these people, like, sell themselves as, like, brilliant businessmen, and people are like, well, if he could make a billion dollars, like, surely he could run the country? Like, yeah... they're scammers. The thing that they all have in common, that they're good at, is the same skill displayed by, if you've ever, like, been out to dinner with a group in, like, high school, and everyone's suddenly like, nah, I don't... I don't have enough money to pay this. Like, what are you talking about?
Like, you know... you know, like, arguing over the bill like that before a homecoming dance. That is all... that is their skill. They were just the most stubborn of the people who refused to pay for whatever they ordered, and are like, no, you see right here on the menu, it says twelve ninety-nine, and so I put in twelve dollars. And it's like, well, we all also had drinks, and no...
No, no, yeah, we all got those big Coca colas because we're in high school.
Yeah, exactly, through that, like, weird brown plastic cup that had the logo blasted on it. The apps... you ordered the apps, you demanded we order the... You
ordered the mojo potatoes at Shakey's, where we're having our dinner.
Drove someone here, so, like, that costs gas money. Oh man. They'll just argue until you're too exhausted to have... I remember we... Molly, you'll appreciate this. Before a dance, we went to Miss Ellie's, right there across from the Ghislaine Maxwell In-N-Out. Miceli's. Yeah, yeah, yeah, yeah. But that's how we say... that's how we always said it at school: we're going to Miss Ellie's, with the one seed.
That's why you would always be like, oh, I say it... I say it the accurate Italian way, yeah, better than me.
But so we went, and I remember one guy, just one dude, straight up just fucking denied that he ordered the food. And you're like...
The that's so funny.
It was like, you know what, yeah.
I would have to respect that because you're like friends. Well he's like, well where is it?
Then? Now shout out to my friends from you know what they do now, and we all, every single one of us was just like, nah, huh huh, I don't give a fuck. I'm not just gas grade apple.
Well it wasn't me that it was your idea. We came here because you said, yeah, it is not me. Well she's not, so we can't get the discount.
Have you been to Miceli's? No? Oh my god, it's amazing. There's one in Hollywood too. It's like... the waiters also sing.
Yeah, oh hell yeah. Yeah.
It's like a piano bar, and your waiter will be like your waiters, like here's your food, and then they like turn around and take a microphone and sing memory from Cats.
Yeah.
Yeah, and the food fantastically subpar so mad, you're so mad.
Yeah, the experience, you know, when.
You walk in, you're like, this food is not tasting great, but it will taste like Italian food as we know it by the American definition.
I love those like mid Italian restaurants from the fifties.
M hmm. Yeah.
But it's also like they...
have... it's named after somebody, right?
Yeah, it's like a fake balcony inside.
Yeah, it's like two inches deep.
Oh we should go, yeah, take you guys there. Yeah, it's great because it's like a karaoke thing, but you don't have to do karaoke.
Is like.
Waiters are just singing. It's so weird. I love it.
I see, Miles, where you get your taste for the old country. Yeah, from Miss Ellie's, as we would always incorrectly say in my elementary school. And we will... we just can't... we can't let that pronunciation go. That guy who was ducking the bill, he went on to have a successful, like, chain of daycare centers. Yeah, my god, profit king, you know. He knows how to profit, man. Yeah, somebody's got it.
Somebody's got to get shorted. In that case, it was you.
Yeah, exactly. And all right, let's take a quick break and come back and talk about something that we're all, every single one of us... every... if you're hearing my voice and you're not in the C-suite at Lockheed Martin, you got shorted on this shit. We'll be right back. And we're back. And Miles, I mean, we talk a lot about what we think this future is going to look like, right?
Yeah, I mean, I share a name with Miles Dyson. Ahem. I started a little thing called Skynet, you know. That keeps me up all night.
Well, but in seriousness... hey, he went out a hero, Miles, you know. I did, I did.
Look, it's all about altruism, you know, at the end. But, like, I think for me personally, right, so much of my... so much of my understanding of science is derived from film and television, because I'm American. And I think, like, with AI, whenever I would think of, like, oh my god, you don't know where this thing's gonna go... the place I think it's gonna go is Skynet from Terminator.
I just see the shot from the sky of all the missiles being launched, like, coming down. I'm trying to have a nice day at the park with my kids. I'm watching through a chain-link fence, and the next thing, this lady's over there shaking the chain link, skeletons
rattling. The kids over there, get her out of here. But, you know, first of all, it's sort of a twofold question: how far off base is the idea of, like... how many jumps am I making in my brain to be like, ChatGPT, next stop Skynet? And then also, because of that, what are all of the ways that people like me are not considering
what those actual, real, tangible effects will be? And I don't want to be, like, the most cynical version, but, you know, if we aren't careful and we have very sort of individualistic or profit-minded sort of motivations to develop this kind of technology, like, what does that worst case sort of look like? Or not worst case, but what are the ways I'm actually not thinking of, because I'm too busy thinking about T-1000s, right?
Like I just to like add on to that when when I think about the computer revolution, like for decades it was these people in San Francisco talking about how the future was going to be completely different and done on computers or something called the Internet, and we were just down here looking at a computer that was like green dots, like not that good of a screen display.
And by the time it, like, filtered down to us, and the world does look totally different, it's like, well, shit, like, that turns out that was a big deal. So, yeah, I feel like I want to be constantly, like, on our show, just asking the question, like,
where is this actually taking us? Because in the past, I feel like it's been hard to predict, and when it did come, it wasn't exactly what the, you know, tech oracles said it was going to be, and it was, like, unexpected and weird and beautiful and banal in weird ways. And so I'm curious to hear your thoughts on, like, once this technology reaches the level of consumers, like, of just your average, ordinary person who's, like, not teaching machine learning at NYU, like, what is
it going to look like? But first but obviously first kind, but first guy, I.
Think it's kind of okay. So, so I think the idea of worrying about some version of you know, this technology going out of control, right is there's so many checks and balances of ways in which we are thinking about and the technology is sort of moving forward that kind of you know, the idea of something emergent and then not only as an emergent, but it's going to take over and try to destroy civilization, right.
And also send... and also send cybernetic organisms into the past to ensure that those people do not grow up to take up arms against, kind of, like, John Connor and
his... yeah, yeah. So I'm a firm believer in being able to send people to the future, but I still don't think that we're going to be able to send people into the past. So that's... that's maybe one problem.
But that's just, like, your opinion, man. All right, it's going to happen. And by the way, I'm the Doc Brown character. I'm going to befriend a young child. Yeah, right, exactly. Marty and Doc had been friends, like, probably, like, it seems like eight years, so, like, when Marty was, like, nine years old, they became best friends. Anyways, sorry about that, getting sidetracked. So, well...
So I think, you know, so what I'm what I'm thinking about when we think about this kind of technology and how it could be, how the technology itself could go wrong. I don't think that that kind of integration is likely, right. I think that the concept of you know, the singularity, it's so far out, and I don't think that we have a good ability or understanding of like consciousness.
And I also don't think we have a really good understanding of, like, okay, you know, why would these large language models start, you know, trying to harm us. So there's all those steps and jumps and leaps and all the intermediate pieces, and it just seems so incredibly unlikely. Now, I know that a lot of the AI, AGI people who worry... their worry is, oh, well, we've got to worry about this tail risk, right, and a human-level extinction event is worth worrying about. And some people
worrying about that is probably not a bad thing. But I just think it's very unlikely, for the reason of all the things in that chain that would need to happen. And we're not very good at robots, also. Robots are, you know, actually much further away.
But that is one... okay, well, we'll talk about it, but put a pin in that, because I want to come back to that. To bad robots, to bad robots, and how easy... I just want to brag about how easy it would be for me to beat one up. But, uh, those Boston Dynamics ones... Yeah, yeah, I got money on you in that fight.
But the thing that I do want to point out is that, you know, the concept of starting to build some of these rules into the large language models, you know, to try to say, okay, well, you know, try to be benign, don't do harm... that's a really super good idea, right. And I think that, you know, even if you don't believe in Skynet, actually believing in trying to incorporate
responsibility into the large language models, I think that is something that's very important. Moving into some of the dangers, though: like, I think actually large language models... and people like OpenAI were actually worried about this in one of the first iterations of GPT... the ability for large language models to create misinformation and disinformation. And, you know, I think that that's a really bad use of the technology and very, very potentially harmful.
And it already seems to be what it's being used for. Like, the ways that, like, companies are trying to replace journalism with it, or, you know, clickbait articles... like, just generate tons of clickbait articles that are, like, targeted at people. Like, it feels like it's already trending in that direction in a lot of ways.
Yeah, unfortunately, I think you're right, that there is this... that there is this bad use of the technology, where it's like, oh, let's make this so that
it could be as persuasive as possible. You know, there's obviously been a ton of research in marketing on, like, figuring out, okay, how do we, you know, position this product so that, you know, you're more likely to buy it, right. And the same thing can now be applied to language: like, okay, you know, for Jack O'Brien, how am I going to make this tailored advertisement just for him, to make him buy this particular product that I'm selling, or to make him not vote?
Right?
Yeah, right, you know those kind of I mean, I think that this is scary, and I think that this is in the now right, which you know, is something that in some sense is going to be hard to like really stop people from using this technology in that direction without things like legislation. You know, there's just some some components that need you know, more legislation, and I think that's going to be hard to do but probably necessary.
Right, Yeah.
I could even see, like, just even in politics, you could be like, okay, I need to actually figure out the best campaign plan for this very specific demographic that lives in this part of this state... and, for sure, and then just imagining what happens. But I guess, is that also part of the slippery slope? That, like, the reliance sort of gives way to, like, sort of this, like, thing where it's like, this actually is gonna...
whatever this says is the solution to whatever problem we have, and, like, just kind of throwing our hands up and all just becoming totally reliant. I mean, I'd imagine that also seems like a place where we could easily sort of slip into a problem, where it's like, yeah, the chatbot may give us this answer.
But well, or or that people are starting to make like very homogeneous decisions and choices right where you would make, you know, many different choices because you already have a template that's been given to you by you know this AI. You're like, oh, okay, well, you know, I'm just going to follow you know, this choice or this decision that it's going to make for me and not you know, do one of the thousand different alternatives right, right, And
if you do that, I do that, Jack does that, right... like, all of a sudden, we would have made vastly different choices, but now we have this sort of weird, sort of centering effect, where all of us are actually making much less varied choices in our decision-making, right,
which, I mean, does pose real risk. I mean, imagine applying this to résumé screening, right, where you can imagine the same type of problem, right. And there are just so many scenarios you can think of where, you know, we need to take care, right. Yeah, and this is a good time to think about that.
Right. And we're, yeah, we're turning our free will over to the care of algorithms and, you know, the phones, like the Skinner boxes in our hands that we're carrying around. And, like, that feels like a thing that's already happening. AI might just make it a little bit more effective and, like, quicker to respond. But yeah, it feels like a lot of the concerns over AI that make the most sense to me are the ones that
are already happening. And yeah, just to kind of tip my hand a little bit: like, in doing a lot of research on the kind of overall general intelligence concerns that we've been talking about, the ones where it, like, takes over and evades human control because it wants to, you know, wants to defeat humans, I
was surprised how full of shit those seem to be. Like, for instance, the one story that got passed around a lot last year where a drone, like, during a military exercise... an AI was being told not to, like, take out human targets by its human operator, and so the AI made the decision to kill the operator so that it could then just, like, go rogue and start killing whatever it wanted to. And that story, like, got passed around a lot. I think we've even, like, referenced it
on this show, and it's not true. First of all, I think when it first got passed around, people were like, it actually killed someone, man. And it was... it was just a hypothetical exercise, first of all, like, just in the story as it was being passed around. And second of all, it was then debunked. Like, the person who said that it happened in the exercise later came out and was like, that actually... like, that didn't happen.
But it seems like there is a real interest and real incentive on behalf of the people who are, like, set up to make money off of these AIs to, like, make them seem like they have this, like, godlike reasoning power. There's this other story about where, like, they were testing the, I think it was GPT-4, to
see, like... during the alignment testing. Alignment testing is, like, trying to make sure that the AI's goals are aligned with humanity's, and, like, making humans more happy. And they, like, ran a test where the GPT-4 was basically like, I can't solve the CAPTCHA, but what I'm going to do is I'm going to reach out to a TaskRabbit and hire a TaskRabbit
to solve the CAPTCHA for me. And the GPT-4 made up a lie and said that it was visually impaired and that's why it needed the TaskRabbit to do that. And again, it's, like, one of those stories... like, it feels creepy and it gives you goosebumps, and
again, it's not true. It's like... the GPT was being prompted by a human. Like, it's very similar to the self-driving car myth, like that self-driving car viral video that Elon Musk put out, where it was being prompted by a human and, like, pre-programmed. And, like, there were all these ways that the human agency involved with the kind of clever task that we're worried about this thing having done was just, like,
taken out, was just edited out, so that they could tell a story where it seemed like the GPT-4, which is, like, what powers ChatGPT, that, like, it was doing something evil, right. And it's like... so I get how these stories get out there and become, like... because they're the version of AI that we've
been preparing for, because, like, we've been watching Terminator. But it's surprising to me how much, like, Sam Altman, the head of OpenAI, like, just leans into that shit. Like, in an interview with, like, The New Yorker, he was like, yeah, I, like, keep a cyanide capsule on me, and, like, all these weapons, and, like, a gas mask from the Israeli military, in
case, like, an AI takeover happens. And it's like, what? Like, you of all people should know that that is complete bullshit. But it totally, like, takes the eye off the ball of, like, the actual dangers that AI poses, which is that it's going to just, like, flood the zone with shit. Like, it's going to keep making the Internet and phones, like, more and more unusable, but also more and more, like, difficult to, like, tear
ourselves away from. Like, I feel like we've already lost the alignment battle, in the sense that, like, we've already lost any way in which our technology that is supposed to, like, serve us and make our lives better and enrich our happiness is doing that. Like, they
stopped doing that a long time ago. Like, that's why I always, like, say that the more interesting question... like, with the questions that were being asked in philosophy classes in the early two thousands about, like, singularity, and, like, suddenly this thing spins off and we don't realize we're no longer being served, and, like, we're, like, twenty steps behind... like, that happened with capitalism a long time ago.
Like, we're... like, capitalism is so far beyond, and it's like, you know, we're no longer serving ourselves and serving fellow humanity. We're, like, serving this idea of this market that is just, you know, repeatedly making decisions to make itself more efficient, take the friction out of, like, our consumption behavior. And yeah, I think AI is going to definitely be a tool in that. But, like, that is the ultimate battle, and we're already losing it.
Yeah, I mean, I completely agree with you on the front of, like, your phones... I mean, the companies are trying to maximize their profits, which means, you know, possibly you being on your phone to the disservice of doing something else, like going outdoors and
doing exercise, right, or, you know, doing something social. I do see, on the other hand, that, like, you know, maybe AI can have positive effects, right, where, you know... take tutoring for an example, or public health, where we take the phone and take the technology and use it for benefit, like providing information about public health, or helping students who don't have the ability to have a tutor, have a tutor, right. Like, all of these kinds of
things where, let's take advantage of the fact that, you know, the majority of the world has phones, right, and try to use that technology for, you know, a good, positive societal benefit. But like any tool, it has usages that are both good and bad. And I do think, like, you know, there's gonna be a lot of uses of this technology that are not necessarily in the best interests of humanity or individual people.
Right.
So it's, like, really, like, sort of... the real X factor is how the technology is being deployed and for what purpose, not that the technology in and of itself is, like, this runaway train that we're trying to rein in. It's that, yeah, if you do phone scams, you might come up with better, you know, like, voice models to get this... rather than doing one call per hour, you can do a thousand calls in one hour, running
the same script and same scam on people. Or, to your point about, how do you persuade Jack O'Brien to not vote, or to buy X product. Then it's all going in that direction, versus things like, how can we optimize, like, the ability of a person to learn if they're in an environment that typically isn't one where people have access to information, or for the public health use.
And that's why, like, I think, in the end... because we always see, like, yeah, this thing could be used for good, and then almost every example is like, yeah, and it just made two guys three billion dollars in two seconds, and that's what happened with that. And now we all don't know if videos we see on Instagram are real anymore.
Thanks. Yeah. Yeah, it's not... yeah, it's not inherently a bad technology. It's that the system as it's currently constituted favors scams. Like, that's why we have a scam artist as, like, our most recent president before this one, and probably a future president again, like, because that is what our current system is designed for. And, like, it
is just scamming people for money. That's why, when blockchain technology, like, had its, like, infant baby steps, it immediately became a thing that people use to scam one another, because that is the software of our current society. So yeah, there are, like, tons of amazing possibilities with AI. I don't think we find them in the United States
by applying the AI tools to our current software. I would love for there to be a version of the future where AI is so smart and efficient that it changes that paradigm somehow, and changes it so that the software that our society runs on is no longer scams.
Yeah, right. I mean, there is a future where, you know, AI starts to do a lot of tasks for us. You know, I think there's starting to be... it's mostly in Europe now, but also here, talk about, like, universal basic income. It's... I don't think that that's going to be the case. I don't think that in the US anybody has an
appetite for that. Maybe in twenty, thirty years. But, you know, I do think that, you know, there are certain things about my job, right, as a professor, about, like, writing and certain other components, right, where, you know, here are some things where AI can help, right. And, you know, today I use it as a tool to make me more efficient. And, like, you know, okay, I want to
have an editor look at this. Okay, well, you know, it's not going to do as good as a real editor, but maybe as a, you know, bad copy editor, yeah, it's good, you know. Like, but, you know... I mean, I've been writing papers for a while. You know, when the undergraduates write the paper, it does an even better job, right, because they have more room for growth, right. And so I think that, you know, yeah, I think that on a micro level, these tools are now
being used. I think that, you know, Russian spambots have most likely been using large language model technology since before ChatGPT, right, right. And so, you know, I mean, there are some ways that I think we're going in a positive and good direction, right.
Yeah.
Now, now it's like... now that you say that, I'm thinking, I'm like, we thought Cambridge Analytica was bad, and it's like, what happens now when we, like, turn it up to one thousand with this kind of thing? Where it's like, yeah, here are all these voter files, now, like, really figure out how to suppress a vote or sway people. And yeah, again, it does feel like one of these things where, in the right hands, right, we can create a world where people don't have to toil, because
we're able to automate those things. But then the next question becomes, how do we off-ramp our culture of greed and wealth hoarding and concentrating all of our wealth, in a way to say, like, well, we actually need to spread this out so that everyone can have their needs met materially, because our economy kind of runs on its own in certain sectors. Obviously, other ones need real, like, human labor and things like that. But yeah, that's where you begin to see, like, okay, that's the fork
in the road. Can we make that? Do we take the right path,
or does it turn into, you know, just, like, some very half-assed version, where, like, only a fraction of the people that need, like, universal basic income are receiving it, while, you know, we read more articles about why we don't see Van Vocht in Hollywood anymore, as Jack pointed out. That, yeah...
terribly worded thing about Vince Vaughn that AI could not get right.
It couldn't get his name right, so, like, "Van Vought," the one... So, the sci-fi dystopia that seems... not necessarily dystopia, but the sci-fi future that seems like it's most close at hand... and, like, given, you know, doing a weekend's worth of research on, like, where this is and, like, where all the top thinkers think we're headed, the thing that seems like the closest reality to me is the movie Her. Like, Her
is kind of already here. There are already these applications that use ChatGPT and, you know, GPT-4, too. Again, like, it really seems to me like... and I don't necessarily mean this, like, in a dismissive way, this is probably, you know, a good description of most jobs, like, that...
The thing that David Fincher... I was, like, listening to a podcast where they talked about how David Fincher says, like, words are only ever spoken in his movies by characters in order to lie, because, like, that's how humans actually use language: just to, like, find different ways to, like, lie about who they are, what their, you know, approaches to things are. And, like, I think, you know, these language models, the thing that they're really good at... like, whether it be, like, up to this point where people
are like, holy shit, this thing's alive, that's talking to me... well, it's not. And it's, like, just doing a really good approximation of that and, like, kind of fooling you a little bit. Like, take that to the ultimate extreme, and it's like... it makes you think that you're in, like, a loving relationship with a partner. And, like, that's already how it's being used in some... in some instances,
to great effect. So, like, that seems to be one way that I could see that becoming more and more of a thing, where people are like, yeah, I don't have, like, human relationships anymore. I get, like, my emotional needs met by, like, Her technology, essentially. Is that... what would you say about that? And is there a different fictional future that you see close at hand?
So I think Her is great. I actually... I mean, it's really funny, because, you know, I remember back in twenty seventeen, twenty eighteen, I was like, oh, Her is, like, a great sort of... this is where AI is going. And, you know, for those of you who haven't seen the movie, it's a funny movie where, you know, they have Scarlett Johansson as the voice of an AI
agent, which is on the phone. And I think that this is actually a pretty accurate description of what we're going to have in the future, not necessarily the relationship part, but the fact that we'll all have this personal assistant, and the personal assistant will see so many aspects of our lives, right: our calendars, our, you know, meetings, our phone calls, everything. And so it'll be, you know, this assistant that we all have that's helping us to, like, make things more productive.
And for the assistant to be effective, the assistant will probably want to have some connection with you, and that connection will likely allow you to trust it. And here's where the slippery slope comes in, you know, where you're like, oh, this understands me, and you start getting into a deeper relationship, where, you know, a lot of the fulfillment of a one-on-one connection can come from your smart assistant. And I do
think that there's a little bit of a danger here. Like, you know, again, I'm not a psychologist, I'm someone who studies AI and machine learning, but I do actually, you know, study how well machines can make sense of emotion and empathy. And really, you know, GPT-4, which is the current state-of-the-art technology, is actually already really good at understanding... I use that phrase, "understanding," in quotes... but really, what you are feeling, right, how
you're feeling under a scenario. And you can imagine that feeling heard is one of the most important parts of a relationship, right. And if you're not feeling heard by somebody else, and you're feeling heard by, you know, your personal assistant, that could shift relationships to the AI, which, you know, could be fundamentally dangerous, because it's an AI, not a real person, right.
Yeah, the one I was talking about is Replika, that app, R-E-P-L-I-K-A, where they are, you know, designing these things explicitly to, like, fill in as, like, a romantic partner, at least with some people. And it's always hard to tell: did they find, like, the three people who are using this to fill an actual hole in their lives, or is it actually taking off as a technology? But yeah, the personal assistant thing
seems closer at hand than maybe people realized. Any other, like, kind of concrete changes that you think are coming to people's lives that they aren't ready for, or haven't really thought about, or seen in another sci-fi movie?
Sci-fi movies are remarkably good in...
predicting us, yeah. Or maybe they have seen it in a sci-fi movie.
No. But I think, you know, another... another thing, one potential that I think not enough people are talking about, is how much better video games are going to become. So people are already integrating GPT-4 into video games, and I think our video games are just going to be so much better, because, you know, you have this ability to, like, interact now with a character AI that's not just, like, very boring in chatting with you, but can actually be truly entertaining and fully interactive.
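(Since the guest brings up wiring GPT-4 into game characters, here is a minimal sketch of what an LLM-backed NPC loop can look like. This is a hypothetical illustration: generate_reply is a stand-in for whatever model API a studio would actually call, not a real library function, and the persona and dialogue strings are invented.)

```python
# Minimal sketch of an LLM-backed game NPC (hypothetical, not a real engine API).
from typing import List, Dict

def generate_reply(messages: List[Dict[str, str]]) -> str:
    # Stand-in: a real game would send `messages` to a language model
    # (local or hosted) and return its text reply.
    return "Aye, traveler, the mountain pass is snowed in. Try the river road."

class NPC:
    def __init__(self, persona: str):
        # A persona prompt keeps the character consistent across turns.
        self.history: List[Dict[str, str]] = [{"role": "system", "content": persona}]

    def talk(self, player_line: str) -> str:
        # Append the player's line, get a reply, and remember both,
        # so the character can refer back to earlier conversation.
        self.history.append({"role": "user", "content": player_line})
        reply = generate_reply(self.history)
        self.history.append({"role": "assistant", "content": reply})
        return reply

innkeeper = NPC("You are a grumpy innkeeper in a snowbound mountain village.")
print(innkeeper.talk("Which way to the capital?"))
```

The difference from a scripted NPC is that the reply is generated per conversation rather than picked from a fixed dialogue tree, which is where both the interactivity and the extra stickiness come from.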
Interesting.
I think, you know, computer games are going to be much, much better, and possibly also more addictive, as a function of being much
better. Perfect, and then that will dull our appetite for revolution when our jobs are all taken by AI. Yes, this is great, this is great. All right.
I feel much better after having had this conversation.
Dude, Grand Theft Auto is way better. The kinds of conversations I have with people I would normally bludgeon on the street with my character... something else.
All right, that's gonna do it for this week's Weekly Zeitgeist. Please like and review the show. If you like the show, it means the world to Miles. He... he needs your validation, folks. I hope you're having a great weekend, and I will talk to you Monday. Bye.