Park Mom Birthday Cake Drama, DeSantis Campaign Deep Fakes, Amazon’s Spy Ring, YouTube Disinfo Is Back, Facebook Still Awful, New Twitter CEO — NEWS ROUNDUP

Jun 09, 2023 · 1 hr 5 min

Episode description

This week we learned Amazon employees spied on Ring customers, YouTube is rolling back its policy about election disinfo, and Instagram is even worse than we thought (which was already very bad). In less terrible news, Twitter has a new CEO, and some low stakes drama between two Birthday Moms in the park reminds us why the Internet is great.

Good news! Biden names a point person to counter extremists trying to ban LGBTQ books: https://www.npr.org/2023/06/08/1180941627/biden-pride-month-book-bans

Bad News! Instagram Algorithms Boost and Connect Vast Network of Pedophiles: https://variety.com/2023/digital/news/instagram-pedophile-network-child-pornography-researchers-1235635743/ 

Scary news! DeSantis campaign uses AI-generated deep fake images of Trump and Fauci: https://arstechnica.com/tech-policy/2023/06/desantis-ad-uses-fake-ai-images-of-trump-hugging-and-kissing-fauci-experts-say/ 

The ‘Enshittification’ of TikTok: https://www.wired.com/story/tiktok-platforms-cory-doctorow/

The opposite of ‘Enshittification’ is ‘Embiggification’: https://youtu.be/FcxsgZxqnEg 

Twitter CEO Linda Yaccarino Is Teetering on the Glass Cliff: https://www.wired.com/story/twitter-linda-yaccarino-glass-cliff/ 

Get more bonus content ad-free and join the TANGOTI Discord chat at Patreon: https://www.patreon.com/tangoti 

See omnystudio.com/listener for privacy information.

Transcript

Speaker 1

There Are No Girls on the Internet, as a production of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this is There Are No Girls on the Internet. I'm here with my producer, Mike. Hey Mike.

Speaker 2

Hey, Bridget.

Speaker 1

How about this smoke? Right?

Speaker 2

It's really something here in DC. It's like the only thing anybody's been able to talk about for days.

Speaker 1

I have not left my house in forty eight hours. And here's what you might have missed this week on the internet. You know that song that goes, I always feel like somebody's watching me? Well, if you have a Ring camera, that somebody watching might be an Amazon employee. And that's because before twenty seventeen, Amazon employees could just access whoever's Ring camera they wanted, and some of them did.

This is according to a court filing from the Federal Trade Commission when it announced a five point eight million dollar settlement with Amazon over privacy violations. So this whole thing is about as creepy as you would expect. It sounds like there was just overbroad access given to employees and also contractors to customers' Ring footage, and that there might have been just a generally lax attitude around

accessing that footage at Amazon. According to the filing, Ring gave every employee full access to every customer video before twenty seventeen and failed to patch bugs in the system that allowed hackers to access cameras and scare consumers. Now, I gotta say this was not just like a one time thing. It sounds like a pervasive issue. Ring employees

were spying on women. In twenty seventeen, an employee of Ring viewed videos made by at least eighty one female consumers and Ring employees using Ring products, undetected by Ring. The employee continued spying for months. He was looking specifically for cameras with labels that suggested they were in people's intimate spaces. So they had labels like master bedroom camera or bathroom camera, so it's pretty clear what was going on.

In this instance, this employee was only caught after a supervisor noticed that the male employee was only viewing videos of quote pretty girls, which even that supervisor's explanation, I feel like, is pretty gross. In another incident, an employee gave someone's ex husband access to their Ring footage without their consent. So you can kind of get a sense

of how dangerous and serious that could be, like if you're going through a contentious divorce or you have an abusive ex. The fact that Ring was just handing over your Ring footage to your ex without your consent or knowledge is pretty problematic. So it was not just random women that these Ring employees didn't know. Ring employees were

also spying on each other. Another incident allegedly occurred in twenty eighteen when a male employee allegedly accessed a fellow female employee's camera and watched her stored video recordings without her permission. And can you imagine, like, you go to work, there's Greg at the water cooler, cut to Greg watching you in your home. Like that is so fucked. So Ring employees were also gifting people with cameras and then

using those cameras to spy on them. An employee was found to have given Ring devices to people and then watched their videos without their knowledge, Like some gift. Oh here's the present, it's a Ring camera, I'll see you later. Totally fucked. Give the gift of Greg watching, lurking spying,

terrible gift. So separately, Amazon also had to pay twenty five million dollars to settle allegations that it violated children's privacy rights when it failed to delete Alexa recordings at the request of parents and kept them for longer than necessary.

Speaker 2

Wow. Wow. So parents requested that Amazon delete them, and then they just didn't. That's an additional level.

Speaker 1

Yeah, My understanding is that they said that they would delete those recordings and then did not. So it kind of goes back to this idea of you know, I don't think that companies can just say one thing and do another. And it sounds like the FTC agrees.

Speaker 2

We talked about this last week with Google, where they said they were going to delete users' abortion data and then they just didn't.

Speaker 1

Exactly, like it definitely paints a larger portrait of negligence. You know, this is all part of a push by the FTC to regulate big tech and hold companies accountable for putting profit above privacy. Now, I have said this probably a hundred times on this show. I get why someone would want a Ring camera for their safety or to feel secure in their homes, but I would never

allow one of these devices in my home. You know, not that I would ever break any laws myself, but people should be aware that police can and have accessed your Ring footage without your consent and without a warrant

because Ring partners with police departments. It sounds like even after having to pay out millions of dollars for these violations around spying and privacy, Amazon is still not admitting wrongdoing, saying in a statement that they disagree with how these instances were framed by the FTC, but they did, you know, pay these fines. And so it just goes to show

like these companies don't care. They don't care about your security, they don't care about your privacy, and they're callously preying on and fostering a climate of fear, fear around crime, fear around your neighbors, and tricking us into thinking that it is a safe choice to have their devices in your home, while all the while they are out here putting us in danger. You know, I completely get that a lot of us feel vulnerable right now, feel scared right now.

I totally get it. But I believe that Amazon is inflaming and exploiting and taking advantage of this vulnerability that we might feel to sell us technology that ultimately harms us.

Speaker 2

Yeah, and like you mentioned, this story happens to be about Amazon, but it could just as well be a story about Google, who we talked about last week, or Tesla, who a few months ago was in the news because their engineers similarly had unfettered access to the cameras inside people's cars, and they were using that to share videos with each other of like, hey, look at this guy

picking his nose in his car hahaha Lol. But you know, it could be Facebook, could be any of these big companies, because this is just the sort of thing that we get in surveillance capitalism, when we bring these devices into our homes, where privacy is just an afterthought and the commodification of our personal data is a gold rush for these companies, so much so that you know, a few million dollars in fines here and there is well worth

it to them. And so in this case, you know, it's not like the Amazon executives were getting in on the creeping. I assume it was just Greg and maybe some of his buddies.

Speaker 1

But I feel like I'm using Greg as a stand-in for the creepy employee because I've been binge watching Succession, so I'm like, oh, someone was doing something creepy, has to be Greg, just like the name at the top of my mind that I'm associating with, like, lecherous creeps. No offense to any Gregs out there listening. You're not all bad. No, just.

Speaker 2

The one who's a member of the Disgusting Brothers. But yeah, I mean, yeah, you know, Greg's the one creeping.

But it's the executives at the top that allowed it to happen with this lack of security controls, when, I would argue, they had a responsibility to limit access in a way that would protect people's privacy. But they didn't, because why would they willingly put speed bumps in the way of unfettered access to all of the data that feeds back into their profits and helps them make more sales? And without regulation, that's just going to keep happening.

Speaker 1

And I just don't think that, what, like thirty million dollars is an amount that a company like Amazon is really going to feel. I mean, I am happy that they're having to have some kind of a show of accountability for how they have wronged the public, but I don't think this is an amount of money that they are going to feel. And just the fact that before twenty seventeen, anybody, a Ring employee and their vast network of contractors, could just look at anybody

from their Ring cameras they wanted. It's just, the scale of harm is so much bigger than the amount that they have to pay in punishment that I cannot imagine this is going to be something that gives them pause in terms of how they're harming people in the future. And you know, I didn't even know about the fact that Tesla was using people's cameras to, like, share

jokes with each other. Think about the grossest, longest car trip that you've ever taken, where there's just Wendy's bags and used Kleenex on the ground and you're driving barefoot and you're disgusting. Imagine that video being used in Greg's Slack channel just to have a laugh.

Speaker 2

Yeah, thanks a lot, Greg.

Speaker 1

We're really, we're really beating up on the Gregs here. Let's, let's switch gears. Let's beat up on the Rons for a minute. Because Ron DeSantis's campaign just recently used AI-generated images of Trump hugging and kissing Fauci in their campaign material. We've talked many times on the show about the spread of deepfakes on the Internet, and as AI image generation software gets better and more available, it is so easy for bad actors

to create fake images of celebrities and politicians. And I think especially when those images are weaponized in political campaigns during climates that are, like, politically fraught or sensitive or inflammatory, it's concerning, because these fake images can often appear real. I am not too proud to admit that I got, I got got by that image of the Pope in the Balenciaga puffer. It was early morning, I hadn't had my coffee. I was just scrolling Twitter on my iPad.

I just saw it and I thought, okay, the Pope's wearing a Balenciaga puffer, checks out. It didn't. I didn't have, my brain was not primed. Even though I'm somebody who studies disinformation and, like, talks about it for a living, my brain was not yet primed to have that layer of skepticism. So these fake images can get the best of us. Right? You shouldn't have to necessarily be so primed and poised and ready to zoom in on the fingers and be like, oh, that's fake, to

tell if an image is real or not. So yeah, I definitely have. I've gotten got before.

Speaker 2

Yeah, it's a good point you make that we shouldn't have to be constantly hypervigilant. You know, if I think about the times that I'm looking at social media, it's when I'm waiting for the metro or bored or, you know, waiting on this or that. Or I don't know, I'm not, like, sitting down to critically engage in, like, deep and critical thought about what I'm looking at, right? Like, I just want to see some cats. I want to see some jokes. I want to see

some stupid stuff that makes me laugh. And yeah, you see the pope in a puffer jacket. I don't know, I probably would have got got too.

Speaker 1

So this week, Ron DeSantis's campaign joined in on the AI-generated deepfake fun and shared a set of fake images of Donald Trump hugging and kissing Anthony Fauci.

They were kind of clever about it, mixing in the fake images with some real ones, but they weren't that clever about it, because with a little scrutiny, the images appear to be obviously fake. Like, in one of them, Trump's arm kind of wraps all the way around Fauci's body in a very unnatural way, like a long snake, like he's got double the arm around him kind of. In another one, the letters in the phrase White House are garbled, so you kind of know it's AI

or that something is going on there. Other details are also off, like the stars in the American flag, or unnatural details in Trump's hair, which somehow kind of manages to look more unnatural than his actual, honest-to-god human hair. I don't know how that's possible. So the French media entity AFP first reported on the fake photos, which were included in a video released by the official DeSantis campaign.

Speaker 2

You know, I really thought and hoped that we had more time before US presidential candidates were using deepfakes as part of their official campaigns. I don't know why I thought that. I guess maybe just wishful thinking. But you know, politicians like DeSantis and also Trump, like, it happens to be that DeSantis did this against Trump, but I could just as easily see it going the

other way. They have no qualms about speaking lies and spreading lies, so why wouldn't they take advantage of AI images to spread visual lies?

Speaker 1

And I think there's really something about visual lies that works on a different level than when you read something that's not true. The same way that I was just like, didn't have my glasses, hadn't had my coffee, scrolling and saw a picture of the Pope and said, oh, that's fine. Visual images can hit us so much differently. They can hit us quickly, more quickly than the written word, and they can sort of stoke a more emotionally resonant space

in us, I think. Like in the interview that we did with Ifeoma Ozoma, who was formerly at Pinterest, who was kind of in charge of getting medical misinformation and other, you know, bad stuff off of Pinterest, she said that one of the reasons why it was so prevalent on Pinterest is because it's a visual, a visual platform, and so there would be these highly visually visceral images. Like when it came to abortion disinformation, it was these

like very visually charged images, and they of course would stoke an emotional response in people very quickly. So definitely the threat of image-based, AI-manipulated deepfake images is a real concern.

Speaker 2

Yeah, I think you really hit on something there, that, like she said, the images evoke this visceral emotional reaction. And also, it's just new, I think, certainly in my lifetime, and possibly in all of human experience, that we have to be so questioning of what we visually see in front of us.

Speaker 1

Right.

Speaker 2

We have all these phrases like he said, she said, you can't believe everything you read, and so we're used to having to like sort through what's true and what isn't in these like verbal domains. But we also have the phrase like you don't believe what's in front of your own eyes, right, which is usually a pejorative thing because typically in human experience, we can believe what we see for ourselves. When we directly see something, it's generally

assumed to be true. But even that is no longer true. And so, you know, of all the scary stuff that you've talked about on this show with our guests, I feel like deep fakes are one of the scariest because if people aren't able to tell what's real and what isn't, it does not bode well for solving the big problems that are coming our way. Right Like outside the windows right now, the city is blanketed in smoke from the worst wildfires on the East Coast that I can remember

in my lifetime, made worse by climate change. We're gonna have, like, serious problems that as a society we need to come together to solve. And yeah, if we can't have a shared understanding of what is real and what isn't, that's gonna be hard.

Speaker 3

Let's take a quick break, and we're back.

Speaker 1

Let's talk about YouTube. YouTube, which is owned by Google, will no longer remove videos falsely claiming that the twenty twenty presidential election in the United States was stolen, reversing a policy put in place after the twenty twenty vote. In a blog post, the company says, quote, it will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the twenty twenty and other past US presidential elections, because things have changed. Well, what's changed?

Why are they doing this? According to YouTube, because free speech. They said, in the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.

Speaker 2

I just want to jump in there real quick to break down that quote that you just read, because it's so telling. They say, we find that while removing this content does curb some misinformation, it could have the unintended

effect of curtailing political speech. Right? So on the one hand, they're acknowledging that it does curb misinformation, it is actively, today, preventing harm from being done, and yet they are going to remove those protections out of potential, maybe, fear about something that could potentially occur in the future. And it's so backwards and silly and infuriating.

Speaker 1

Yeah, it's like the Gettysburg Address of kowtowing tech PR bullshit speak, where they say, we acknowledge that what we are doing has prevented harm, but what if it could increase free speech? Question mark, let's find out, right. And I gotta say, like, I feel a little bit personally invested in this, because getting Google to agree to take down this kind of misinformation is something that I personally worked on in my platform accountability work.

This has definitely been, I've tried, I mean, I've tried not to dwell on it too much, because what does that get you? But this has definitely been the year of a lot of policies that did verifiable good and verifiably made platforms safer in, like, clear ways. This has been the year of seeing those policies be walked back, be changed. So it's a little personally distressing to me,

but that's neither here nor there. They do say that they will continue to take down content that misleads voters about when, where, and how to vote, and claims that discourage voting, and content that encourages others to interfere with the democratic process. But knowing a little bit about this, like, that's just not how it works, right? Donald Trump didn't go on television and say, I think people should

go to polls and harass poll workers. I don't think he explicitly said that, but that was how that message was interpreted and filtered down, right? And so this really flies in the face of what we know about how election disinformation and misinformation works. People don't go on TV and say do x y z in an

explicit way. It's very roundabout. It's very the implication. And so saying, like, oh, well, we definitely understand that misleading people about how to vote, content that encourages people to interfere with the democratic process, saying that that is content that you will take down because you recognize it as harmful and dangerous. It kind of makes

sense that you would think that that is dangerous. But that is not how it really works, right? Like, bad actors are really good at staying just on the right side of not having their content removed. That is, like, something that they are very good at.

It is their bread and butter. And so saying that someone will have to meet this very specific threshold to have their content removed by YouTube is just inviting people to make content that stays just a hair on the right side of plausible deniability while essentially encouraging just that, you know.

Speaker 2

Yeah, that's right, and it's a moving, shifting target, and to really meaningfully address it or moderate that content, to remove that sort of content that is sneakily just a hair shy of being over the line, you know, it's a tough problem, and I can understand why platforms like Google really just want to punt on it. But it's also super important for our democracy and the cohesion of society.

Speaker 1

And also, let's just talk about this BS of, oh, it could threaten free speech. Literally this morning, I saw a post from Trump on Truth Social that said that the twenty twenty election was stolen. It's not as if there are no places in society or on the Internet that one could say those lies if they wanted to. Trump does it literally every day and has been for the last few years, right? And so what I am saying is that there is no reason for a company like

YouTube to allow that on their platform. Not being able to say it on YouTube doesn't mean that people aren't able to say it. People say it all the time, right? It's just, like, it's such an abdication of responsibility and leadership. And I should say that this is actually one of the last planks of, like, post-insurrection digital infrastructure that was put in place after the violent insurrection. YouTube was actually one of the last platforms to have

a policy against twenty twenty election denial. Twitter started labeling false claims about the twenty twenty election early last year, but then they walked it back, saying that it had been more than a year since, you know, Biden took office and there was eventually an exchange of power. Facebook also walked back their labeling of false claims, too.

Speaker 2

The decisions of these platforms about what sort of content they're going to allow and what sort of lying, disingenuous, dangerous content they might prohibit, that's not a free speech issue, right? That is a business decision by these companies about how they want to operate and what role they want to play in a pluralistic society.

Speaker 1

And I think it really shows, it's just like another plank in how these platforms are, I guess I'll use the word preparing, but really it's the opposite of preparing, for the upcoming election. Like, YouTube restored Trump's account back in March so that users could quote hear equally from major national candidates. You know, I think it really shows that platforms, and also media outlets like CNN, who platformed Trump to just, like, lie and smear and slander people

for an hour. I think that these platforms have really learned nothing from the last few years. You know, verifiable lies about our democratic process and our elections. It's not discourse. It's not a difference of opinion. It's not an issue that you need to show both sides to or give credence to. It is a lie. It is a verifiable lie.

By now, platforms should have a game plan about how they're going to deal with verifiable lies about our democratic process in a way that is ethical and responsible, or at least more ethical and more responsible than just like let anybody lie, because that's what free speech is.

Speaker 2

Yeah, I mean, they just want that sweet, sweet engagement. They know that trumpy outrage drives clicks and eyeballs, and they want a piece of the action. You know, Google's business is selling ads, and integrity and responsibility really have nothing to do with it. And it's tragic and dangerous because YouTube is so powerful and it has such a large impact on our discourse, on our democracy that they should feel a sense of responsibility for the content that

they amplify and promote. But they've just demonstrated time and again that they don't, and that they're going to choose outraged engagement and selling ads every time. Platforms. Huh.

Speaker 1

Yeah. Let's talk about another platform that is generating a bit of outrage, and that is Reddit. So popular platform Reddit announced some changes this week and users are not happy, to say the least. So if you don't know what Reddit is, I love Reddit. It's kind of known as the front page of the Internet, to a point where, like, if you need to answer a question or you need some, like, real, like, human-vetted reliable information, it is a common practice to search whatever you need plus Reddit

to quickly get to an actual answer. I once heard somebody say that, like, if they were in a fire, the last thing googled on their phone would be, like, how to fight fires plus Reddit. Like, I'd die trying to get the answer from Reddit.

Speaker 2

So, and Bridget, didn't you used to be, like, a power moderator on Reddit, like, years ago?

Speaker 1

I was a former Reddit moderator. I was. I mean, it was a site that I spent a lot of my free labor moderating and caring about and thinking about. So if folks use Reddit, they probably already know that when you log onto Reddit from a computer, generally, like, it's fine, the user experience is fine. But if you log on on your phone, the app, the Reddit app, is very janky, and it's also full of ads, to the point that I would say that it's, like, I won't say unusable,

but it's not a pleasant user experience. It's, like, almost unusable, the amount of ads that pop up when you're trying to use it. So to get around that, a lot of folks who use Reddit on mobile use third party apps like Apollo, and these apps only work because Reddit allowed those third party apps to have access to

their API, basically have access to their back end. And all of that is changing, because now Reddit announced that they're going to start charging those third party apps a pretty high price for the access that they've always had.

You might remember that Twitter did something pretty similar recently too. So pretty much every third party app that relies on Reddit's API has announced that they will not be able to operate if Reddit continues with this plan to charge them the amount that they're asking to be charged to continue to have access to the back end.

Speaker 2

Wow, this is like such a surprising move for platforms to make, because I think I was reading, like, a Tom Friedman book a while ago. Tom Friedman, yeah. I don't remember why, I think it was, like, in an airport and it felt like the right

thing to do or something. But he was talking about, like, after the nineties and the dot com boom, how, like, APIs were the big thing that enabled the Internet to, like, take off and flourish, because you had all this interoperability, and I think he was right about that. He was wrong about a lot of other stuff. But yeah, like, APIs were a great invention, right, and

allowed platforms, arguably. You know, all these third party apps that allow people to interact with Reddit in ways that are better than Reddit's native app makes available help people use it, help grow their user base, and it's so surprising to see platforms like Reddit and Twitter clamping down on the very thing that helped them grow.

Speaker 1

Well, so, one of my favorite third party apps for Twitter, Block Party, which is a great tool to mass block people who are, like, harassers or bad actors on Twitter, they announced that they're taking an indefinite hiatus because of this change where Twitter is now charging for access to

Twitter's API. Now, Reddit did walk this back a little bit, kind of along the lines of exactly what you were saying, saying that these new API terms would carve out an exception for accessibility apps which allow users who are blind or visually impaired to browse Reddit, because I guess Reddit does not have a way for people who are visually

impaired to browse it. So exactly what you were saying, that these third party apps, via API access, are able to create a more inclusive experience, to include more users, which Reddit has not done. So really, like, I'm not the expert on this, but in my book, it's like Reddit should really be paying the people who are like, oh, well, you just completely shut out blind users, luckily we have found a workaround to include them.

Speaker 2

You're welcome. Yeah. And that is an interesting note here, right? Like, Reddit is pretty different than Twitter in a lot of ways. Right? Like, Reddit, to my understanding, really survives and thrives on the active participation of its users, and I would imagine that some of them, particularly, like you just mentioned, those who are blind, who rely on these third party apps to use it, I would imagine some of those users might not love this change.

Speaker 1

Well, they certainly do not, and in fact, the Reddit community is very vocal and they do not take stuff like this without a fight, so they are organizing to fight back. The platform relies on an army of unpaid moderators. I used to be one of them, and honestly, like, Reddit would not function without these moderators giving a lot of their free time and labor to keep Reddit subreddits, which are, like, you know, the different communities

on Reddit, going. They all signed an open letter to Reddit saying that oftentimes third party apps offer better moderation tools than the ones that Reddit does. They say, many of us rely on third party apps to manage our communities effectively. The potential loss of these services due to the pricing change would significantly impact our ability to moderate efficiently, thus negatively affecting the experience for users in our communities

and for us as mods and users ourselves. It honestly sounds like these moderators are just generally fed up with what they feel is a lack of communication and transparency from Reddit higher-ups to Reddit moderators, whose unpaid labor, again, is basically the reason why this site is

able to function. On June twelfth, many of the site's subreddits are planning a blackout, which means that individual subreddits will lock into private mode, meaning that anybody who is not already a follower of these subreddits will not be able to access them during that time or see

that content. So, if you don't use Reddit and you're thinking, who cares, what's the big deal? We're talking about millions of people, right? Some of these big subreddits that are going dark in protest have upwards of, like, three or four million people on them. And I think it does kind of come back to what you were saying, Mike, about, I mean, I don't want to give Thomas Friedman too much credit, but this idea of APIs and opening things up and having it be a more inclusive experience.

I really see this Reddit thing, and the reason that I wanted to talk about it here, I really see it as a part of a larger fight about the Internet and who the Internet is for and what it is for. Right? There is this great piece in Wired about the enshittification of the Internet, which kind of posits that this is how platforms die. First they are good to their users, then they abuse their users to make

things better for their business customers. Finally, they abuse those business customers to claw back all the value for themselves, and then they die. I totally agree with this, and I feel it everywhere, like, not even just on social media platforms, just everywhere. I feel like everything online eventually has to bleed whatever little joy or life or love it has just to make a little bit more money for people who are already rich. You

know, Instagram. Ten years ago, Instagram felt fun and cool. Like, I remember when Instagram first was a thing, I wasn't on Instagram because I didn't feel cool enough. I was like, oh, there's nothing cool in my life to show, everybody is, like, putting sepia filters on their pictures. And it felt cool. It felt like a community where you wanted to be. And now it just feels like a virtual dead mall. You know, it's not enough that HBO Max or Netflix cancels your favorite

series after only one season. Now they have to remove the entire back catalog, so it's not just canceled, you can never watch the existing episodes again, just to save a little bit of money on residuals and, like, having to pay the writers and the actors more. You know.

On this new version of Twitter verification, people with something to sell can buy influence via paying eight dollars a month for a blue check mark to give them more influence to sell whatever it is that they sell more effectively. But where is the internet infrastructure that exists to prop things up other than building and squeezing capital from people?

Speaker 2

You know, ooh, that is dark and heavy, but you're absolutely right. I do like this phrase, enshittification. First they embiggen, then they shittify. But yeah, it's a good question, like, where is that internet infrastructure that exists?

Speaker 1

Yeah, well, it's a perfectly cromulent question. Luke Plunkett, a tech writer, captured this feeling so well on Twitter. He writes, a huge part of the bad vibes you feel everywhere these days is because every single aspect of our lives is being squeezed by companies shaking us down, and we're reaching a breaking point. Every platform, every service, every product, every form of entertainment just keeps tightening the grip. Each quarterly growth report looks good in isolation, but we're reaching the point

of collective societal exhaustion. And boy, I really felt that, I really identified with that. I think I'm there where it feels like every little bit of joy is being squeezed and mined for whatever little capital can be taken from it to make some rich guy who's already rich

a little bit more rich. And I think I'm really yearning for online spaces that feel like they felt when I was a bit younger, you know, that felt like connection and community building and joy. And it feels like we are ceding more and more of the Internet solely to people who want to make money, right, whether it's capital for wealthy investors or people who want to sell you something or outright scams.

Speaker 2

Yeah, we got to bring back the weird, joyful internet.

Speaker 1

Yeah, I want to see, like, I don't want to date myself, but I want to see more weird Internet spaces that are just like, here's my weird cartoon website, here's my big spoon or my Salad Fingers or my Teen Girl Squad. All those are real things. Look them up, kids. They're very funny.

Speaker 2

Yeah, there was, like, funny stuff. There is joyful stuff. I feel like it's out there too, right? Like, there is still the infrastructure to create those things. Maybe the problem is that we're all spending our time on the big platforms because everybody else is there. Maybe we gotta get off. I don't know. I'm just, like, speculating and losing the thread here.

Speaker 1

So that's, I mean, it's pretty messed up and dark. And I have to, we have to talk about this, like, speaking of dark, messed-up stuff online.

Speaker 2

I want to talk about this.

Speaker 1

I know. I don't want to spend too much time on it because it, like, genuinely nauseates me. But we have to talk about this new Wall Street Journal report about Instagram. A trigger warning here about abuse of children. It is horrific. The Wall Street Journal just published an exposé showing that Instagram has been algorithmically promoting a vast

trade of child sexual exploitation material. The Journal said that it conducted an investigation into child sexual exploitation material on Instagram in collaboration with researchers at Stanford and the University of Massachusetts Amherst. They found that pedophiles have long used the Internet, but unlike forums and file transfer services that cater to people who have an interest in illicit content, Instagram does not merely host these activities. Its algorithms promote them.

Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests. Here's how it all works. The researchers found that Instagram enabled people to search explicit hashtags and then connected them to accounts that use those same terms to advertise child

sex material for sale. According to the Journal, test accounts were set up by researchers that viewed a single such account, and then were immediately hit with suggested-for-you recommendations of purported child sex content sellers and buyers, as well as accounts that were linking to off-platform content trading sites. Following just a handful of these recommendations was enough to flood their test account with content that sexualizes children. In

addition to that, it's horrifying. In addition to that, certain Instagram accounts invite buyers to commission specific acts, with menus listing prices for videos of children harming themselves or imagery of minor children performing sex acts with animals. This is all according to the Wall Street Journal's report, citing the findings of the Stanford Internet Observatory researchers. At the right price, children are available for in-person meetups, the

Journal reported. Ugh. Yeah, it's stomach-churning, it's horrifying, it's terrifying. But I have to say it is not terribly surprising to me, especially right now when we're having so many conversations about protecting kids online. And if lawmakers were serious about that, they would be regulating Facebook and companies that Facebook runs, like Instagram, because according to the data, Facebook is by far the biggest offender when it comes to

child sexual abuse content online. This is not just when you compare it to other social media platforms, but when

you compare it to pornography sites as well. According to an independent study released by the National Center for Missing and Exploited Children, MindGeek, which is the parent company of Pornhub, RedTube, YouPorn, and several other adult tube sites, accounted for thirteen thousand, two hundred and twenty-nine instances of what they deemed child sexual abuse material, while Snapchat had one hundred and forty-four thousand and ninety-five instances. Google had five hundred

and forty-six thousand, seven hundred and four instances. Facebook topped everybody with twenty point three million instances of this child exploitation material. So if we're actually serious about protecting kids online, it needs to start with looking at the biggest offender, looking at the worst bad actor, and that is undoubtedly, by a vast majority, Facebook.

Speaker 2

Jesus. I thought we started out dark, but this is, like, next level stuff. And, like, I gotta hand it to Facebook. You know, we've talked about Amazon in this episode, we've talked about Google doing bad stuff, but once again, Facebook just blows away the competition at being an unbelievably bad actor. Oh, this is disgusting. And, I mean, this was printed in the Wall Street Journal, so, like, hopefully something will come from this.

Speaker 1

Well, the Wall Street Journal is also where Frances Haugen published her information about harms that Facebook knowingly did against children. So I don't know, I don't know their goals with, no, I don't know what to tell you.

Speaker 2

Ah, let's move on. It's like a well-traveled Wall Street Journal-to-nothing pipeline, got it?

Speaker 1

Uh, yeah, I mean, I wouldn't say nothing. After Frances Haugen published that information in the Wall Street Journal, Facebook did change their name to Meta. So we got that out of it.

Speaker 2

All right, that's true. Yeah, still working on the legs.

Speaker 1

Yeah, they're not doing the, they're not doing the metaverse anymore. You're never gonna get those legs, Mike. Keep dreaming.

Speaker 2

Okay, thank god, Okay, back to jokes please.

Speaker 1

Okay, So let's let's move on. I know I can't.

Speaker 2

I we have to move on.

Speaker 1

I can't. I can't even stomach it. So let's move back to Twitter. So last month, Elon Musk announced that he was stepping down as the CEO of Twitter and named a former NBCUniversal executive, Linda Yaccarino, as the new CEO. Musk initially said that she would take over in about six weeks, but on Monday, after about three weeks, she tweeted that she had officially started, saying, it happened, first day in the books. So who is

Linda Yaccarino? Well, she spent twelve years at NBCUniversal, where she was most recently the chairperson of global advertising and partnerships. She was a big part of launching NBC's streaming service Peacock in twenty twenty, which, I gotta say, shout out to Peacock, because I watch a lot of Peacock.

She seems pretty buddy buddy with Elon Musk. At a conference, she interviewed him and said, many of you in this room know me and know that I pride myself on my work ethic, But buddy, I have met my match. She said that to Elon Musk when everybody was kind of criticizing Elon Musk in the early days of him at the helm of Twitter. She urged people to give him a minute, let's see what he's gonna do. She

has some ties to Trump. In twenty eighteen, Yaccarino was named by Trump to the President's Council on Sports, Fitness, and Nutrition. The queer-focused news outlet Them reported that she follows some troubling accounts on Twitter, like Libs of TikTok, which we know is known for attacking LGBTQ people and their allies for just, like, minding their business. Them notes that she also follows some, like, standard, run

of the mill Twitter accounts, like mainstream journalists. I actually could kind of see her being the kind of person who thinks that if you follow mainstream journalists, that means that you also have to follow accounts like Libs of TikTok to sort of represent both sides and have balance. Like, I could easily see her being the kind of person that feels this way. Now, her coming on as CEO does not necessarily signal big

changes at Twitter. Musk is staying on as the owner, and he says that he wants to focus more on, like, tech issues. Media Matters reported that Twitter's lawyers told a court that hiring Yaccarino as the new CEO will not result in different content moderation strategies. So if you had, you know, high hopes for new content moderation strategies that maybe don't allow for things like hate speech, attacks on trans folks, crypto scams, and porn bots, you might be disappointed.

What's also kind of funny to me is that when she was first announced as CEO, a bunch of right wing and extremist influencers criticized her because she had talked about wanting to hire more women and people of color when she was at NBC and they had a video of her in twenty twenty talking about the importance of masks, like that was their big find. Just goes to show

that these people are literally never satisfied. Like anybody that Musk was going to pick, they would still find a way to make that a grievance, to make that an attack on them.

Speaker 2

Yeah, anyone to the left of Mussolini is like unacceptable to these people.

Speaker 1

So she's coming in as CEO at a self-inflicted difficult time for Twitter. The platform is not doing well. It's not doing well with its regular users or with advertisers. A recent poll by Pew found that six in ten adults in the US, which is Twitter's largest user base, were taking a break, and a quarter of them said that they did not expect to be using Twitter in

a year. When it comes to advertisers, of Twitter's top one thousand advertisers from last September before Musk took over, only forty three percent of them were still using Twitter

as of April. Now, Musk had been kind of cagey about this, like he has not liked to talk about the dire financial straits that the company was in, But this week he finally did acknowledge that in a Twitter spaces with presidential candidate and COVID conspiracy theorist Robert Kennedy Junior, where he did admit that the revenue at Twitter had

been cut by about fifty percent. You know, advertisers buying ads on Twitter is how the platform makes money, and basically right now they're not doing that, because what advertiser wants to see their branding next to a tweet calling somebody a racial slur, or like a porn bot or a scam or something, right? I saw a pretty interesting piece in the BBC about how, with Yaccarino's background working with advertisers, Musk was basically trying to purchase trust to woo advertisers back.

Speaker 2

Maybe she'll be less like chaotic than Musk. Maybe.

Speaker 1

So if you're thinking that she sounds like she might be a grown-up in the room, might be somebody who can sort of steer the ship in a better direction than Musk has, that is a well-worn phenomenon in business. There is this theory in business called the glass cliff, where women are basically brought in to clean

up the messes made by terrible men in power. A good example of the glass cliff was after the gaming company Blizzard had a massive gender discrimination and harassment lawsuit, they hired, in twenty twenty one, Jen Oneal, who was the first female lead of the company, right, who was sort of, like, brought in seemingly to kind of

clean up this mess. Now, I absolutely do think that that is some of what's going on here, that a woman is being brought in to clean up this mess made by a man, and to sort of signal that a grown-up is in charge and that it's no longer, like, a frat boy in charge. But I also think that she is kind of being set up

to fail in a lot of ways. You know, it's tough to be such a high-profile hire and have to appease a demagogue, which we know Elon Musk is, and then have to handle that, navigate that, while also signaling to advertisers that big changes are coming to woo them back. You know, it's this very difficult dance. I would say it's an impossible dance to have to dance.

The way that people are talking about her background really signals to me that she is being set up to have to accomplish something that seems at least very difficult, if not impossible. People have described her as someone who really likes to be a superwoman, and someone who would see a mess of a company and voluntarily want to go in and feel like they can fix things. By the way, I hate this framing. I can't even

really, like, verbalize why I hate this. There's just something about it that feels very much like women have to be superwomen to be in the room. They can't just be included from the jump. They have to be superwomen who are not just able, but excited, to clean up a mess that was just clearly made by a man who had too much power and didn't know how to handle it, you know.

Speaker 2

Yeah. And if you think about the mess that is Twitter, it really all started when Elon took over, right? Like, I guess it had financial problems before then, but it wasn't the chaotic, toxic place that it is now, where, I mean, among my friends, I don't know too many people that really spend much time there at all anymore. He's really, I don't know, tanked the brand, however you want to say it.

Speaker 1

It'll be interesting to see how bringing on this new CEO will be for Twitter. There was a very interesting piece about the glass cliff phenomenon in Wired, to which Yaccarino responded on Twitter, quote, as someone used to wearing four-inch heels, let's be crystal clear, I don't teeter. Which, okay, I'll give it to her, kind of a good line, I'm not gonna lie. But it's clear, like, we need to be clear, Musk is staying on at Twitter. So even with a new CEO, this is still his company. He's

not going away. And Musk does not have a great track record when it comes to working alongside women. I believe that Musk is a misogynist. You don't even have to look very far to see the ways in which he is a misogynist, either. There are like a hundred different examples that I could give you off the top of my head, that immediately come to mind. If we listed all of the ways that Trump, or, Trump? Oh my god, he's also a misogynist. No, keep that shit in.

They're both misogynists. They actually remind me of each other, so it's not surprising to me that I confused one for the other. So, you know, if we listed all the different examples that come to the top of my head, we'd be here all night. He tweeted a crass joke about wanting to start a university called the Texas Institute

of Technology and Science, TITS as the acronym. Side note, it's not even a good sexist joke, right? Like, I hate misogyny and sexism, but I love humor, and if a sexist joke was legitimately funny, I would acknowledge it. This isn't even a good joke. It's not even good misogyny. Like, if you're gonna be a CEO that is this powerful and tell misogynistic jokes, don't give the joke that a million of us have heard in the frat a thousand different times.

Speaker 2

You know, it's a bad joke. Like, it's like a Beavis and Butt-Head level joke, but, like, not even good enough to actually be on Beavis and Butt-Head, which is an excellent show, including the reboot.

Speaker 1

Like, I am someone who deeply appreciates stupid jokes and stupid humor, and this is not even, like, I can't even appreciate it. So the day after he tweeted this, Tesla became the subject of a gender discrimination lawsuit from multiple women, and I remember engineer and tech organizer Cher Scarlett tweeted in response to all this, I heard that sexualizing women is a top-down, systemic issue in tech. Might be wrong though, idk. And to be honest,

that's how you do a funny joke about gender and tech. Elon, take a note.

Speaker 2

Yes, acronyms are not the way.

Speaker 1

Yeah, an acronym that spells out TITS, which is a body part that is sometimes found on female anatomy. Oh my god. Like, get this, get this guy a Mark Twain Award, somebody. So, according to a report from Business Insider, Musk secretly fathered twins with Shivon Zilis, a female executive at his company, Neuralink. So we don't know much about

their relationship. But if you're the head of a company and you're having sex with your underlings and your staff, I feel comfortable saying that, like, at the very least, it's not a great look. All of this information is public, right? So by all accounts, Yaccarino, this new Twitter CEO, seems like a smart, competent person who would have looked at this information and decided that Twitter and Elon Musk were who and what she wanted to be mixed

up with. And if that's what you decided, as a confident, smart adult, that's what you wanted, and if it doesn't work out, like, you kind of, in my book, get what you get. Like, this is all public information, so if this is the company that you decided that you wanted to stake your reputation on, girl, do you, but you get what you get, you know. And I want to be clear, I want Yaccarino to do well the same way that I want, like, all women to do well. You know, I'm not, like, rooting for her to fail.

But to me, doing well is not just bringing back advertisers or appeasing Elon Musk or helping Twitter make money. Doing well would be making Twitter a platform that does not elevate and amplify toxicity, lies, abuse, scams, and harassment. And so I wish her well and hope that she's able to achieve that. But I think that her vision for doing well and what I have just articulated might be two different things.

Speaker 2

Yeah, it'll be interesting to watch it develop. You know, maybe I've bought into the glass cliff mythology, but yeah, it seems like having a smart, competent woman in the room probably couldn't hurt, and hopefully it'll be good. But I guess we'll just have to wait and see as this Twitter, this latest Twitter drama, unfolds.

Speaker 1

Now, I don't trust any institution or company that doesn't have some women at the highest levels. The priesthood, the NFL, the military, if there's not women all up through that, I have questions, I'm a little skeptical, just in general. That's a blanket statement. If I don't see, if I don't see a woman making decisions, something...

Speaker 2

Yeah, it's, it's a safe bet that somebody...

Speaker 1

Somebody's tweeting the word tits. So, let's put it at the very least.

Speaker 2

You know, I've never thought of those three institutions as a set together before, so thank you for that.

Speaker 3

More after a quick break. Let's get right back into it.

Speaker 1

So, I know that this has been a little dark, but I do have a little bit of good news. Would you like to hear it?

Speaker 2

Yes, please.

Speaker 1

So it's going to start with some bad news, but trust me, it'll get good. Stick with me. I probably don't need to tell y'all that book bans are on the rise in the US, with books that center LGBTQ voices accounting for about forty one percent of challenged books. This is according to PEN America. Attempted book bans at school and public libraries set a record high in twenty twenty two. This is according to information from the American Library Association. Obviously,

that is terrible, but here's the good news. At a White House Pride event yesterday, Biden announced a slate of actions aimed at curbing discrimination against LGBTQ communities, including the appointment of a new point person who will serve as a coordinator at the Education Department to address the increase in book bans. That's good. I like to see that.

But here's a little bit more positive news. Despite all of those bans and all of those restrictions and all of those attempted attacks by extremists to challenge and ban books about LGBTQ people, people still really, really want to read about queer folks and are going out of their

way to do so. Sales of LGBTQ plus books soared to record highs in twenty twenty two, even as a small but vocal army of extremists tried to ban queer-themed titles and attack inclusive education across the country. US readers bought six point one million LGBTQ plus fiction titles over the past year ending in May twenty twenty three.

This is according to a new report put out on Tuesday by consumer analyst group Circana. That marks an eleven percent jump in growth since last year, when around five million total units were sold, and a massive one hundred and seventy three percent explosion since twenty nineteen, before the COVID nineteen pandemic. This is really important context. The growth in LGBTQ fiction has outpaced the overall market for fiction sales, including adult, kids, and young adult fiction combined,

which remained relatively flat. This is from Kristen McLean, a book industry analyst. So it just goes to show, you can try to ban queerness and transness and gayness out of existence, people still want to see it represented. People will go out of their way to get that inclusion, because that is what people want. The extremists who are trying to ban and legislate that out of existence are out of step with the desires of the majority of

the country. I believe they are a small but vocal group of extremists, and there are more of us than there are of them, and I think that is news.

Speaker 2

That is good news. Thank you for sharing that. It's good to be reminded that the right wing bigots are a minority and they are freaking out because they know it, and they know that they are going to go extinct.

Speaker 1

So I've got a little bit of bonus good news for you.

Speaker 2

Two different good news stories in one episode?

Speaker 1

It's not really good news exactly, but the good news is that one of my favorite things in the world is low-stakes online internet drama where there are no clear, like, good guys to root for, and the entire internet is, like, weighing in on what they think. And some of that happened, and so it's good news for me personally, Bridget. So here's what went down. There's two moms, let's call one Park Mom and the other Birthday Cake Mom. So Park Mom gets on TikTok and

explains the situation. She says that she took her kid to a public park and there was another family there who was having a birthday party for their kid. Her kid basically starts playing with this other family having this party. It's time to sing Happy Birthday and do cake. So Park Mom is watching her kid, it sounds like, from the other side of the playground. Park Mom says that she watches her kid get a plate, sing Happy Birthday, and clearly have the intention that they're going to

have birthday cake. That is when Birthday Cake Mom takes the plate away from the kid and says, you're not going to have a piece of cake. These are not your friends, this is not your party. Where is your mother?

Speaker 2

Devastating.

Speaker 1

So Park Mom, in her TikTok, says basically, am I the asshole? Park Mom is horrified. She cannot believe that Birthday Cake Mom would speak to a kid this way. She cannot believe that Birthday Cake Mom, who had what she describes as a big sheet cake with lots and lots of cake to go around, plenty of cake for her kid, would deny her kid a slice of cake. She asks the Internet on TikTok: is she wrong?

Speaker 2

So?

Speaker 1

This is why I love this drama so much. It's one of those things where there is no clear right answer and everybody is weighing in, and what people weigh in with actually says more about them than it does about the situation, because we actually don't have a lot of information about the situation, like how old are these kids? You know. Some people were like, oh, well, I wouldn't give food to someone else's kid that I didn't know, like what if they have an allergy? Other people were like, well,

who's allergic to birthday cake? That seems like a weird thing, to just suggest that maybe they're allergic to birthday cake. I read every comment, loved every minute of it. If I'm being honest, it sounds to me like Park Mom just kind of let her kid join this other family and never went over and introduced herself or said, hey, I'm whoever, is it cool if my kid hangs out, let me know if they get annoying. She just assumed that

it was fine. Kind of a point against Park Mom. Also, I feel like snatching a plate from a kid and saying these are not your friends is a little bit harsh. I probably wouldn't do that, but I'm like, I don't know who is in the right here. I just love that it's low stakes. There's no clear correct answer. Everyone on the internet is weighing in, and what they weigh in with says more about who they are and what their values are, and you know, who they are as people in the

world than it does about the actual situation. Ten out of ten, love this kind of low stakes internet drama.

Speaker 2

Yes, please. It is nice that it's so low stakes, you know. Like, I feel for the kid who didn't get the cake. It sounds like Park Mom was also pretty upset.

Speaker 1

Yes. I'm sad. Are they entitled, Mike?

Speaker 2

Yeah, well, you know, I'm trying to refrain from sharing my thoughts here.

Speaker 1

No, what do you think? You have a take. Like, who do you think is in the wrong?

Speaker 2

Well, I mean, just give the kid some damn cake, like what the hell? Like, one person's an adult, the other person is a kid who, like, came to sing the birthday song and, like, play with some other kids and, like, have some cake. Regardless of who's, like, truly morally in the right or, like, violating someone else's personal property or whatever, one of them is a kid who sang the birthday song and wanted a piece of cake. The other one's an adult.

Speaker 1

Well, I saw a lot of people on the internet being like, Birthday Cake Mom was probably annoyed by this kid the whole time and was probably like,

where's your mom to come get you? And was, not to excuse her behavior, but was in one of those situations where you're getting more and more and more annoyed, and then it comes out in an outburst that doesn't really make sense, and it's kind of at the wrong person, which I know very well from my personal life. But yeah, I kind of see all sides. I think that I have been Park Mom. I probably would have.

I find it frankly unfathomable that someone could watch their kid playing with another family, but it wouldn't occur to them at some point to go up and be like, hey, I'm Sally, this is my kid, is this fine, are you cool with this? It is surprising to me that a mom wouldn't, you know, do that. I think that's probably what I would do, and that probably honestly would

have, like, solved this whole thing. It also was surprising to me that a grown woman would be snatching a plate from a child. That's why it's so good, because everybody but this kid has done something that I would not suggest adults do. Everybody's behavior is a little bit off.

Speaker 2

Maybe that's part of what makes this so satisfying. It's almost like reality TV, where it's like, am I the asshole? Actually, everyone's the asshole. You know, Park Mom probably should have been keeping a better eye on her kid. Birthday Mom probably could have given the kid some cake. I guess I do want to give Birthday Mom a little bit of grace here, because, like, I've thrown a party in my life, you know, a couple of times, and it's stressful, and it kind of makes me a bit of a

head case. I like to think that I would still be like kind to a child, but you know, it's a high pressure thing. Maybe she had trouble tracking down the sheet cake, and like the kids keep running around and yelling and like where's birthday?

Speaker 1

Dad?

Speaker 2

Is he helping? I don't hear anything about him in this story.

Speaker 1

It makes it better that it's two moms, I feel. Yeah, I think I was telling you about this, that it reminds me, so, you and I had very different upbringings as children. I grew up in the suburbs. You grew up on a farm in a very rural community, so this was not something that you really knew about. But when I was a kid, I have several memories of, in the summertime, going

out to play on our street with other kids. One of the other kids would be like, hey, you want to come back to my house? I would say sure. I would go back to this kid's house, like a couple of blocks away. I would get there, their mom would be there, and she'd be like, oh, you are now expected to, like, keep an eye on this stranger kid from down the street, and half the time I think we would expect food. So it's like, oh hi, Mom, it's not just me, the kid that you had, that

you, you know, are caring for. It's a stranger's kid that you don't even know, and she expects lunch. Like, I think there is something kind of entitled about sending your kids off with strangers and expecting them to be fed. But absolutely, that was my experience. I mean, maybe I'm misremembering, but I feel like that's how it was when I was growing up.

Speaker 2

Yeah, that was different from my upbringing. You know, there were like four other kids within a mile radius of our house, right? Like, I knew them all and also needed a ride from my parents if I wanted to go see them. So that just, like, straight up was not happening.

Speaker 1

So I want to know where you stand on this drama. Again, it's low stakes, so no judgment. If you would have snatched that paper plate from that kid so fast it makes your head spin, I want to know about it. If you would have gotten Mom's Venmo information to Venmo her for her share of the cake, I want to know that too. I'm obsessed. Let me know. Hit me up on email at Hello at tangoti dot com. You can find me on social, or you can subscribe to

our Patreon. We are having interesting conversations over there. Thank you so much to the folks who have subscribed. I've got some really exciting stuff coming up on Patreon soon. I'll give you a little teaser. It involves Elon Musk and Azalea Banks, my problematic fave. So please subscribe. We would love to have you there, and thanks so much for listening. We'll see you next time.

Speaker 2

Yeah, thanks for listening, and thanks for having me, Bridget.

Speaker 1

Always, Mike. If you're looking for ways to support the show, check out our merch store at tangoti dot com slash store. Got a story about an interesting thing in tech, or just want to say hi? You can reach us at Hello at tangoti dot com. You can also find transcripts for today's episode at tangoti dot com.

There Are No Girls on the Internet was created by me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative, edited by Joey pat. Jonathan Strickland is our executive producer. Tari Harrison is our producer and sound engineer. Michael Amado is our contributing producer. I'm your host, Bridget Todd. If you want to help us grow, rate and review us on Apple Podcasts. For more podcasts from iHeartRadio, check out the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
