¶ Welcome & Episode Overview
Welcome to the Tech Meme Ride Home for Thursday, June 5th, 2025. I'm Brian McCullough. Today, Reddit goes after Anthropic. Meta seems to be readying a much cheaper Vision Pro killer while still moving forward with their ambitious smart glasses products. What if your Amazon delivery person was not a person at all, but a humanoid robot? And how, quietly, Hollywood studios are already deep into AI adoption. Here's what you missed today in the world of tech.
¶ Reddit Sues Anthropic Over Data
Reddit is suing Anthropic, alleging that Anthropic accessed Reddit more than 100,000 times after saying it would stop doing so. You might remember Reddit has reached formal AI licensing deals with the likes of OpenAI and Google, so this is like that line in Goodfellas: F you, pay me. Quoting the Journal:
Reddit said the AI company unlawfully used Reddit's data for commercial purposes without paying for it and without abiding by the company's user data policy, according to the complaint, which was filed Wednesday in California. Anthropic has in fact intentionally trained on the personal data of Reddit users without ever requesting their consent, the complaint says, alleging that Anthropic's conduct runs counter to how it, quote, bills itself as the white knight of the AI industry, end quote.
A spokeswoman for Anthropic said the company disagrees with Reddit's claims and will defend itself vigorously. Last year, Reddit took steps to try to limit unauthorized scraping of its website, creating a public content policy for its user data that is publicly accessible, such as posts on subreddits, and updating code on its backend. The user policy includes protections for users, such as ensuring that deleted posts and comments aren't included in data licensing agreements.
Reddit said it had tried and failed to reach an agreement with Anthropic and that it found Anthropic accessing its site after the AI company said it had blocked its bots from doing so. Reddit has struck lucrative licensing deals with AI platforms, including OpenAI and Google. AI executives have said that Reddit data is uniquely useful in helping to train models, given that it hosts conversational interactions between humans.
There are more than 100,000 so-called subreddits. Reddit's lawsuit references a 2021 Anthropic research paper that details the usefulness of Reddit data in AI model training, end quote.
¶ Meta's Cheaper VR and Smart Glasses
Sources tell the Journal that Meta has talked with Disney, A24, and other studios and companies about exclusive content for a premium VR device it plans to launch next year for less than $1,000. Quote, it is offering millions of dollars for episodic and stand-alone immersive video based on well-known intellectual property. Meta hopes the content will attract people to the VR device it plans to launch next year that would compete with Apple's Vision Pro. The new device, known internally as Loma, is more powerful than the Meta Quest VR headsets now available, with higher fidelity video. The design is similar to a large pair of eyeglasses, more like Meta's Ray-Ban AI glasses than the goggles that the Quest and Vision Pro use, connected to a puck users can put in their pockets. Meta is planning to charge less than $1,000 for the device, more than the $300 starting price for a Quest, but less than the $3,500 base price for a Vision Pro, according to some of the people. The company has told potential partners that it wants the content it pays for to be exclusive for VR devices, but producers could sell the shows or movies to other platforms after a certain window of time.
Selling 2D versions to a streaming service would allow producers to earn more money and to reach a wider audience than the small number of people likely to have the Meta device when it launches. Apple didn't pay entertainment companies for content to launch Vision Pro, but has produced exclusive immersive videos, including a concert with The Weeknd. Factors including its price, weight, and lack of new apps have contributed to lackluster sales for the device." It would be quite something if Meta were to outcompete the Vision Pro while also outracing Apple to mainstream smart glasses, as it seems they're already doing. I mean, you know, consumer hardware is Apple's thing, right? Not Meta's? Meta has revealed more information about its research glasses, Aria Gen 2,
including their form factor, computer vision cameras, sensors, and on-device compute. Quoting The Verge, The glasses pack several improvements into their lightweight frame that could one day translate into consumer products, including an improved eye tracking system that can track gaze per eye, detect blinks, and estimate the center of pupils.
These advanced signals enable a deeper understanding of the wearer's visual attention and intentions, unlocking new possibilities for human-computer interaction, Meta writes. Meta initially announced Aria Gen 2 in February, saying they will pave the way for future innovations that will shape the next computing platform. They build upon Meta's first iteration of the glasses from 2020, which were similarly available to researchers only.
Along with an improved eye tracking system, Aria Gen 2 comes with four computer vision cameras that Meta says enable 3D hand and object tracking. Meta says researchers can use this information to enable highly precise tasks like dexterous robot hand manipulation. The glasses also have a photoplethysmography sensor (long word, I know) built into the nose pad, which allows the device to estimate a wearer's heart rate, along with a contact microphone that Meta says provides better audio in loud environments. There's a new ambient light sensor as well, allowing the glasses to differentiate between indoor and outdoor lighting. The Aria Gen 2 glasses include folding arms for the first time, weigh around 75 grams, and come in eight different sizes. Meta plans on opening applications for researchers to work with Aria Gen 2 later this year. The initiative builds on the successful development of Meta's Ray-Ban smart glasses, a form factor it aims to expand with its Orion augmented reality glasses, a rumored partnership with Oakley, and a high-end pair of Hypernova glasses with a built-in screen.
¶ OpenAI Challenges Order to Save Logs
OpenAI is seeking to block a May 13th court order requiring it to preserve all ChatGPT logs, including deleted chats. They're arguing that that would pose a risk to users' privacy, which it would. Quoting Ars Technica: The court order came after news organizations expressed concern that people using ChatGPT to skirt paywalls, quote, might be more likely to delete all their searches to cover their tracks, OpenAI explained.
Evidence to support that claim, news plaintiffs argued, was missing from the records because so far OpenAI has only shared samples of chat logs that users had agreed the company could retain. Sharing the news plaintiffs' concerns, the judge, Ona Wang, ultimately agreed that OpenAI likely would never stop deleting that alleged evidence absent a court order, granting news plaintiffs' request to preserve all chats.
OpenAI argued the May 13 order was premature and should be vacated until, quote, at a minimum, news organizations can establish a substantial need for OpenAI to preserve all chat logs. They warned that the privacy of hundreds of millions of ChatGPT users globally is at risk every day that the, quote, sweeping unprecedented order continues to be enforced.
As a result, OpenAI is forced to jettison its commitment to allow users to control when and how their ChatGPT conversation data is used and whether it is retained, OpenAI argued. Meanwhile, there is no evidence beyond speculation yet supporting claims that OpenAI had intentionally deleted data, OpenAI alleged, and supposedly there is not a, quote, single piece of evidence supporting claims that copyright-infringing ChatGPT users are more likely to delete their chats, end quote.
So here's hoping your ChatGPT chats don't become public at some point.
¶ Amazon Explores Humanoid Delivery Robots
Okay, little ankle-high robot delivery bots have not even become widely used yet, but would you accept an Amazon delivery from a humanoid robot that walked up and knocked on your door? Quoting The Information: Amazon is developing software for humanoid robots that could eventually take the jobs of delivery workers, according to a person who has been involved in the effort. As part of the project, Amazon is putting the finishing touches on a humanoid park, an indoor obstacle course at one of the company's San Francisco offices where it will soon test such robots, this person said. The move comes as numerous large firms across the tech industry, including NVIDIA, Google, and Tesla, develop software or hardware for humanoid robots, which have been buoyed by recent advances in AI, allowing robots to better mimic human movements and respond to unfamiliar situations.
Amazon has been at the forefront of robotics for years and has automated some warehouse jobs by developing or acquiring robots that sort through and prepare parcels for delivery. But it also has a large financial incentive to automate the delivery of those parcels, given it oversees hundreds of thousands of people who handle such work.
Amazon is developing the AI software that would power such robots, this person said, and plans to use hardware from other firms in its tests for now. The initial obstacle course it is developing is roughly the size of a coffee shop, this person said. That plan mirrors the way self-driving car developers tested their vehicles about a decade ago, starting in closed courses before expanding to testing on public streets.
Amazon hopes humanoid robots will be able to hitch a ride in the back of Amazon's electric Rivian vans and spring out to deliver packages. Amazon's human delivery workers currently use more than 20,000 Rivians to get around, and the company has placed one Rivian van inside its humanoid park to help with the testing, this person said.
Amazon has said it will be adding more Rivian vans to its delivery fleet in the coming years, reaching 100,000 electric vehicles on the road in 2030. It isn't clear if the company will test the robots using other delivery vehicles. Even if Amazon keeps a human delivery driver behind the wheel, the humanoid robot could theoretically help deliver packages to one building or house while the driver delivers to another, speeding up delivery times overall, end quote.
There's a new way to have a great time with friends without the booze or, you know, the bad decisions. Cornbread Hemp's THC seltzers just came out, and you've got to try them. I've got them in my fridge right now. Finally, a THC seltzer that tastes and feels amazing, perfect for spring and summer. This is a low-calorie drink with only 30 calories and 5 grams of sugar, made with pure THC and all-natural ingredients. No synthetics. Each can has 5 milligrams of THC, which is the perfect amount,
so you don't feel couch-locked or paranoid. Perfect for unwinding, kicking back, and enjoying the moment without alcohol or a hangover. Four delicious flavors to choose from: Blueberry Breeze, Peach Iced Tea, Raspberry Limeade, and Salted Watermelon. Every single flavor I've tried is legit delicious. Right now, Tech Meme Ride Home listeners can save 30% off their first order and enjoy free shipping on orders over $75. Head to cornbreadhemp.com slash ride and use code ride at checkout.
cornbreadhemp.com slash ride code ride. Cornbread Hemp. This is the good life. When you're starting off with something new, it seems like your to-do list keeps growing every day with new tasks, and that list can easily begin to overrun your life. Finding the right tool that not only helps you out but simplifies everything can be such a game changer. For millions of businesses, that tool is Shopify.
Shopify is the commerce platform behind millions of businesses around the world and 10% of all e-commerce in the U.S. From household names like Mattel and Gymshark to my own ResumeWriters.com.
Get started with your own design studio with hundreds of ready-to-use templates. Shopify helps you build a beautiful online store to match your brand's style. Accelerate your content creation. Shopify is packed with helpful AI tools that write product descriptions, page headlines, and even enhance your product photography. Shopify handles everything from international shipping to processing returns and beyond. Turn your big business idea into reality with Shopify on your side.
Sign up for your $1 per month trial and start selling today at Shopify.com slash ride. Go to Shopify.com slash ride. Shopify.com slash ride.
¶ Hollywood Quietly Adopts AI Technologies
Finally today, let's come back to the Hollywood tip for a second. Insiders tell Vulture that off-the-books AI experimentation is rife at Hollywood studios, often kept quiet to avoid a backlash. See, the 2023 writers' and actors' strikes culminated in landmark contracts that added guardrails around AI's use. Studios are barred from using AI-generated scripts or digitally cloning actors without consent or compensation.
But those agreements left space for certain AI-assisted techniques, such as generating backgrounds, effects, or synthetic performers, as long as humans stay in charge and the unions can bargain terms. Even so, there are over 35 copyright lawsuits out there challenging the legality of AI models trained on vast troves of scraped content. As one producer told Vulture, the biggest fear in all of Hollywood is that you're going to make a blockbuster and sit in litigation for 30 years.
Despite the uncertainty, every major studio, according to Vulture, is quietly exploring AI's potential. Some are partnering with artist-focused AI studios like Asteria, which claims to use the first ethical generative video model, trained solely on licensed material. Another leader in the space is, of course, Runway, whose tools are already in use across Hollywood.
Darren Aronofsky recently announced a partnership with Google DeepMind, and James Cameron is collaborating with Stability AI. We need to cut costs in half, Cameron told Vulture bluntly. Studios under pressure are now asking how AI can help close the gap between rising ambitions and shrinking budgets. Runway's trajectory offers sort of a case study in how quickly AI is being accepted in some corners of Hollywood.
Co-founded by Cristobal Valenzuela in 2018, it began as an experimental art tool that quickly gained traction among visual effects artists. Rendering traditional VFX is expensive and time-consuming, so AI-powered shortcuts were appealing to a lot of folks. By 2023, Runway could create complex effects in minutes. Valenzuela's meeting with Lionsgate vice chairman Michael Burns led to the studio offering to let Runway train models on its film archive.
Burns even used AI-generated scenes to create a trailer for a film that hadn't yet been shot, intending to pre-sell it at a film festival. It might cost $10 million, he said, but it'll look like $100 million. Still, public examples of AI-generated content in mainstream film and television remain a bit scarce. Runway was used in Everything Everywhere All at Once and Amazon's House of David, but many projects are still hush-hush.
Studios are wary of union scrutiny and potential backlash from audiences. SAG-AFTRA, for instance, now meets with each major studio twice a year to track AI use. According to the union's chief negotiator, Duncan Crabtree-Ireland, most current applications are limited to sound cleanup or visual polish, though many believe that restraint is only temporary.
Behind closed doors, apparently, off-the-books experimentation is running rampant. Writers, animators, and illustrators report being asked to use AI without formal approval. In some cases, AI-generated images are laundered, cleaned up by artists to mask their origin. Many artists say work is drying up generally, so they feel pressure to use AI themselves. One concept artist who has worked on The Hunger Games and The Matrix Resurrections told Vulture that his income has been cut in half. Quote, if you're a storyboard artist, a studio executive admitted, you're out of business unless you learn to prompt. Technological shortcomings still limit broader adoption, at least for now. Joel Kuwahara, co-founder of Bento Box Entertainment, which makes Bob's Burgers, is apparently an advocate for AI's potential, but found many of the existing tools ineffective. One storyboard generator, for instance, couldn't follow simple camera prompts, he told Vulture. People ask why it takes four to six weeks to storyboard a show; because you can't make your brain work faster. He seems to feel that the tools may improve, but at least for now, quality and craft still take time. The question, though, is whether quality still matters as much, at least.
A VFX expert working under a nondisclosure agreement said, AI-generated effects often look worse than traditional ones, but... quote, most people won't notice. He described meetings where executives were enchanted by the idea of AI producing full episodes overnight. Some of them walk out needing a shower, he said.
AI's data requirements, and how companies meet them, are another controversy. As mentioned, most tech companies argue scraping is the only way they can get content, but Bryn Mooser, founder of Asteria, believed a cleaner path was possible. Backed by investors like Vinod Khosla and General Catalyst, Asteria licensed data directly from filmmakers and archives. Its model, called Marey, was built by a DeepMind spin-out named Moonvalley.
The company claims its tools are crafted specifically for filmmakers, blending traditional artistry with AI. A recent Asteria project illustrates the sort of hybrid approach they're taking. For a music video, illustrator Paul Flores created 60 original sketches, which were used to train a custom model in his style. The team then used AI to generate hundreds of images, build 3D environments,
and produce a richly layered animated short. According to VFX Insiders, the process was far from automatic. They're not just hitting buttons, said one artist. It takes 20 steps to get that image. Asteria is now in discussions with nearly every major studio and developing its own projects, including a sci-fi film from Natasha Lyonne, Brit Marling, and VR pioneer Jaron Lanier.
Lyonne says AI is not a shortcut, but a tool for independence, a way for creators to wrest control back from studios. Her dream is a new wave of artist-first filmmaking, a kind of Dogme 95 for the AI era. We need to wrangle this
before it becomes standard not to, she said. What struck me about the piece, which again, you should read the whole thing, it's really a long read, was how much this all just came down to speed, which really is money. It's all about money. Here's a quote to describe what I mean.
The collaboration with Runway was open-ended and flexible. We're banging around the art of the possible, Burns said. Let's try some stuff, see what sticks. He listed some ideas they were currently considering. With a library as large as Lionsgate's, they could use Runway to
repackage and resell what the studio already owned, adjusting tone, format, and rating to generate a softer cut, say, for a younger audience or convert a live-action film into a cartoon. He offered an example of one of Lionsgate's signature action franchises. He would have to pay the actors and all the other rights participants, he added, but I can do that and now I can resell it, end quote.
He gave another example. We have this movie, we're trying to decide whether to green-light it, he said. There's a 10-second shot: 10,000 soldiers on a hillside with a bunch of horses in a snowstorm. To shoot it in the Himalayas would take three days and cost millions. Using Runway, the shot could be created for $10,000. He wasn't sure the film would be made at all, but the math was working, end quote. Nothing more for you today. Talk to you tomorrow.