
Tesla’s Full Self-Driving Software Under Investigation

Dec 03, 2024 · 8 min

Episode description

In this episode, we unpack the federal investigation into Tesla’s Full Self-Driving (FSD) software, exploring what this means for the company, its technology, and the future of autonomous vehicles. From fatal crashes to legal challenges, we examine the details surrounding this critical story.

  1. What’s the Investigation About?

    • NHTSA’s inquiry into Tesla’s FSD software after four crashes in reduced visibility conditions.
    • Details on the incidents, including a fatal crash in Rimrock, Arizona.
    • The scope of the investigation, covering 2.4 million Tesla vehicles from 2016-2024.
  2. Key Crashes Under Scrutiny:

    • Breakdown of four accidents where Tesla’s FSD struggled in low-visibility scenarios.
    • How weather conditions like fog, sun glare, and dust may challenge the system’s capabilities.
  3. Tesla’s Self-Driving Claims vs. Reality:

    • Elon Musk’s bold promises about autonomous driving, including the Cybercab robotaxi prototype.
    • Tesla’s reliance on camera-based systems and why experts say it might not be enough.
    • How Tesla’s competitors use lidar and radar for improved performance in challenging conditions.
  4. Legal and Regulatory Challenges:

    • Overview of Tesla’s ongoing legal issues, including lawsuits and a DOJ investigation.
    • NHTSA’s previous investigations into Autopilot and ongoing concerns about driver engagement.
    • The potential for recalls and regulatory hurdles ahead for Tesla.
  5. The Bigger Picture for Tesla:

    • How these challenges impact Tesla’s push for fully autonomous vehicles.
    • The implications for Tesla’s market strategy amid growing competition and slowing demand.
  • Can Tesla’s FSD software handle poor visibility effectively, or are its limitations too risky?
  • Will this investigation derail Elon Musk’s ambitious robotaxi vision?
  • What does this mean for Tesla’s future as a leader in self-driving technology?


Transcript

Hey, everybody. Welcome back to the Elon Musk podcast. This is a show where we discuss the critical crossroads that shape SpaceX, Tesla, X, The Boring Company, and Neuralink. I'm your host, Will Walden. How effective is Tesla's Full Self-Driving system under poor visibility? And what led to a new federal investigation targeting Tesla's driver assistance tech? Could this scrutiny disrupt Elon Musk's robo-taxi plans?

Now, the National Highway Traffic Safety Administration, the NHTSA, announced a preliminary evaluation into Tesla's FSD technology after four reported crashes between November 2023 and May 2024. These incidents all occurred under reduced visibility conditions such as fog, sun glare, or airborne dust. The agency's focus is whether the software is capable of detecting and responding appropriately in such scenarios. One of these incidents, in Rimrock, Arizona, involved a Tesla Model Y striking and killing a pedestrian. The crashes have prompted concerns over the system's reliability in non-ideal driving environments, leading to broader inquiries into its operational safety.

The investigation affects approximately 2.4 million Tesla vehicles equipped with FSD, spanning models released between 2016 and 2024. This includes the Model S, Model X, Model 3, and Model Y, as well as the 2023-2024 Cybertruck. The preliminary evaluation is just the first step in a potentially lengthy process that could lead to recalls if the vehicles are deemed a safety risk.

The investigation will assess whether Tesla has made updates to the FSD system to improve its handling of poor visibility. The NHTSA is also evaluating the safety implications of those updates, examining their timing, purpose, and effectiveness. Now, the four crashes under investigation represent scenarios in which FSD appeared to falter.

First, Rimrock, Arizona, in November 2023, which I already mentioned, when a pedestrian was killed after being struck by a 2021 Tesla Model Y. In January 2024, in Nipton, California, a Model 3 collided with another vehicle during a dust storm. In Red Mills, Virginia, in March 2024, a Model 3 was involved in a crash on a cloudy day. And in Collinsville, Ohio, in May 2024,

a Model 3 hit a stationary object on a foggy rural road, resulting in injuries. Now, the NHTSA is seeking to determine whether these incidents are indicative of broader, systemic issues within the Full Self-Driving system.

Tesla has long marketed its FSD software as a step toward autonomous driving, though it currently requires driver supervision. On its website, Tesla clearly states that FSD is not a fully autonomous system and advises users to remain attentive behind the wheel. The system has faced scrutiny over the years, with critics arguing that Tesla's reliance on a camera-only approach, without supplemental sensors like radar or lidar, may limit its reliability in adverse conditions.

And some industry experts suggest that weather-related challenges could be a critical hurdle for Tesla's self-driving ambitions. Now, this latest probe is not Tesla's first encounter with regulators. Earlier in 2024, the NHTSA closed an investigation into Tesla's less advanced Autopilot system after examining nearly 500 crashes. That inquiry identified 13 fatal accidents,

leading to heightened concerns about Tesla's driver assistance technology. Tesla also faces lawsuits and government investigations into its claims about Autopilot and FSD. In one high-profile case, Tesla reached a settlement earlier this year, avoiding a trial over an Autopilot-related crash. Meanwhile,

The Department of Justice has issued subpoenas related to Tesla's self-driving claims, and the California Department of Motor Vehicles has accused the company of overstating the system's capabilities. Now, Elon Musk has frequently touted Tesla's advancements in autonomous technology, positioning it as a cornerstone of the company's future. Just a week before the investigation was announced, Musk unveiled a prototype of the Cybercab, a

two-seater robo-taxi concept without a steering wheel or pedals. Now, Musk claimed that Model 3 and Model Y vehicles would operate without supervision in California and Texas by 2025, but offered no specifics on how this would be achieved. However, regulatory hurdles loom large. For Tesla to deploy vehicles without traditional controls, approval from the NHTSA would be required. This poses a huge challenge as the agency investigates Tesla's current technology

and its ability to ensure safety. Tesla's approach to autonomous driving relies exclusively on camera-based vision systems powered by artificial intelligence, a method some experts argue is vulnerable to environmental limitations. Competing companies such as Waymo and Cruise incorporate lidar and radar technology to enhance their systems' ability to navigate complex environments.

Jeff Schuster, a vice president at GlobalData, noted that weather conditions could impede the performance of camera-based systems, adding that regulatory oversight may delay Tesla's deployment of fully autonomous vehicles. And the investigation adds more pressure on Tesla as it faces increased competition in the electric vehicle market. The company's focus on self-driving technology comes amid slowing demand for its core vehicles,

and rivals are making strides in the robo-taxi space with more robust sensor arrays. Tesla's legal troubles and regulatory scrutiny could also impact investor confidence. While Tesla's stock rose slightly after the announcement of the investigation, prolonged uncertainty could weigh on the company's market position. Now, what's next? The NHTSA's preliminary evaluation is expected to last up to eight months.

If the agency identifies significant safety risks, it could escalate the investigation to an engineering analysis or recommend a recall. Tesla has not commented publicly on the latest probe, leaving questions about its next steps unanswered. The agency is also examining whether Tesla drivers are adequately adhering to instructions to remain engaged while using FSD. Past investigations into Autopilot revealed instances where drivers relied too heavily on the system,

leading to accidents. Now, Tesla has faced other fatal accidents involving its driver-assist systems. In one instance, in April 2024, a Model S operating in FSD mode struck and killed a motorcyclist near Seattle. Such incidents have drawn criticism from safety advocates, who argue that Tesla's branding of the software as Full Self-Driving can mislead users into overestimating its capabilities.

Now, the company's December 2023 recall of more than 2 million vehicles to address issues with its autopilot system also remains under review. The NHTSA is still evaluating whether those measures adequately resolve concerns about drivers not paying attention. hey thank you so much for listening today i really do appreciate your support if you could take a second and hit this subscribe or the follow button on whatever podcast platform that you're listening on

right now, I'd greatly appreciate it. It helps out the show tremendously, and you'll never miss an episode. Each episode is about 10 minutes or less to get you caught up quickly. And please, if you want to support the show even more, go to patreon.com slash stage zero. Take care of yourselves and each other, and I'll see you tomorrow.

This transcript was generated by Metacast using AI and may contain inaccuracies.