Tesla’s Autopilot controversy began anew in 2023 when the automotive and energy company was instructed by the National Highway Traffic Safety Administration (NHTSA) to roll out an Autopilot software update to nearly every Model S, Y, X and 3 currently in use: 2,031,220 cars, to be exact.
NHTSA’s safety recall notice cites the risk that, when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent drivers from misusing the SAE Level 2 advanced driver-assistance feature.
Tesla has since issued a response, titled ‘The Bigger Picture on Autopilot Safety.’
Tesla’s Autopilot software investigation
The NHTSA conducted investigations into 956 crashes in which Autopilot was alleged to have been in use.
It narrowed these down to a set of 322 Autopilot-involved crashes and called on Tesla to stipulate the “remedies deployed across the recall scope reported in its December 12, 2023 recall filing.”
Following the update, several media outlets including The Washington Post and The Verge reported on the matter. The Post’s Geoffrey Fowler says he “didn’t feel much safer” after the update was rolled out.
Meanwhile, CleanTechnica’s Steve Hanley says “autopilot has always been a potentially misleading term that is writing checks it cannot cash.”
The NHTSA’s request for vehicle logs is publicly available through the agency.
Tesla’s Response
In an update posted on Tesla’s blog, the manufacturer says the recent articles “do not accurately convey the nature of [the] safety system.” It referred to The Washington Post as being “particularly egregious in its misstatements and lack of relevant context.”
Tesla stated the following as facts:
- Safety metrics are stronger when Autopilot is engaged.
- Autopilot features are SAE Level 2 driver-assist systems.
The company cites three-year-old data published by NHTSA and FHWA showing “that in the United States there was an automobile crash approximately every 652,000 miles.” According to “recent data,” which Tesla did not link to, the findings are “even more compelling.”
“Autopilot is ten times safer than the US average and five times safer than a Tesla with no AP tech enabled. More detailed information will be publicly available in the near future.”
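As a rough sanity check on what those ratios imply, the sketch below works through the arithmetic, assuming “X times safer” simply means X times as many miles driven per crash; the 10x and 5x multipliers are Tesla’s own, unverified claims.

```python
# Back-of-the-envelope reading of the figures quoted above, assuming
# "X times safer" means X times as many miles driven between crashes.
# The 652,000-mile baseline is the NHTSA/FHWA number Tesla cites; the
# 10x and 5x multipliers are Tesla's own, unverified claims.

US_AVG_MILES_PER_CRASH = 652_000

autopilot_miles_per_crash = 10 * US_AVG_MILES_PER_CRASH        # ~6.5 million miles
tesla_no_ap_miles_per_crash = autopilot_miles_per_crash // 5   # ~1.3 million miles

print(f"US average:             1 crash per {US_AVG_MILES_PER_CRASH:,} miles")
print(f"Tesla with Autopilot:   1 crash per {autopilot_miles_per_crash:,} miles")
print(f"Tesla without AP tech:  1 crash per {tesla_no_ap_miles_per_crash:,} miles")
```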
Tesla further states that drivers are in control of the vehicle “at all times,” whether or not they choose to use Autosteer. Additional safety measures, including torque-based and camera-based monitoring, are in place to keep tabs on a driver’s behavior.
‘Elon Mode’ explained
Mention of a ‘secret Elon Mode’ also resurfaced in 2023. Tesla vehicles are not fully autonomous: drivers need to periodically apply pressure to the steering wheel to show they are ready to take over when needed.
With ‘Elon Mode’ enabled, that restriction is reportedly bypassed, letting the car keep driving itself without the periodic hands-on-wheel prompts. When Elon Musk live-streamed himself driving while using his phone in August 2023, it only added fuel to the fire.
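To make the reporting concrete, here is a purely illustrative sketch of how a torque-based attention check and a flag that suppresses it might fit together; the elon_mode flag and the 30-second interval are hypothetical stand-ins, not details taken from Tesla’s software.

```python
def needs_nag(seconds_since_torque: float,
              elon_mode: bool = False,
              nag_interval_s: float = 30.0) -> bool:
    """Decide whether the car should remind the driver to touch the wheel.

    Illustrative only: the elon_mode flag and the nag_interval_s value are
    hypothetical stand-ins, not values taken from Tesla's software.
    """
    if elon_mode:
        # Reported effect of the hidden setting: the hands-on-wheel
        # prompt is simply suppressed.
        return False
    return seconds_since_torque > nag_interval_s


# With the flag off, 45 seconds without wheel torque triggers a reminder;
# with it on, the same stretch passes silently.
print(needs_nag(45.0))                   # True
print(needs_nag(45.0, elon_mode=True))   # False
```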
Three PhD students at the Technische Universität Berlin (TU Berlin) – Christian Werling, Niclas Kühnapfel and Hans Niklas Jacob – claim to have hacked into the inner circuit board of Tesla’s Autopilot to access the hidden ‘Elon Mode.’
The trio explains that, using voltage glitching, they were able to “extract a hardware-unique key used to authenticate Autopilot towards Tesla’s ‘mothership’.”
Their findings can be viewed on the Chaos Computer Club’s website.
About the author
Cheryl has contributed to various international publications, with a fervor for data and technology. She explores the intersection of emerging tech trends with logistics, focusing on how digital innovations are reshaping industries on a global scale. When she's not dissecting the latest developments in AI-driven innovation and digital solutions, Cheryl can be found gaming, kickboxing, or navigating the novel niches of consumer gadgetry.