US Investigates Tesla's Recall of 2M Autopilot Vehicles
By Frederick Pollich
April 28, 2024
The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla's recall of over 2 million vehicles in December to install new Autopilot safeguards. The move follows a series of accidents involving cars equipped with the updated software.
The NHTSA reported receiving details about 20 crashes linked to vehicles that had undergone the necessary updates under Tesla's recall program. This latest inquiry adds another level of regulatory scrutiny for Autopilot as CEO Elon Musk continues his push towards full self-driving capabilities.
Tesla is currently offering a month-long free trial and plans to debut its robotaxi on August 8th. However, concerns raised by the NHTSA's preliminary tests on updated vehicles, along with reports of the crashes described above, could jeopardize these ambitious plans.
In a related development, the agency recently concluded its almost three-year defect investigation into Autopilot, citing evidence that "Tesla's weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities." This was highlighted as creating a "critical safety gap."
In August 2021, the NHTSA launched a probe into at least 13 Tesla-related accidents, which resulted in one death and several severe injuries due to apparent system misuse by drivers.
Although Tesla has since issued additional software updates addressing these issues, it has not yet made them part of its recall process. In response, US senators Ed Markey and Richard Blumenthal have urged the NHTSA to restrict Autopilot to the roads it was designed for.
There are also concerns that the name 'Autopilot' may mislead drivers into placing more trust in automated driving than the system warrants. Despite falling shares and criticism from research organizations such as Consumer Reports, which called the post-recall changes inadequate, Tesla maintains that while Autosteer—a key component of Autopilot—helps maintain speed, following distance, and lane position, it does not make the vehicle autonomous.
The company has consistently emphasized that Autopilot is intended to support a vigilant driver who is prepared to take control at any moment. Tesla also disclosed in December that, despite disagreeing with the NHTSA's analysis, it was prepared to deploy a software update adding controls and alerts that further encourage drivers to stay attentive whenever Autosteer is engaged.
Since 2016, the NHTSA has opened more than 40 special investigations into crashes suspected of involving Tesla driver-assistance systems such as Autopilot, with reports linking up to 23 deaths to those crashes so far. The recall software upgrade enhances visual alerts and disengages Autosteer if drivers fail to heed warnings about inattentiveness.
Tesla had previously disclosed receiving subpoenas from the US Justice Department last October related to its Full Self-Driving (FSD) and Autopilot systems. Additionally, in February 2023 Tesla recalled about 362,000 US vehicles for FSD Beta software updates over concerns about compliance with traffic safety laws.