
Tesla to recall more than two million cars over Autopilot safety fears

Tesla will recall more than two million vehicles over worries that its Autopilot system can be misused by drivers, following a two-year investigation by car safety regulators in the US into crashes involving the driver-assistance technology.

The recall covers certain Tesla Model 3, S, X and Y vehicles sold between 2012 and 2023, the US National Highway Traffic Safety Administration (NHTSA) said. The regulator warned that Tesla’s Autopilot system may not have sufficient controls in place to prevent “driver misuse”.

It said the risk of a crash increases when Autopilot is engaged and the driver does not maintain control of the vehicle or is unprepared to intervene. Its conclusions were based on a review of 956 crashes in which Autopilot was reportedly in use.

Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while Enhanced Autopilot can assist with lane changes on highways; neither makes the vehicles autonomous.

In a statement the regulator said: “Automated technology holds great promise for improving safety, but only when it is deployed responsibly. This action is an example of improving automated systems by prioritising safety.”

It is unclear whether the recall affects Tesla vehicles in the UK. All of its models are sold with Autopilot as standard, but UK law prevents autonomous vehicles from being used on the road unless they are part of an approved trial.

Tesla said it disagreed with the regulator’s findings but would send over-the-air software updates that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged”.

The update would also eventually suspend a driver from using Autosteer if they “repeatedly fail to demonstrate continuous and sustained driving responsibility while the feature is engaged”, Tesla added.

Tesla revealed in October it had been served with legal subpoenas from the US Justice Department about its Full Self-Driving (FSD) and Autopilot systems.

Bryant Walker Smith, a University of South Carolina law professor, told Reuters that a software-only fix would be fairly limited. The recall “really seems to put so much responsibility on human drivers instead of a system that facilitates such misuse,” he said.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that fails to address the lack of night-vision cameras to watch drivers’ eyes, as well as Tesla’s failures to spot and stop for obstacles. “The compromise is disappointing because it does not fix the problem that the older cars do not have adequate hardware for driver monitoring,” he said.

Professor Koopman and Michael Brooks, of the US safety advocacy group the Center for Auto Safety, criticised Tesla’s software update. “It’s not digging at the root of what the investigation is looking at,” Brooks said. “It’s not answering the question of why are Teslas on Autopilot not detecting and responding to emergency activity?”

The NHTSA said its investigation will remain open as it monitors the effectiveness of Tesla’s remedies. Tesla and the NHTSA have held several meetings since mid-October to discuss the regulator’s conclusions on potential driver misuse and Tesla’s proposed software remedies in response.
