Every three months, Tesla releases a safety report that shows the number of miles between crashes when drivers use the company’s driver assistance system, Autopilot, and the number of miles between crashes when they don’t use it.
These figures consistently show that crashes are less frequent when drivers use Autopilot, a suite of technologies that can steer, brake and accelerate Tesla vehicles on their own.
But these numbers are misleading. Autopilot is primarily used for highway driving, which is typically twice as safe as driving on city streets, according to the Department of Transportation. There may be fewer accidents with Autopilot simply because it is generally used in safer situations.
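The confound described above can be made concrete with a small, hypothetical calculation. The crash rates, mileage shares and the assumption that the system itself changes nothing are all invented for illustration; only the "highways are roughly twice as safe" ratio comes from the article.

```python
# Hypothetical illustration of the road-type confound.
# Assumption: city driving is twice as crash-prone as highway driving,
# and the assistance system provides NO safety benefit on any road.
CRASHES_PER_MILLION_MILES = {"highway": 1.0, "city": 2.0}  # invented figures

def overall_rate(mix):
    """Blend per-road crash rates by the share of miles driven on each road type."""
    return sum(share * CRASHES_PER_MILLION_MILES[road] for road, share in mix.items())

# Suppose the system is engaged mostly on highways...
assisted_mix = {"highway": 0.9, "city": 0.1}
# ...while unassisted miles are split more evenly across road types.
unassisted_mix = {"highway": 0.4, "city": 0.6}

print(round(overall_rate(assisted_mix), 2))    # assisted crash rate
print(round(overall_rate(unassisted_mix), 2))  # unassisted crash rate
# The assisted rate comes out ~30% lower (1.1 vs 1.6) even though, by
# construction, the system changes nothing about crash risk on any given road.
```

This is why a fair comparison would need crash rates on the same types of roads, which, as the article notes, neither Tesla nor other automakers have provided.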
Tesla did not provide data that would compare Autopilot’s safety on the same types of roads. Other automakers that offer similar systems haven’t either.
Autopilot has been on public roads since 2015. General Motors (GM) introduced Super Cruise in 2017, and Ford Motor released BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scarce. American drivers – whether they use these systems or share the road with them – are effectively guinea pigs in an experiment whose results have yet to be revealed.
Automakers and technology companies are increasingly adding features to vehicles that they claim improve safety, but these claims are difficult to verify. Meanwhile, the country’s road and street death toll has been steadily rising in recent years, reaching its highest level in 16 years in 2021.
It would seem that the added safety these technologies bring is not enough to offset drivers' poor decisions behind the wheel.
“There is a lack of data that would give the public confidence that these systems, as deployed, are living up to the expected safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of the Center for Automotive Research at Stanford University, who served as the Department of Transportation’s first chief innovation officer.
GM collaborated with the University of Michigan on a study exploring the potential safety benefits of Super Cruise, but concluded that it did not have enough data to determine whether the system reduced crashes.
Revealing the numbers
A year ago, the National Highway Traffic Safety Administration (NHTSA), the government’s auto safety regulator, ordered companies to report potentially serious crashes involving advanced driver assistance systems like Autopilot within a day of learning of them. The order said the agency would make the reports public, but it has yet to do so.
The safety agency declined to comment on the information it had gathered, but said in a statement that the data would be released “in the near future.”
Tesla and its CEO, Elon Musk, did not respond to requests for comment. GM said it reported two incidents involving Super Cruise to NHTSA: one in 2018 and one in 2020. Ford declined to comment.
The agency’s data is unlikely to provide a complete picture, but it could encourage lawmakers and drivers to take a much closer look at these technologies and ultimately change the way they are marketed and regulated.
Despite its capabilities, Autopilot does not relieve the driver of responsibility. Tesla asks drivers to stay alert and be ready to take control of the car at all times. The same goes for BlueCruise and Super Cruise.
But many experts worry that these systems, because they allow drivers to relinquish active control of the car, may trick them into thinking their car is driving itself. Then, when technology malfunctions or cannot handle a situation on its own, drivers may not be prepared to regain control as quickly as needed.
Older technologies, like automatic emergency braking and lane departure warning, have long provided safety nets for drivers by slowing or stopping the car or warning them when they stray from their lane. But the new driver assistance systems reverse this by making the driver the safety net of technology.
Safety experts are particularly concerned about Autopilot because of the way it is marketed. For years, Mr. Musk has claimed that his company’s cars are on the verge of achieving true autonomy, meaning they could drive themselves in virtually any situation. The system’s name also implies a degree of automation that the technology has yet to achieve.
This can lead to some complacency on the part of the driver. Autopilot has played a role in many fatal crashes, in some cases because drivers weren’t ready to take control of the car.
Mr. Musk has long promoted Autopilot as a way to improve safety, and Tesla’s quarterly safety reports seem to prove him right. But a recent study by the Virginia Transportation Research Council, a division of the Virginia Department of Transportation, shows that these reports are not what they seem.
“We know that cars using Autopilot have fewer crashes than when Autopilot is not used,” said Noah Goodall, a Council researcher who explores issues of safety and operation of autonomous vehicles. “But are they driven the same way, on the same roads, at the same time of day, by the same drivers?”
By analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, found that older technologies, like automatic emergency braking and lane departure warning, have improved safety. But the institute says studies have yet to show that driver assistance systems provide similar benefits.
Part of the problem is that police and insurance data does not always indicate whether these systems were in use at the time of the accident.
The federal auto safety agency has ordered companies to provide data on crashes in which driver-assist technologies were in use within 30 seconds of impact. This could give a broader picture of how these systems perform.
But even with this data, safety experts say, it will be difficult to determine whether using these systems is safer than driving without them under the same circumstances.