Faulty driving assistance system | Recall of more than 2 million Tesla vehicles

(Detroit) Tesla is recalling almost all vehicles sold in the United States, more than two million cars, for a software update to fix a faulty system meant to ensure drivers pay attention when using the Autopilot system.



Then, a little later on Wednesday, Transport Canada indicated that Tesla was recalling at least 193,000 cars to correct the same problem.

The federal agency says Tesla will provide an over-the-air software update to improve advanced driver assistance features in its cars sold in Canada.

The agency’s reaction comes after US safety regulators investigated a series of accidents when Tesla Autopilot, which is its partially automated driving system, was used.

Tesla has recalled more than two million cars from its range, produced between October 5, 2012 and December 7 this year, following the investigation by US safety regulators.

Documents released Wednesday by U.S. regulators say the update will increase warnings and alerts to drivers and even limit areas where basic versions of Autopilot can operate.

The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that occurred while the Autopilot partially automated driving system was in use. Some have proven fatal.

The agency says its investigation found that Autopilot’s method of ensuring drivers are paying attention may be inadequate and may lead to “predictable misuse of the system.”

The added controls and alerts “will further encourage the driver to adhere to their continued driving responsibility,” the documents state.

But safety experts said that while the recall is a good measure, it still places the blame on the driver and doesn’t address the underlying problem, which is that Tesla’s automated systems have trouble spotting obstacles in their path and stopping for them.

PHOTO PROVIDED BY TESLA — Tesla Model Y

The recall affects models Y, S, 3 and X produced between October 5, 2012 and December 7 of this year.

Tesla shares fell more than 3% on Wednesday.

Autopilot includes features called Autosteer and Traffic Aware Cruise Control. Autosteer is intended for use on limited-access highways unless it is operating together with a more sophisticated feature called Autosteer on City Streets.

The software update will limit the use of Autosteer. “If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage,” the recall documents state.

Depending on the Tesla’s equipment, additional controls include “increased prominence” of visual alerts, a simplified procedure for turning Autosteer on and off, and additional checks when Autosteer is used outside controlled-access highways and when approaching traffic-control devices. A driver could be suspended from using Autosteer if they repeatedly fail to “demonstrate continuous and sustained driving responsibility,” according to the documents.

According to recall documents, agency investigators met with Tesla beginning in October to explain “tentative findings” regarding the monitoring system repair. Tesla disagreed with the NHTSA analysis, but agreed to the recall on December 5 in an effort to resolve the investigation.

For years, auto safety advocates have called for stricter regulation of the driver monitoring system, which primarily detects whether the driver’s hands are on the steering wheel. They called for cameras to ensure the driver is paying attention, as other automakers do with similar systems.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that does not address the lack of night-vision cameras to monitor drivers’ eyes, nor the fact that Teslas can fail to spot obstacles and stop for them.

“The compromise is disappointing because it doesn’t solve the problem of older cars not having adequate hardware to monitor the driver,” Koopman said.

Mr. Koopman and Michael Brooks, the executive director of the nonprofit Center for Auto Safety, argue that crashes into emergency vehicles are a safety defect that goes unaddressed. “The investigation does not get to the root of the problem,” Brooks said. “It doesn’t answer the question of why Teslas on Autopilot don’t detect and respond to emergency activity.”

Mr. Koopman added that NHTSA had apparently decided that the software change was the most it could get from the company, “and the benefits of doing it now outweigh the costs of spending another year arguing with Tesla.”

In its statement Wednesday, NHTSA said the investigation remains open “as we monitor the effectiveness of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”

Autopilot can automatically steer, accelerate and brake within its lane, but it is a driver assistance system and cannot drive itself, despite its name. Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught driving drunk or even sitting in the back seat.

In its defect report filed with the safety agency, Tesla said Autopilot’s controls “may not be sufficient to prevent driver misuse.”

NHTSA has dispatched investigators to 35 Tesla crashes since 2016 in which the agency suspects the vehicles were operating with an automated system. At least 17 people were killed.
