After testing on public roads, Tesla is rolling out a new feature of its partially automated driving system designed to spot stop signs and traffic signals.
The update of the electric car company’s cruise control and auto-steer systems is a step toward CEO Elon Musk’s pledge to convert cars to fully self-driving vehicles later this year.
But it also runs contrary to recommendations from the U.S. National Transportation Safety Board, which include limiting where Tesla’s Autopilot driving system can operate because the system has failed to spot and react to hazards in at least three fatal crashes.
In a note sent to a group of Tesla owners who were picked to test the stop light and sign recognition feature, the company said it can be used with the Traffic Aware Cruise Control or Autosteer systems. The feature will slow the car whenever it detects a traffic light, including those that are green or blinking yellow. It will notify the driver of its intent to slow down and stop, and drivers must push down the gear selector and press the accelerator pedal to confirm that it’s safe to proceed.
The company warns in the note obtained by The Associated Press that drivers must pay attention and be ready to take immediate action “including braking because this feature may not stop for all traffic controls.”
The note says that over time, as the system learns from the fleet on the roads, it “will control more naturally.”
Tesla didn’t respond to multiple requests for additional details, but the website Electrek.co reported last week that the new feature is being sent to the wider Tesla fleet as part of an over-the-air software update for thousands of vehicles. The feature will arrive later in other parts of the world, the website said.
The National Highway Traffic Safety Administration, the U.S. government’s road safety agency, said in a prepared statement Monday that the agency “will closely monitor the performance of this technology,” adding that drivers must be ready to act and law enforcement agencies will hold them responsible.
Jason Levine, executive director of the Center for Auto Safety, a nonprofit watchdog group, said Tesla is using the feature to sell cars and get media attention, even though it might not work. “Unfortunately, we’ll find out the hard way,” he said.
Whenever one of its vehicles using Autopilot is involved in a crash, Tesla points to “legalese” warning drivers that they have to pay attention, Levine said. But he said Tesla drivers have a history of over-relying on the company’s electronics.
Missy Cummings, a robotics and human factors professor at Duke University, fears that a Tesla will fail to stop for a traffic light and a driver won’t be paying attention. She also said Tesla is using its customers for “free testing” of new software.
Cummings also worries that the cars will stop at green lights and that drivers won’t react in time to get moving again, causing more rear-end collisions.
The NTSB has ruled in three fatal crashes that Tesla’s Autopilot system was partly to blame, and it has expressed frustration with NHTSA for failing to act on the board’s recommendations. Last month the board, which has no regulatory powers, took the unusual step of accusing NHTSA of contributing to the cause of a March 2019 Tesla crash in Florida.
The March 1, 2019, crash in Delray Beach, Florida, killed the 50-year-old driver of a Tesla Model 3. The car was traveling 69 miles per hour (111 kilometers per hour) when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer that was crossing in its path. The car struck the trailer, which sheared off the Tesla’s roof. The report also blamed the truck driver and the Tesla driver for the crash.
NTSB Chairman Robert Sumwalt said in March that the crash was the third “where a driver’s overreliance on Tesla’s Autopilot and the operational design of Tesla’s Autopilot have led to tragic consequences.”
NHTSA said it will review the NTSB’s report.
The Delray Beach crash was remarkably similar to one in 2016 in Williston, Florida, which also killed a Tesla driver. In that crash, neither Autopilot nor the driver braked for a crossing tractor-trailer.
Tesla maintains that its vehicles operating on Autopilot are about twice as safe as those in which the system isn’t engaged. The company says that in the fourth quarter, drivers using Autopilot had one crash for every 3.07 million miles driven.