U.S. auto safety regulators said Monday they had opened a formal safety probe into Tesla Inc’s driver assistance system Autopilot after a series of crashes involving emergency vehicles.

The National Highway Traffic Safety Administration (NHTSA) said that since January 2018, it had identified 11 crashes in which Tesla models “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.”

It said it had reports of 17 injuries and one death in those crashes.

The NHTSA said the 11 crashes included four this year, most recently one last month in San Diego, and that it had opened a preliminary evaluation of Autopilot in 2014-2021 Tesla Model Y, X, S and 3 vehicles.

“The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes,” the NHTSA said in a document opening the investigation.

The probe covers an estimated 765,000 Tesla vehicles in the United States, the agency said.

The NHTSA has in recent years sent numerous special crash investigation teams to review a series of Tesla crashes.

It said most of the incidents took place after dark, and the crash scenes included control measures such as emergency vehicle lights, flares or road cones.

The NHTSA said its investigation “will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”

Before the NHTSA could demand a recall, it would first have to upgrade the preliminary evaluation to an engineering analysis.

Autopilot, which handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods, was operating in at least three Tesla vehicles involved in fatal U.S. crashes since 2016, the National Transportation Safety Board (NTSB) has said.

System Safeguards

The NTSB has criticized Tesla’s lack of system safeguards for Autopilot.

In February 2020, Tesla’s director of autonomous driving technology, Andrej Karpathy, identified a challenge for its Autopilot system: how to recognize when a parked police car’s emergency flashing lights are turned on.

“This is an example of a new task we would like to know about,” Karpathy said at a conference.

In one of the cases, a doctor was watching a movie on a phone when his vehicle rammed into a state trooper's vehicle in North Carolina.

Tesla did not immediately respond to a request for comment.

In June, the NHTSA said it had sent teams to review 30 Tesla crashes involving 10 deaths since 2016 in which advanced driver assistance systems were suspected of having been in use.

In a statement, the NHTSA reminded drivers “no commercially available motor vehicles today are capable of driving themselves … Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”

In January 2017, the NHTSA closed a preliminary evaluation into Tesla’s Autopilot covering 43,000 vehicles without taking any action after a nearly seven-month investigation.

The NHTSA said at the time it “did not identify any defects in the design or performance” of Autopilot “nor any incidents in which the systems did not perform as designed.”