After an investigation, the Florida Highway Patrol has concluded that Tesla’s autopilot was not at fault in the well-known crash that occurred last summer. The finding, which echoes the National Highway Traffic Safety Administration’s (NHTSA) conclusion and Tesla’s statements on the incident, was detailed in a report released to The Verge.
The collision, which resulted in the death of the driver after the vehicle crashed into the side of a semi truck that was turning, gained notoriety after the discovery that the car was using Tesla's Autopilot feature. Autopilot is semi-autonomous: it can maintain a car's speed and lane position, but it still requires the driver's attention and prompts them to keep their hands on the wheel.
Tesla’s investigation into the crash had noted that the Autopilot system failed to detect the high white side of the truck against the clear sky, and that if the vehicle had impacted the wheels or cab of the truck, the crash avoidance systems would have been more effective. However, the car hit between the wheels, and the Autopilot systems had a hard time detecting the high-riding truck bed.
Human error appeared to be the culprit in this case, as the separate investigations note that the two drivers "failed to observe" each other before the collision. The Tesla's driver "took no evasive action" to prevent the crash, and the car's telemetry data showed that the brake pedal was never pressed. Ultimately, investigators determined that the semi driver was at fault, since he did not yield the right of way when attempting a left-hand turn.
Earlier, NHTSA had noted that the collision warning system was not designed to alert drivers to vehicles crossing or turning across lanes. The agency also found that Tesla vehicles suffered 40 percent fewer crashes after the introduction of the Autopilot feature, lending credence to the carmaker's claim that the feature is statistically safer than manual driving. Tesla CEO Elon Musk firmly believes the feature can improve safety and save lives.
However, this incident shows that systems like Autopilot are not yet ready for full autonomy. It further underscores the need for drivers using the system to pay attention and act as a failsafe when necessary.