Federal safety officials say Tesla Inc.’s controversial Autopilot semi-automated driving aid was partly to blame for a highway crash in California last year.
The National Transportation Safety Board says the driver was inattentive and misused the system, which can steer, brake and accelerate a Tesla electric car automatically under certain conditions.
The California crash occurred when a large vehicle the Tesla was following changed lanes, revealing a fire truck stopped in the lane ahead. The Autopilot system failed to detect or brake for the truck, and the Tesla struck it at about 31 mph. There were no serious injuries.
Tesla’s owners’ manuals and the Autopilot system itself repeatedly warn users that they must remain ready to take control of their car at any moment. The manual also cautions that the system’s ability to automatically follow traffic is limited and emphasizes that Autopilot is not a collision warning or avoidance system.
But the NTSB also criticizes Tesla for making it too easy for drivers to divert their attention while their vehicles are in Autopilot mode.
The board’s report says the 47-year-old Tesla driver had purchased his car six months earlier in part because of the appeal of the Autopilot feature. He admitted to not reading the owner’s manual. According to an eyewitness cited by the report, he appeared to be looking down at a mobile device just before the crash.