Tesla is coming under fire for using the term Autopilot to describe the set of driver-assist systems in its newer cars. Most recently, the German government has asked it to find a more conservative way to brand the feature, one that will be less confusing to owners. The Germans, like Consumer Reports, feel Tesla owners may be putting too much faith in the system, and as a result are increasing the risk of a serious accident.
As expected, Tesla has pushed back aggressively, saying it believes its German drivers are perfectly capable of understanding the limits of the system. The Germans may be particularly sensitive after a Tesla on Autopilot crashed into a bus there. In that case, however, Tesla reports that the driver himself said the Autopilot software had nothing to do with the accident.
Autopilots have been around for nearly a century. Early versions fitted to aircraft did little more than hold an altitude for level flight. By contrast, the newest versions can essentially fly and land a plane with little human intervention. Even with highly trained pilots, though, when automation fails it can be difficult for the human to re-engage effectively, something often called the Handoff Problem. A tragic version of that issue contributed to the Asiana crash at San Francisco Airport.
To some extent, there are always teething pains with new automation technologies. Anecdotes abound of drivers crashing after believing their vehicles’ cruise control could also steer their cars or RVs. Because those systems didn’t do any steering, though, the negative feedback came quickly. With assisted steering systems, the car may drive itself successfully for minutes or even hours, and then suddenly require human intervention.
Tesla defends its Autopilot system both by stressing that it “tells drivers they need to keep their hands on the wheel” (though it doesn’t enforce that), and by saying drivers must accept that the software is in beta. Huh? It is one thing to take personal responsibility for running beta software on a device that affects only its owner (like your laptop). But it’s another to glibly accept responsibility for hurtling a multi-ton vehicle at high speed around others with software that is not fully tested. Industry insiders at other auto companies remain shocked that Tesla has been able to do this, mostly successfully.
In Tesla’s defense, the system’s overall track record appears to be pretty good. Tesla claims over 200 million miles of driving with Autopilot engaged, and just one fatality in which the system failed. Tesla has massively overhauled the system in the wake of the now-infamous fatal accident in Florida, shifting from Mobileye’s cameras to radar as the primary sensor.
Videos of drivers looking around and narrating their own reviews of the system while it’s engaged show that not all Tesla drivers take the fine print very seriously.
Since most people drive several different brands of cars over time (whether because they have a multi-car family, rent cars, or drive a friend’s car), having a confusing array of terms to describe overlapping safety, convenience, and navigation functions is a recipe for trouble. It’s already hard to get into a rental car and sort out which safety features, if any, it includes. Tesla’s aggressive use of Autopilot to describe capabilities that other companies label with more conservative terms, such as Automatic Emergency Braking and Lane Keeping, doesn’t help.
In the meantime, expect Tesla to continue to take heat after every incident where its Autopilot is activated — whether or not it’s implicated as a cause of the crash.