Tesla crash shows that man and machine must cooperate


Almost as soon as news of a fatal crash involving Tesla’s Autopilot broke last year, both fans and critics of the electric car maker were sure they knew the cause of the tragedy. Tesla’s supporters and investors never doubted that the system improves safety, so the driver must have failed to heed Tesla’s warnings to stay alert. Detractors and short sellers were equally certain that Autopilot had failed to protect the driver, letting him drive straight into a tractor-trailer at 74 mph.

After more than a year of debate, a conclusive answer is finally within reach, thanks to a National Transportation Safety Board investigation whose final results were released this week. But the board’s conclusions are not likely to please either side: rather than blaming man or machine alone, they suggest that the human driver and the Autopilot system – particularly the complex relationship between the two – both contributed to the fatal crash.

At the heart of the problem is a dangerous dynamic: with billions at stake in the race to develop self-driving car technology, automakers have a strong incentive to make the vehicles on sale today appear “self-driving”. But as the NTSB has made clear, no vehicle currently on the market is capable of safe autonomous driving. When consumers take high-tech hype at face value, a deadly gap between perception and reality can open up.


Tesla reaped months of rave coverage and billions in market capitalization by touting its Autopilot system as more autonomous than any other advanced driver assistance system, even as it warned owners to stay alert and in control at all times. Although Autopilot performed better than rival driver assistance systems, a key to its success was the lack of Tesla-imposed limits on its use. Because Autopilot let owners drive hands-free anywhere, even on roads where Tesla warned such use would be unsafe, the company was able to cultivate the perception that its system was more autonomous than the others.

But Autopilot was actually designed for use on well-marked, protected highways without the risk of cross traffic. So when a tractor-trailer crossed Florida’s Highway 27 last May and the Tesla rammed into it without triggering any of its safety systems, Autopilot was working exactly as intended. The problem was that it was being used on a road whose conditions it was not designed for, and the driver had apparently been lulled into complacency. Far from failing, Autopilot was actually so good that it tricked the driver into believing it was more capable than it actually was.

This complex failure, to which both man and machine contributed, sounds an important warning about self-driving technology: until these systems are so good that they need no human intervention, the human driver must remain at the center of semi-autonomous system design. Engineers have to assume that if there is a way for people to misuse these systems, they will. Equally important, companies need to understand that if they overhype the capabilities of a semi-autonomous driving system in hopes of getting ahead in the race for autonomy, they risk making the technology less safe than an unassisted human driver.

There is a lesson to be learned here from aviation. As computers and sensors improved in the 1980s, aircraft manufacturers began automating more and more controls simply because they could. Only later did the industry realize that adding automation for automation’s sake actually made planes less safe, so it redesigned its autopilot systems around the principle of human-centered automation. Only when automation is deployed in ways that improve pilot performance does safety really improve.

If anything, this dynamic will be even more pronounced with automobiles, which are used far more than airplanes and by far less-trained people. But unlike aircraft manufacturers, which join forces to improve safety across their industry, automakers and tech startups are locked in fierce competition for the real or perceived lead in the race for autonomy.

As long as consumers care more about the futuristic cool factor of hands-free operation than about using technology to become safer drivers, the potential for a dangerous gap between the perception and the reality of self-driving technology will remain. And what a shame it would be if this technology, which may one day save tens of thousands of lives each year, made cars less safe in the short term.

© 2017 Bloomberg L.P.

