In Tesla’s earnings call last week, CEO Elon Musk addressed a spate of crashes involving cars using the automaker’s Autopilot system, noting that the crashes usually seem to involve experienced Tesla drivers who have become complacent with the system.
On Monday, an article in the Wall Street Journal revealed, citing “people familiar with the discussions,” that Tesla had considered additional sensors such as eye-tracking cameras and sensors that can determine when a driver’s hands are on the wheel but rejected the suggestions, perhaps for cost but also because Musk said drivers might find the additional warnings annoying.
DON’T MISS: Fatal Tesla Autopilot crash: ‘system safeguards lacking,’ says NTSB (2016)
Some other automakers use such sensors in their adaptive driver-assist systems, such as Cadillac with its Super Cruise and Mercedes-Benz with its Distronic system. Many others use steering-wheel torque sensors similar to Tesla’s, which rely on the small movements drivers’ hands constantly make on the steering wheel to verify that the driver is paying attention.
Following another publicized crash of a Tesla Model S in South Jordan, Utah, last Friday, in which the driver reportedly told police that she had Autopilot engaged, Musk responded on Twitter, saying:
It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage https://t.co/6gD8MzD6VU — Elon Musk (@elonmusk) May 14, 2018
Adaptive driver assist systems such as Autopilot have indeed been shown to reduce crashes by compensating for momentary lapses of driver attention.
These systems include advanced safety features such as automatic emergency braking, which can apply the brakes to avoid an imminent crash, and active steering, which can intervene to keep the car from running off the road.
Those systems are not designed, however, to take over driving for long periods of time. They lack the sensors and software that self-driving test cars such as those being operated by Waymo and Uber rely on to identify a wide variety of traffic hazards.
It seems some Tesla drivers have become overconfident in Autopilot, which initially allowed drivers to keep their hands off the wheel for as long as 30 minutes.
After a series of safety upgrades that added more frequent warnings and shut the system down sooner when drivers fail to demonstrate control of the car, some Tesla drivers have expressed frustration on owner forums about the frequency of warnings and automatic shutdowns.
In response to the Wall Street Journal article, Tesla issued a statement saying, “…On the Autopilot team…we make decisions based on what will improve safety and provide the best customer experience, not for any other reason.”
A spokesman followed up with the Journal and added: “We’ve explored many technologies and opted for the combination of a hands-on-wheel torsion sensor with visual and audio alerts, and we will of course continue to evaluate new technologies as we evolve the Tesla fleet over time.”