Three months later, the May 7 death of a Model S driver in Florida continues to generate headlines and spark debate on the merits of electronic driver-assistance technologies.
But a recent story points out at least one of the potential future advantages of self-driving cars: their ability to transport incapacitated occupants to their destinations.
Or, in the case of one Tesla electric-car owner suffering an acute pulmonary embolism, to the nearest emergency room.
The story appeared a few days ago in a broader discussion of these issues in Slate, with the clever title, "Code is my copilot."
Right below the title, it asked, "Tesla insists its controversial autopilot software is saving lives. Can it convince the rest of us?"
The episode was simple but dramatic: on July 26, Joshua Neally was driving his week-old Tesla Model X electric utility vehicle home from work in Missouri.
2016 Tesla Model X
He began to suffer sudden chest pains he described as "like a steel pole through my chest."
After calling his wife, he agreed to go to the nearest emergency room, believing it would be faster than waiting for an ambulance.
And he apparently programmed that destination into the Tesla's Autopilot beta software.
Neally says he doesn't remember a lot of the trip, but the car navigated 20 miles of highway for him, and he was promptly treated when he reached the emergency room.
UPDATE: An earlier version of this story incorrectly suggested the Model X had driven itself down the off-ramp. Autopilot kept the car in its lane at highway speeds, and by the time Neally had reached the exit, he was able to disengage the system, exit, and drive to the hospital.
As the article notes, Neally "wonders whether, without Autopilot, he might have lost control of the car and ... become a deadly projectile when those first convulsions struck."
It also notes that the situation is unusual, to say the least.
Neally says he pays attention with Autopilot engaged, avoiding activities that would distract him—though he also admits he "sometimes checks email or sends text messages on [the] phone."
The promise of self-driving cars has long been theoretical, with rosy visions of autonomous transportation for elderly people, the disabled, and children and teens below driving age.
The Neally case offers a practical example of the potential benefits.
But not everyone is convinced.
A rather contentious piece on the tech blog Mashable suggested yesterday that the Neally case is not, as some have suggested, a counter-example to the fatal Florida crash.
Instead, it says, the story is "an account of yet another Tesla owner misusing and over-utilizing Autopilot."
It quotes an emergency-room physician saying that 15 or 20 extra minutes would not likely have made much difference in Neally's outcome, although that was something Neally couldn't have known at the time.
It further suggests that if he was conscious enough to engage Autopilot to take him to the emergency room, and park the vehicle at the hospital, he was conscious enough to call 911 and wait by the side of the road for an ambulance.
Which choice would have produced the better outcome is impossible to know, because there's no way to run a controlled comparison of the two courses of action.
For now, Neally considers the Autopilot in his Tesla Model X potentially a lifesaver. And his case offers one example of the potential benefits of autonomous vehicles.
Such examples are likely to carry weight while the discussion remains this young—before aggregate data on Autopilot-equipped cars is thorough enough to support a solid comparison against the safety record of cars driven by live human beings.
Those human drivers exhibit widely varying degrees of attentiveness—as virtually every driver knows from daily driving.
[hat tip: Randall Hamlet]