The May 7 fatal crash involving a Tesla Model S driving on Autopilot and ensuing National Highway Traffic Safety Administration (NHTSA) investigation have generated intense discussion regarding autonomous-driving systems.
But the crash may highlight another issue with the way autonomous technology is being deployed in production cars: the technology represents unknown territory for U.S. safety regulators.
The Tesla Autopilot investigation is the first to look at how autonomous-driving systems work in the real world, and how real drivers interact with them, notes Automotive News (subscription required).
That offers an opportunity to answer some questions about the true potential benefits—and possible pitfalls—of automated cars, but it's not something the NHTSA is equipped for.
Regulators will be required to determine whether the crash—in which the Model S collided with a tractor-trailer—was the result of a malfunction, an engineering flaw in the Autopilot system, or human error.
Nor are there concrete rules in place governing the operation of autonomous systems.
While other safety features are governed by very specific federal standards, the NHTSA has so far resisted issuing rules for autonomous-driving systems.
That's partly because the agency fears that regulating the technology too quickly could stifle innovation.
Both the NHTSA and U.S. Transportation Secretary Anthony Foxx have promoted increased automation—up to and including fully autonomous cars—as a potential safety breakthrough.
The agency plans to issue non-binding guidelines for automated systems later this month.
In its blog post responding to the crash, Tesla noted that "neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly-lit sky."
Tesla did not specifically comment on whether Autopilot was operating correctly, a question the NHTSA investigation will examine.
Even if the system is found to have functioned as designed, regulators may still conclude that it poses a safety risk because of its inability to handle situations like the one that led to the crash.
Since launching Autopilot last October, Tesla has been quick to note that the system is still in the "beta" testing stage, and that functionality is limited.
Yet not all drivers have heeded that warning, and there's no guarantee they will in the future.