Two Deaths and the Future of Autonomous Cars

1 April 2018
Donald Light

Two recent deaths have significant implications for the development of autonomous vehicles. On March 23, a man driving a Tesla Model X in Mountain View, California suffered fatal injuries when the car crashed into a concrete highway divider. Then on March 25 in Tempe, Arizona, an Uber test vehicle struck a woman pedestrian, who subsequently died.

The Tesla was in “Autopilot” mode at the time of the Mountain View crash. The Uber vehicle was operating in an autonomous mode at the time of the accident, with an emergency back-up driver in the car. The National Transportation Safety Board (NTSB) has announced it is investigating both incidents.

The Tesla accident raises more questions about its Autopilot feature. A March 30 Tesla blog post revealed that “Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

This raises more questions than it answers. If the driver had “five seconds and 150 meters of unobstructed view of the concrete divider,” would not the Tesla Autopilot have had the same time and view? And if so, what actions did the engaged Autopilot take to mitigate the collision?
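As a rough sanity check on those figures, the speed implied by Tesla's numbers can be computed directly. This is a minimal sketch: the 150 meters and five seconds come from Tesla's blog post, and the calculation simply assumes roughly constant speed over that interval.

```python
# Average speed implied by covering 150 meters in about 5 seconds,
# per the figures in Tesla's March 30 blog post.

def implied_speed_mps(distance_m: float, time_s: float) -> float:
    """Average speed (meters per second) over the given distance and time."""
    return distance_m / time_s

speed_mps = implied_speed_mps(150, 5)       # 30.0 m/s
speed_mph = speed_mps * 3600 / 1609.344     # convert m/s to miles per hour
print(f"Implied speed: {speed_mps:.0f} m/s (about {speed_mph:.0f} mph)")
```

At roughly 30 m/s (about 67 mph), five seconds is a long window by the standards of automated emergency braking systems, which sharpens the question of why no action was taken.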

This accident appears to extend the questions about Tesla’s Autopilot features posed by the NTSB’s 2017 investigation of the fatal Florida crash in which a Tesla vehicle was also in Autopilot mode.

More information is available about the Uber fatality, primarily in the form of a video released by the Tempe Police Department, along with subsequent statements by that department. The video shows the exterior view in front of the vehicle immediately before it struck the pedestrian, as well as an interior view of the emergency back-up driver during the same period.

The video is deeply disturbing for several reasons. First and foremost, it shows a woman immediately before her death. The woman, who is walking a bicycle directly across the path of the car, appears in the illumination of the car’s headlights for only one or two seconds before the car hits her. Early police reports state that neither the vehicle nor the driver slowed to any “significant” extent. The second portion of the video shows the emergency back-up driver’s eyes looking down for several intervals immediately before the accident.

Here are the immediate implications for the development of autonomous vehicles.

  1. The Uber vehicle’s radar and/or lidar devices should have been able to perceive a pedestrian walking into the car’s path AND to take at least moderately effective remedial action (swerving and/or braking). It appears that did not happen. Was this a design flaw, an equipment failure, or both?
  2. The Uber emergency back-up driver, tragically, did not have "eyes on the road" constantly. But even with "eyes on the road," how quickly could a human driver, relying on human eyesight, have seen the potential collision and taken effective action?
  3. This highlights the dangers inherent in SAE International Standard J3016 Level 3 (Conditional Automation), defined as “the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task with the expectation that the human driver will respond appropriately to a request to intervene.”
  4. Quite possibly the Tempe accident will lead developers of autonomous vehicles to avoid Level 3 and focus on Levels 4 and 5.
  5. Tesla (and the NTSB) need to determine if there were design or equipment failures in the Mountain View crash.

When all or nearly all vehicles are fully autonomous, there will still be some traffic fatalities. But, in my opinion, society will find that a much lower level of fatalities is an acceptable (though tragic) price that it is willing to pay.

Lastly, a personal note. I’ve written several times about the impact of autonomous vehicles and other technologies on auto insurance (here, here, here, and many other times). When the Uber crash occurred, coincidentally I was in Tempe, staying in a hotel about a mile and a half from the site of the accident. On the day of the crash I drove a route back to my hotel that almost took me by the crash site. I’ve also driven the route where the Tesla crash occurred several times. Those experiences brought home to me the truth that the stakes in this technology are very high, not just for insurers but for all of us.
