Autonomous Car Accidents

Autonomous car accidents can be avoided with a smart customer success initiative


Forbes describes autonomous driving technology as a journey from fanciful science fiction to reality1. That journey from dream to reality rarely goes as planned, and the obstacles are at their worst when they affect an uninvolved third party. This is exactly what happened on March 18, 2018, when Elaine Herzberg died after being struck by an Uber test vehicle2. She was a pedestrian crossing the road, unaware of the test vehicle. This was not the first fatality involving (partially) autonomous cars: five driver fatalities involving Tesla Autopilot have been reported since 20163.

Consequences

This series of accidents has raised serious concerns over the safety and future of the technology4, which can slow down its adoption. A similar phenomenon occurred in the early 20th century with unmanned automatic elevators: adoption of the (then) new technology took a few decades5, largely as a consequence of negative public perception. A similar perception of (partially) autonomous vehicles can have a major impact on the evolution of the technology, and the journey from science fiction to reality may get extended.

Adoption is critical to the evolution of the technology because the Artificial Intelligence (AI) that (occasionally) controls these vehicles requires training to develop into fully autonomous driving technology. Beyond sheer volume, the AI needs data from a diverse set of legislations and geographies: an AI experienced in driving in the Netherlands will not work in India because of differences in rules, driving practices, pedestrian behaviour and infrastructure. Negative public opinion will confine the 'experience' of the AI within certain boundaries, and the tighter the limits, the slower the evolution. Society may have to wait longer to enjoy the safer rides, more efficient traffic and environmental sustainability that the technology promises6.

Negative perception can have disastrous consequences for the companies involved as well. It can hurt sales and existing orders; the grounding of the Boeing 737 Max after a series of accidents is a well-known example7. It can also slow down the AI learning of the company involved in the accidents, as in the case of Uber, which had to stop testing on public roads for nine months after the accident8.

Call for action

One way to avoid these consequences is to prove that autonomous driving, at its current stage of development, is safer than manual driving. Research shows that advanced automatic safety systems (such as collision warning and emergency braking) reduce crashes9, but the same cannot be said about the current state of autonomous driving technology. One study shows that most crashes involving autonomous vehicles were caused by humans10; however, the numbers are not yet statistically significant enough to support any firm conclusion.

The other way could be to improve the technology itself, which is not fully mature yet. There are, however, technical limitations and obstacles11. Maturity can be achieved through more testing and more training of AI models, but this not only takes time; the evolution may also get trapped in the vicious cycle shown in Figure 1.

Hence, the solution has to be something other than mere technological improvement.
A close observation reveals that the drivers were inattentive just prior to the accidents12 13, despite the known fact that the technology is not mature and that the sensors may fail. There could be two reasons for this.

First, the marketing communication and nomenclature from the Original Equipment Manufacturers (OEMs) could be misleading14. This is not an unknown phenomenon: during the early days of automation in the aviation industry it was so common that a new term, 'mode confusion', was coined to describe a limited understanding of automatic modes15. For example, an auto cruise mode may work perfectly on the highway, but the driver still needs to remain attentive to traffic signals. Failure to understand this limitation can have disastrous consequences.
Second, there could be a tendency to perceive a feature as a convenience rather than a safety aid, i.e. once automation in vehicles reaches higher levels and becomes reliable, drivers pay less attention to the road. For example, the driver in the Uber accident was watching a video16. Research has shown that drivers' reaction times increase while driving (partially) autonomous vehicles17.
So, the focus should be on ensuring that drivers do not lose attention because of the technology itself. This means awareness, periodic training and reinforcement are necessary, which is also what the aviation industry did after discovering similar issues18. This has to go beyond uploading training videos to a website and sending newsletters. The message should be that the current state of autonomous driving still requires the driver's attention: these technologies should be used as a safety mechanism, not a failsafe mechanism. A smart reward mechanism can be designed to ensure that customers understand this. An example could be an OEM launching a customer success program that rewards customers for completing trainings and reinforcements. Rewards can vary in shape and size depending on customer preference: some may prefer a rebate on their insurance payment, while others may prefer a ticket to a tennis tournament. A minimal sketch of how such a reward scheme could be wired up is shown below.
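The following sketch illustrates one possible shape of such a reward scheme, assuming a hypothetical OEM customer success program: a reward is granted only once a driver has completed all required awareness modules, and the reward itself follows the driver's stated preference. The module names, reward catalogue and class structure are illustrative assumptions, not an existing programme or API.

```python
# Minimal sketch of a hypothetical customer success reward scheme.
# Module names, reward options and the Customer structure are illustrative
# assumptions, not an existing OEM programme or API.
from dataclasses import dataclass, field
from typing import Optional, Set

# Awareness modules every driver is assumed to complete (hypothetical names).
REQUIRED_TRAININGS: Set[str] = {
    "system_limitations",
    "handover_procedure",
    "sensor_failure_modes",
}

# Rewards vary by customer preference (hypothetical catalogue).
REWARD_CATALOGUE = {
    "insurance_rebate": "Rebate on the next insurance payment",
    "event_ticket": "Ticket to a partner tennis tournament",
}

@dataclass
class Customer:
    name: str
    preferred_reward: str                      # key into REWARD_CATALOGUE
    completed_trainings: Set[str] = field(default_factory=set)

def grant_reward(customer: Customer) -> Optional[str]:
    """Grant the preferred reward once all required trainings are completed."""
    if REQUIRED_TRAININGS <= customer.completed_trainings:
        return REWARD_CATALOGUE.get(customer.preferred_reward)
    return None  # training incomplete: no reward yet

# Example: a driver finishes all three modules and prefers an insurance rebate.
driver = Customer("A. Driver", "insurance_rebate", set(REQUIRED_TRAININGS))
print(grant_reward(driver))  # -> "Rebate on the next insurance payment"
```

The design choice worth noting is that the reward is tied to verified completion of the limitation-awareness training rather than to vehicle usage, so the incentive targets exactly the behaviour the accidents exposed: drivers who overestimate what the technology can do and stop paying attention.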
To summarize, since attentive drivers could have avoided the accidents mentioned above, self-driving companies should focus on improving drivers' attention and providing clarity on the scope of the technology. Once this is achieved, such accidents can be avoided and public confidence revived, enabling faster evolution of the technology (Figure 2).

Figure 1: Vicious cycle slowing the evolution of the technology.
Figure 2: Revived public confidence enabling faster evolution of the technology.


  1. https://www.forbes.com/sites/cognitiveworld/2019/09/26/what-happens-with-self-driving-cars-kill-people/#3853dbdc405c

  2. https://www.bloomberg.com/news/articles/2018-03-22/video-said-to-show-failure-of-uber-s-tech-in-fatal-arizona-crash

  3. https://en.wikipedia.org/wiki/Self-driving_car

  4. https://www.forbes.com/sites/cognitiveworld/2019/09/26/what-happens-with-self-driving-cars-kill-people/#3853dbdc405c

  5. https://www.npr.org/2015/07/31/427990392/remembering-when-driverless-elevators-drew-skepticism

  6. https://en.wikipedia.org/wiki/Self-driving_car

  7. https://edition.cnn.com/2019/07/24/business/boeing-loss/index.html

  8. https://www.wired.com/story/uber-crash-elaine-herzberg-anniversary-safety-self-driving/

  9. https://www.consumerreports.org/car-safety/iihs-report-rear-crash-prevention-technology-works/

  10. https://www.axios.com/california-people-cause-most-autonomous-vehicle-accidents-dc962265-c9bb-4b00-ae97-50427f6bc936.html

  11. https://en.wikipedia.org/wiki/Self-driving_car

  12. https://www.bloomberg.com/news/articles/2018-03-22/video-said-to-show-failure-of-uber-s-tech-in-fatal-arizona-crash

  13. https://www.ntsb.gov/news/press-releases/Pages/NR20190904.aspx

  14. https://www.bbc.com/news/business-44159581

  15. https://www.consumerreports.org/cars-experts-warn-of-new-crashes-from-automated-driving-systems/

  16. https://www.axios.com/california-people-cause-most-autonomous-vehicle-accidents-dc962265-c9bb-4b00-ae97-50427f6bc936.html

  17. https://pdfs.semanticscholar.org/881e/b0b935692051430bc898fb5f67d58f2cf494.pdf?_ga=2.41876256.381489914.1568190665-1450601717.1562172122

  18. https://www.consumerreports.org/cars-experts-warn-of-new-crashes-from-automated-driving-systems/