A Few More Thoughts on the Tesla Accident

Another day passes and another batch of articles has come out regarding the recent Tesla Autopilot accident. Last week’s blog by Steve Kuciemba highlighted how this crash might have been avoided if connected vehicle technology had been involved. I have a few additional thoughts:

  • Full autonomy (assuming a well-tested vehicle) could also have avoided this accident. As Chris Urmson of Google states in this article, “people were doing ridiculous things in the car” (referring to people’s response to partially automated vehicles). People assume a vehicle with some level of automation can handle far more than it’s actually equipped to do. This reflects human nature: people (1) lack awareness and understanding of how partial automation works, and (2) tend to get lazy and stop focusing on driving. Note: all of these news articles may be helping to address the lack-of-awareness issue!
  • Exactly how much testing is necessary before a vehicle is allowed on the road? Automobile manufacturers would probably argue that years of testing are required before deploying new features to the public. This New York Times article quotes a Tesla executive saying the Autopilot system “had performed safely during tens of millions of miles of driving by consumers,” adding, “It’s not like we are starting to test this using our customers as guinea pigs.” What standards are required here, and how can the government determine an appropriate threshold?
  • As I’ve stated in an earlier blog post, I’ve always worried that one accident could slow (or even halt) the development and adoption of driverless vehicles. This article is one of many suggesting exactly that. I think this would be a huge loss for society, because the accidents caused by these “robo-cars” will likely be a fraction of those currently caused by human drivers.

It looks like Tesla has no plans to disable the Autopilot feature (source). I wonder if that’s a good or bad thing… Thoughts?

About Lauren Isaac

Lauren Isaac is the Director of Business Initiatives for the North American operation of EasyMile. EasyMile provides electric, driverless shuttles that are designed to cover short distances in multi-use environments. Prior to working at EasyMile, Lauren worked at WSP, where she was involved in various projects involving advanced technologies that can improve mobility in cities. Lauren wrote a guide titled “Driving Towards Driverless: A Guide for Government Agencies” on how local and regional governments should respond to autonomous vehicles in the short, medium, and long term. In addition, Lauren maintains the blog “Driving Towards Driverless” and has presented on this topic at more than 75 industry conferences. She recently gave a TEDx Talk and has been published in Forbes and the Chicago Tribune, among other publications.
This entry was posted in Driverless Car Impacts, Government Considerations, Uncategorized.

2 Responses to A Few More Thoughts on the Tesla Accident

  1. Cynthia Jones says:

    I’m personally glad that Tesla is not disabling the Autopilot feature; there is a lot to learn from the beta testing. It is important that Tesla is upping its communication with owners on “how Autopilot works and what drivers are supposed to do after they activate it,” per this WSJ article (http://www.wsj.com/articles/tesla-has-no-plans-to-disable-autopilot-feature-in-its-cars-1468340310). The death in May has people’s attention, and it’s productive to further educate Tesla drivers to improve their safety, as well as to further develop this technology to improve transportation for all of us.


  2. johnc66 says:

    I have a problem with the proposition that “full autonomy could have prevented the accident”. I understand that you mean a system designed to operate under all conditions, but this definition rests upon the assumption that we know what those conditions are.

    Given the complexity of the traffic environment (including human behavior behind the wheel), you need something on the order of one trillion miles of operation on public roads before you can claim to have covered most driving scenarios. And I stress the word “most”. Ultimately, we would need tens of millions of vehicles operating for trillions of hours in autonomous mode because driving conditions involve so many variables, including the ratio of human to system-driven vehicles.

    So the answer to the deployment of automated vehicles does not lie in how many miles or hours they are driven. Rather, the answer lies in understanding the driver functions being assumed by a given system and validating that the system performs the tasks inherent in these functions appropriately.

    For example, a lane change is a fairly complex task. Done correctly, a human driver checks the rear-view mirror for approaching vehicles, assesses their distance and speed, checks the blind spot, takes into account the weather and road conditions, double-checks the forward conditions, activates the turn signal, starts the maneuver while being alert to abort if needed, and so on.

    The regulatory approach broadly consists in designing test protocols to validate the capability of a system to fulfill these driving tasks while setting diagnostic and failsafe requirements to deal with possible failure modes. These are technical requirements that seek to minimize the impact of human behavior. Brakes are designed to work safely even if the driver slams down hard; electronic stability controls correct for driver oversteer or understeer; etc.

    The fundamental flaw in the Tesla approach is requiring the driver to continuously monitor the vehicle and traffic without designing measures into the system to ensure this behavior. In effect, you have a system that encourages the driver to “sit back and relax” while requiring that the driver remain alert and engaged. Among an array of requirements, current work on international regulations has devoted extensive attention to making sure these systems detect when conditions may prevent proper system operation, ensure continuous driver availability to intervene, and provide for transitions between the driver and the system as warranted.

