Throughout The Car Industry
Pay attention; you're still in the driver's seat
The dominoes are falling.
On May 7th, a Tesla Model S strikes a tractor-trailer in Williston, Florida. The Model S driver is killed.
On June 28th, the National Highway Traffic Safety Administration (NHTSA) opens an investigation of Tesla regarding the fatal accident; its main concern seems to be the Automatic Emergency Braking (AEB) in Tesla's Autopilot driver-assist system.
By July 6th, the Securities and Exchange Commission (SEC) questions whether Tesla should have disclosed information about the fatality in a filing regarding a planned sale of $2 billion in stock.
On July 12th, the National Transportation Safety Board (NTSB) announces a probe into the May 7th incident, aiming to determine whether the drive to produce autonomous cars needs to be slowed in the name of safety.
Yesterday, July 14th, Consumer Reports (CR) publicly implores Elon Musk to disable the Autopilot system that reportedly was in use during that accident as well as at least two others.
The same day, the U.S. Senate Committee on Commerce, Science and Transportation requests, by letter, that Tesla send representatives to Washington by the end of the month to brief the committee on the details of the May 7th event, and on whether Tesla failed to educate drivers on Autopilot's limitations (Consumer Reports shares that concern and also questions Tesla's marketing practices).
Elon Musk seems to have become the automotive industry’s Hillary Clinton.
Tesla’s dramas and traumas are serious problems for the automaker, and the repercussions could make enough waves to upset quite a few boats in the industry. While Tesla’s stock has been doing well since June 24th, despite the stream of investigations, its continued health depends on findings and decisions yet to come. Other auto builders could find themselves and their driver-assist systems being more closely scrutinized and possibly placed under tighter regulations and higher specifications.
Mobileye, which produces Tesla’s AEB, could feel some negative reactions. The company’s Chief Communications Officer, Dan Galves, said on July 2nd, “We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.” That might be a problem for the other automakers Mobileye now supplies, such as Nissan with its just-unveiled ProPilot, or plans to supply in the future, such as BMW.
Or is it possible that the limitations of AEB in the Tesla are not the real problem? Maybe that is why Nissan, BMW, and others are standing in line to use the technology.
One thing that must be considered in the midst of all this is whether the driver, who had to opt in to volunteer as a beta tester of Autopilot, was paying attention, and whether he was responding to the alarms and warnings to place his hands on the steering wheel (which the semi-autonomous system requires). Yes, there is the LTAP problem with Autopilot: because the system didn’t perceive the truck as a hazard, it didn’t sound a warning, and researchers at the University of Iowa have found that an early warning helps even distracted drivers respond more quickly and avoid rear-end collisions. At the same time, the time required for a driver to recognize a problem and react can be as little as one second or as much as 3.5 seconds.
Determining this is crucial to the inquiry; driver error cannot be ruled out completely. It also raises the question of how people perceive the driver-assist system and what they think it can do and expect it to do, regardless of what information and training they may be given. Remember, many others who opted in are still out there, playing by the system’s rules, so if there is a real, major problem with Autopilot, why aren’t more of them making headlines and prompting investigations?
Another important consideration here is directly related to people versus machine: Who is liable when accidents happen? That determination will be the key to the future of autonomous cars. The attorneys will become the decision makers then, for they will write the regulations for the automakers and the auto owners and the auto drivers, and you can bet that the automakers will strive mightily to not be at the front of the line when responsibility is handed out.
So, the move toward autonomous cars may be set back for a little while; it won’t be stopped or banned. That is not going to happen, but there may be a more cautious approach in development and implementation, all promoted as looking out for your safety. Actually, it will mostly be trying to prevent being held responsible if anything goes wrong… goes wrong… goes wrong.
Meanwhile, until the Jetsons hit town, it will come back to this: Keep both hands on the wheel, your eyes on the road, buckle up, and pay attention.
Tags: Tesla, Elon Musk, Autopilot, NHTSA, Consumer Reports, SEC, NTSB, U.S. Senate Committee on Commerce, Science and Transportation, Mobileye