Fatal crash sparks Tesla Autopilot investigation; Software is NOT bug-proof
Last updated on 3rd July 2016
Fatal crash sparks Tesla Autopilot investigation, https://www.youtube.com/watch?v=6Glf15CiEho, 1 min 15 secs, published on July 1st 2016, by CNNMoney
Ravi: I wouldn't trust any autopilot car, as it will have software behind it, and, as a former software techie, I KNOW that software is surely not bug-proof. Human override, with the software acting as a support system that is known NOT to be 100% bug-free, is the grounded-in-reality way to use such software-powered automation technology.
But will humans tend to lose sight of the possibility of software failure when they try out features like this autopilot, which works most of the time, and so get lulled into a false sense of security? I think that danger is real, and perhaps that's what happened in this tragedy.
Here's a related article, A Fatality Forces Tesla to Confront Its Limits, http://www.nytimes.com/2016/07/02/business/a-fatality-forces-tesla-to-confront-its-limits.html, dated July 1st 2016. The article states that it was revealed this week that the driver was using the Tesla car's self-driving mode when he was killed in the fatal crash in May this year.
The details of the accident are given here: www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html.
Ravi: A trailer truck was turning left in front of the Tesla car. The Tesla car did not stop; it hit the trailer, passed underneath it and, after emerging from under the trailer, veered off the road and hit two fences and a power pole.
Readers may want to view my blog posts:
1) The Without Warranty Wild West Software Industry, http://eklavyasai.blogspot.in/2013/03/the-without-warranty-wild-west-software.html, dated March 2013
2) A Debate on Warranty for Software, http://eklavyasai.blogspot.in/2013/03/a-debate-on-warranty-for-software.html, dated March 2013
================================================
In the associated Facebook post, https://www.facebook.com/ravi.s.iyer.7/posts/1760703307479572, one public comment was as follows:
93% of all auto accidents are driver related. Over 32k people are killed in the US every year in auto accidents. Compare the number of possible software glitches to human error and the perception changes. Software doesn't text while driving, drive with drug alcohol impairment, while tired, distracted, health problems....
----
I (Ravi) responded as follows:
I think, at least for me, the issue is about letting the public know that the software is NOT bug-proof, rather than giving a falsely optimistic (marketing hype) view of the near-infallibility of the auto pilot feature.
Let me share a small extract from my blog post, Truth Telling - A Tough Job, http://eklavyasai.blogspot.in/2012/10/truth-telling-tough-job.html, dated Oct. 2012:
I recently saw a few videos and read articles about how the great physicist Feynman faced the same challenges when he investigated the Challenger disaster. It was an eye-opener to me that even such a world-famous physicist had to face significant resistance from powerful administrators. If you have not seen it, I recommend you see this 4 min 42 sec video, Richard Feynman - Space Shuttle Challenger Investigation, http://www.youtube.com/watch?v=UCLgRyKvfp0. The official view now seems to be that Feynman did catch the real problem: http://en.wikipedia.org/wiki/Richard_Feynman#Challenger_disaster.
The wiki page above states, 'He concluded that the space shuttle reliability estimate by NASA management was fantastically unrealistic, and he was particularly angered that NASA used these figures to recruit Christa McAuliffe into the Teacher-in-Space program. He warned in his appendix to the commission's report (which was included only after he threatened not to sign the report), "For a successful technology, reality must take precedence over public relations, for nature cannot be fooled."'
My respect for the great physicist Richard Feynman went up enormously after I recently came to know of this side of him: his human goodness, and the sheer *guts* to speak the truth in the face of powerful opposition.
--- end extract from my blog post ---
You see, a life has been lost, seemingly due to excess faith placed in Tesla's auto pilot feature. That loss cannot be brushed away as a statistical probability. Who is to blame for this loss of life? That question must be asked aggressively, and the answer must be arrived at courageously and truthfully. Surely, the driver himself has to take at least part of the blame for not even attempting to brake the vehicle (unless the driver faced some medical emergency which prevented him from doing so), even as the trailer truck turned and its side was directly in front of the car's path.
For the purposes of argument let us presume, and it is a fair presumption to make given the circumstances, that the driver of the car had become negligent due to his overconfidence in the auto pilot feature of the car. Specifically, let us presume that he had taken his eyes off the road and did not even see the trailer truck turning into his car's path.
Surely, the driver is to blame in this case. But I think permission should not have been given to include such auto pilot features, which are not flawless and can cause fatal crashes, in Tesla cars sold to the general public in the USA. To presume that all buyers of such cars will strictly follow Tesla's instructions and not take their eyes off the road or hands off the wheel even while the car is in auto pilot mode is too naive and negligent, in my considered view.
If there is a reasonable possibility that the auto pilot feature may fail, then it should not be released to the public!
What happens when a new design for a car part is incorporated in a model? Let's say a new tyre design is used. I believe that the car manufacturer would have to extensively test the tyre, perhaps as per some car industry regulator norms, and would get permission to incorporate that tyre into its products sold to the general public in the USA only if it passes all the tests.
Shouldn't there be similar tests for auto pilot cars? Of course there should be. I believe that the car/vehicle industry regulator has not been able to provide such a test specification, as the technology is very new. Well, if that's the case, the technology needs to wait for such test specifications to be in place before it is provided to the general public.
Further, the industry regulator could propose safety norms like a repeated automatic check of whether the driver is watching the road and has hands on the wheel. If the driver fails this check, the auto pilot switches off automatically, or brings the car to a halt if the driver does not respond to the auto pilot being switched off and does not manually take over the car. (A rough sketch of such a check is given after this comment.)
Statistics about human errors in manually driven cars cannot justify introducing an auto pilot feature that is not tested well enough into the regular consumer market, and effectively attempting to use customers as unwitting guinea pigs, with the possibility of even fatal car crashes, to discover bugs in the system!
---end my response comment---
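As an illustration of the driver-attention safety norm suggested in my comment above, here is a minimal sketch, in Python, of a repeated check with an escalation path (warn the driver, then stop the car). All the class, function and sensor names here are hypothetical and invented purely for illustration; this is not Tesla's implementation or any real Autopilot API.
--- begin illustrative Python sketch ---
# Hypothetical sketch of a regulator-mandated driver-attention check.
# None of these names come from any real Autopilot software; they are
# invented to illustrate the warn-then-stop escalation described above.

import time
from enum import Enum


class AutopilotState(Enum):
    ENGAGED = 1           # auto pilot is steering the car
    HANDOVER_WARNING = 2  # driver has been warned to take over
    SAFE_STOP = 3         # car is bringing itself to a controlled halt


def driver_is_attentive(sensors):
    """Hypothetical check: eyes on the road (camera) AND hands on the wheel (torque sensor)."""
    return sensors.eyes_on_road() and sensors.hands_on_wheel()


def attention_monitor_loop(sensors, car, check_interval_s=2.0, warning_timeout_s=10.0):
    state = AutopilotState.ENGAGED
    warning_started_at = None

    while car.autopilot_requested():
        if state == AutopilotState.ENGAGED:
            if not driver_is_attentive(sensors):
                car.sound_takeover_alarm()
                warning_started_at = time.monotonic()
                state = AutopilotState.HANDOVER_WARNING

        elif state == AutopilotState.HANDOVER_WARNING:
            if driver_is_attentive(sensors):
                state = AutopilotState.ENGAGED      # driver responded; keep auto pilot on
            elif time.monotonic() - warning_started_at > warning_timeout_s:
                state = AutopilotState.SAFE_STOP    # no response within the timeout: stop the car

        elif state == AutopilotState.SAFE_STOP:
            car.disengage_autopilot()
            car.bring_to_controlled_halt()
            break

        time.sleep(check_interval_s)
--- end illustrative Python sketch ---
The point of the sketch is only that such a repeated, timed check with a clear escalation path is straightforward to specify, so a regulator could reasonably mandate something of this kind before such features reach the general public.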
===========================================
Some more article links on the matter:
Tesla mixes warnings and bravado about hands-free driving, http://www.deccanchronicle.com/business/autos/020716/tesla-mixes-warnings-and-bravado-about-hands-free-driving.html, dated July 2nd 2016
Tesla's statement on the matter: https://www.teslamotors.com/blog/tragic-loss
Tesla driver killed while using autopilot was watching Harry Potter, witness says, https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter, dated July 1st 2016
A Tesla Driver Died in a Crash While His Car Was on Autopilot, http://www.slate.com/blogs/future_tense/2016/06/30/a_tesla_driver_died_in_a_crash_while_his_car_was_on_autopilot.html, June 30th 2016
-------------------------------------------------
A computer scientist correspondent wrote in response to the above over email (and was OK with public sharing; slightly edited to fix typo-type issues):
Feynman was great on the Challenger disaster!
The auto pilots for cars are not perfect and it is irresponsible to imply that they are. I think I mentioned that assuming that a human can take control in an emergency is foolish - a human needs time to comprehend the situation and if that situation is too complex for the computer, the human is unlikely to react appropriately and sufficiently fast.
-----
My (Ravi) response (significantly edited version of email response):
Interesting point about a human car driver being unlikely to respond appropriately and fast enough in case of an emergency. I guess some human car drivers, for some emergencies detected by the auto pilot, may be clueless in the initial, and perhaps crucial, few seconds or minutes. But, perhaps in some cases, for a driver who is watching the road ahead, even if the car is on auto pilot, an emergency warning may help him/her grab control and avert an accident.
Here's a 14 second youtube video, https://www.youtube.com/watch?v=MrwxEX8qOxA, published on Oct. 15th 2015, in which an alert driver responds to an emergency (seemingly a failure of the auto pilot) that could have led to an accident, and averts it with the correct manual override action. Part of the description on the youtube video page, which explains the scenario in the driver's own words, is given below:
I am the proud owner of a 2015 Tesla SP90D, purchased with all available options. It is the best car I have ever owned and I love it dearly. I also own a large chunk of Tesla stock. Today my car received the anticipated version 7 software update and I was anxious to try out Autopilot near my home. After several seconds of successful hands-free driving, all hell broke loose. My car was tracking the car in front of me to successfully steer, but I was going closer to the speed limit and that car eventually drifted far ahead of me. Shortly after that, another car was coming in my car's direction from the opposite side of the road. I can only guess at what happened next. My car suddenly veered to the left, crossing the double-yellow road divider line right into its path. Had I not reacted quickly to jerk the steering wheel in the opposite direction, I might have clipped it. I post this in the hopes that it will prevent any losses to anyone using Autopilot in similar circumstances and in the sincere hope that Tesla can address the issue as soon as possible if the software can somehow be improved in detecting both oncoming vehicles and cross-traffic lane dividers to avoid steering into oncoming traffic.
--- end description from youtube video link ---
------
In response to the above comment, a user made the following public comment:
And there are hundreds of thousands of accounts of driver caused accidents. The end result is the same is it not? These are early days for this technology and it will be perfected "down the road" but people will always be people.
----
I (Ravi) responded (slightly edited):
I was unaware of how polarised this conversation could become. One young techie email correspondent got so upset with my views that he launched a verbal attack on me, implying that my (publicly expressed) views are a kind of obstacle to Tesla saving lives! I decided to disengage from that conversation as I realized that our differences were too great, and the correspondent agreed with that too!
On the other hand, three veteran computer scientist correspondents seem to have views closer to mine than to this young techie correspondent's.
I am still somewhat confused by this polarization. Mind you, I am not against automation per se as a technology though I do think that if automation results in significant unemployment among people then there is a social challenge there which needs to be adequately addressed.
I think my concerns come from the viewpoint of a responsible provider of software products & services (in the past, in my case), and perhaps it's the same with the three veteran computer scientist correspondents I mentioned earlier.
Driverless cars with very low failure rates may eventually become a superior option to human-driven cars. But the question is whether the way to get to that goal can involve significant risks to human drivers, and whether such risks, and any accidents that happen, can be considered an inescapable and inevitable cost of technological progress.
I don't really know how the majority would view the matter (and it is the majority that would play a role in determining any laws that are framed on this). But my view is that in the 21st century that we live in, we should have much less tolerance for such risks to consumers (as distinct from risks taken by well-informed and well-paid test drivers) as well as to other users of the public roads (other car drivers, pedestrians, etc.).
As an example of my view, here's a timely article I came across a little while back, Tesla and Google are both driving toward autonomous vehicles. Which company is taking the better route?, http://www.latimes.com/business/technology/la-fi-hy-tesla-google-20160701-snap-story.html, July 3rd 2016.
A small extract from it, related to Google's experience when it provided a prototype auto pilot car to its employees, who were supposed to keep their hands on the wheel and eyes on the road (as in the case of Tesla):
“Within about five minutes, everybody thought the car worked well, and after that, they just trusted it to work,” Chris Urmson, the head of Google’s self-driving car program, said on a panel this year. “It got to the point where people were doing ridiculous things in the car.”
After seeing how people misused its technology despite warnings to pay attention to the road, Google has opted to tinker with its algorithms until they are human-proof. The Mountain View, Calif., firm is focusing on fully autonomous vehicles — cars that drive on their own without any human intervention and, for now, operate only under the oversight of Google experts.
--- end extract from latimes.com ---
The article then states that Tesla took a different approach from Google, releasing its auto pilot software to people who chose to be part of a "public beta phase"!
Ravi: Tesla took the risky path and, frankly, put the "public beta phase" participants at some risk! Google took the safer route, as it saw that even its own employees tended to become negligent when using the auto pilot feature, taking their hands off the wheel and eyes off the road!
I view Google's approach as that of a more responsible provider of products and services.
But then I also realize that Tesla, by taking the risk (and putting its customers, as well as other users of the public roads, at risk), has gained tremendous public-roads experience: 130 million miles! So it has an advantage over Google. But should the USA public be willing to go along with Tesla's efforts when the public is put at some risk? What if the Tesla car had crashed into a large group of people? Would that be acceptable "collateral damage" for the USA citizenry to bear, to perfect Tesla's auto pilot technology? Well, if I were a USA citizen, I would not consider such risks acceptable and would oppose permission being granted to Tesla to do "public beta" testing of its auto pilot feature on USA public roads!
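A side note on that 130 million miles figure: one fatality over that distance is far too small a sample to settle whether the auto pilot is safer than human drivers. Here is a minimal back-of-the-envelope sketch in Python; the roughly 94 million miles per fatality USA average is an assumed figure, taken from numbers being quoted in press coverage (and, I believe, in Tesla's statement linked above) at the time.
--- begin illustrative Python sketch ---
# Rough statistical illustration (my own, not Tesla's): how little one fatality
# in about 130 million Autopilot miles tells us. The 94-million-miles-per-fatality
# USA average is an assumed figure from press coverage at the time.
from scipy.stats import chi2

autopilot_miles = 130e6
fatalities_observed = 1
usa_avg_rate = 1 / 94e6          # fatalities per mile (assumed figure)

# Exact 95% confidence interval for a Poisson count with 1 observed event.
lower_count = 0.5 * chi2.ppf(0.025, 2 * fatalities_observed)
upper_count = 0.5 * chi2.ppf(0.975, 2 * (fatalities_observed + 1))

print(f"Point estimate : {fatalities_observed / autopilot_miles:.2e} fatalities per mile")
print(f"95% interval   : {lower_count / autopilot_miles:.2e} to {upper_count / autopilot_miles:.2e}")
print(f"USA average    : {usa_avg_rate:.2e} fatalities per mile")
# The USA average falls well inside that wide interval, so the 130 million miles
# figure neither proves nor disproves that the auto pilot is safer than human drivers.
--- end illustrative Python sketch ---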
-----
[I thank youtube user RockTreeStar and latimes.com, and have presumed that they will not have any objections to me sharing the above extracts from their youtube video link/website above (short extract from latimes.com), on this post which is freely viewable by all, and does not have any financial profit motive whatsoever.]