Tesla driver killed in Autopilot collision

Mark R

Diamond Member
Oct 9, 1999
8,513
14
81
http://www.freep.com/story/money/ca...tied-fatal-crash-nhtsa-investigates/86570046/

Tesla Motors acknowledged today that a driver of one of its Model S cars operating in Autopilot mode died when the semi-autonomous system failed to detect a tractor-trailer turning in front of the luxury electric car.

This is clearly a tragic event. I suppose it illustrates that while the technology is "reliable" in terms of collisions per mile driven, it is still prone to error. In this scenario, the failure appears to have been at least partly attributable to lighting conditions making the semitrailer hard to see, particularly as it appears that the driver did not brake manually either.

I know that some of the other car manufacturers and Google have publicly stated that Tesla is irresponsible in releasing Autopilot to the public before it is capable of full autonomous control. (http://electrek.co/2016/06/10/volvo-chief-tesla-autopilot-level-3-autonomous-driving/). The question now has to be whether this incident will have any greater impact on the development of autonomous vehicle technology, and what changes regulators will make to the release of the technology to consumers.
 
Feb 25, 2011
16,822
1,493
126
I know that some of the other car manufacturers and Google have publicly stated that Tesla is irresponsible in releasing Autopilot to the public before it is capable of full autonomous control.

I would tend to agree. People are dumb, so "Anchorman / who's driving the winnebago" type situations are inevitable.
 

desura

Diamond Member
Mar 22, 2013
4,627
129
101
It's probably the driver's fault, but given how liability works, Tesla was always taking a risk with this Autopilot.

Still, apparently he ended up passing beneath a tractor-trailer? Really, it's more likely a Darwin award here.
 

yh125d

Diamond Member
Dec 23, 2006
6,907
0
76
Maybe the better strategy for Tesla would have been to enable Autopilot fully, but not yet give it access to the controls; instead, monitor both what it *would* do and what the driver *does* do. Multiplied across thousands of cars and thousands of miles driven, they could gather a hell of a lot of data to help fine-tune the system. Every time a driver does something AP wouldn't have, look into it.


Or maybe they're doing exactly that too. I dunno
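The "shadow mode" idea in the post above can be sketched in a few lines. Everything here (the `Action` type, the tolerances, the divergence rule) is hypothetical illustration of the concept, not Tesla's actual telemetry:

```python
# Hypothetical "shadow mode" logger: compare what Autopilot *would* do
# against what the driver actually does, and flag divergent frames for review.
from dataclasses import dataclass

@dataclass
class Action:
    steering_deg: float  # steering wheel angle
    brake: float         # brake input, 0.0 - 1.0

def divergence(ap: Action, driver: Action,
               steer_tol: float = 5.0, brake_tol: float = 0.2) -> bool:
    """True when driver and Autopilot disagree enough to be worth reviewing."""
    return (abs(ap.steering_deg - driver.steering_deg) > steer_tol
            or abs(ap.brake - driver.brake) > brake_tol)

log = []
frames = [
    (Action(0.0, 0.0), Action(1.0, 0.0)),  # agree: tiny steering difference
    (Action(0.0, 0.0), Action(0.0, 0.9)),  # driver brakes hard, AP would not have
]
for ap, driver in frames:
    if divergence(ap, driver):
        log.append((ap, driver))  # queue for engineering review

print(len(log))  # 1 divergent frame flagged
```

Aggregated over a fleet, only the divergent frames would need human attention, which is what makes this kind of passive data collection cheap.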
 

Brovane

Diamond Member
Dec 18, 2001
5,490
1,680
136
The truck driver was at fault for turning and not having enough space to complete his turn safely.

Unfortunately due to lighting conditions neither the driver nor the auto-pilot system detected the truck ahead.

Before Autopilot is engaged, the system warns the driver that they are responsible for staying alert and maintaining control of the vehicle. Also, Tesla's Autopilot is just a more advanced version of the cruise control systems already out there in other cars for several years. Tesla added a few additional things and named it Autopilot. I suspect that if the person had been driving a Mercedes with its autonomous cruise control system activated and the same thing had happened, we would never have heard about it.

Tesla is fairly good about reviewing accident data and looking through it to see what changes can be made to the system to avoid future incidents. Just because of a couple of cars suffering under-carriage damage and their battery packs catching fire, they added a titanium shield to the underside of all vehicles. I am sure that Tesla will review this and see what adjustments can be made to the system to improve things. As with all engineering systems, you discover corner-case scenarios like this from time to time. At least Tesla will take more proactive action than GM did with the ignition switch issue.
 

HeXen

Diamond Member
Dec 13, 2009
7,832
38
91
There goes my dreams of hopping in my car and taking a nap while on my way to work. sigh
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
#1 - glad I'm not beta-testing this stuff IRL

#2 - doesn't sound like a survivable situation even without autopilot:

The male driver died in a May 7 crash in Williston, Fla., when a big rig made a left turn in front of his Tesla.

In a blog post, Tesla Motors Inc. said the 2015 car passed under the trailer, with the bottom of the trailer hitting the Model S’ windshield.

“Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said.

Seems like a sensationalist headline - a semi-truck can weigh up to 80,000 pounds legally, so if it makes a random turn in front of you in a car, what are you going to do? My heart goes out to the guy's family
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
You know, as terrible as it may sound, reading through the data, I think I would still prefer that Tesla roll out beta Autopilot software than not. I see a lot of people calling Tesla irresponsible, but really, how else are you going to collect that much data? Tesla has stated that over 130 million miles have been driven with Autopilot. Statistically, it stands a much lower chance of crashing than humans do. I'd rather take a lesser chance via machine driving than a higher chance with human driving, especially if it means this will help advance the technology to a point in the future where it's in the "five nines" of reliability.

Granted, I don't believe it will ever get that high - too many variables & real-world conditions like this one to deal with, but still - it can be improved to the point where it helps. Kind of like doctors & medicine - we can't make you live forever or save everyone, and mistakes are made sometimes - but overall, it's worth having rather than not having, and there was a period of time when a lot of experimenting had to be done & things went really wrong before we got the data nailed down & could do things mostly right. And that's where we are at now with self-driving cars & cars with enhanced safety features like auto-braking.

In 2014, we killed 32,000 people on American roads. That's nearly as high as the Korean War! Even if we could cut that number in half through automated driving, that would be amazing.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
“Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said.

Sounds like an excuse by Tesla. Trying to make it sound like a human wouldn't have seen a truck crossing the car's path.

What was the driver doing that he did not notice the giant turning truck?

Even if we accept the dubious "white on white" story for the computer, what was the driver of the Tesla doing to fail to notice the truck at all, such that there was no braking at all?
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
You know, as terrible as it may sound, reading through the data, I think I would still prefer that Tesla roll out beta Autopilot software than not. I see a lot of people calling Tesla irresponsible, but really, how else are you going to collect that much data? Tesla has stated that over 130 million miles have been driven with Autopilot. Statistically, it stands a much lower chance of crashing than humans do. I'd rather take a lesser chance via machine driving than a higher chance with human driving, especially if it means this will help advance the technology to a point in the future where it's in the "five nines" of reliability.

Granted, I don't believe it will ever get that high - too many variables & real-world conditions like this one to deal with, but still - it can be improved to the point where it helps. Kind of like doctors & medicine - we can't make you live forever or save everyone, and mistakes are made sometimes - but overall, it's worth having rather than not having, and there was a period of time when a lot of experimenting had to be done & things went really wrong before we got the data nailed down & could do things mostly right. And that's where we are at now with self-driving cars & cars with enhanced safety features like auto-braking.

In 2014, we killed 32,000 people on American roads. That's nearly as high as the Korean War! Even if we could cut that number in half through automated driving, that would be amazing.

Isn't the fatality rate on our roads actually quite low?

We have about 1 death for every 100M vehicle miles.

Tesla seems to match the overall rate, rather than being better?
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
Isn't the fatality rate on our roads actually quite low?

We have about 1 death for every 100M vehicle miles.

Tesla seems to match the overall rate, rather than being better?

Per the Tesla blog post:

1. Autopilot = 1 death per 130 million miles
2. United States = 1 death per 94 million miles
3. Worldwide = 1 death per 60 million miles

Granted, #2 & #3 are based on 100 years of data collection. Still...the robots are better than us at driving. As much as I'd hate to give up driving, if everybody had to switch & it made us significantly safer, it sounds like a good idea to me. But nobody wants "death by robot", so I understand the resistance for sure.
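Normalizing those quoted figures to deaths per 100 million miles makes the comparison direct. A quick sketch, using only the numbers from the post above:

```python
# Normalize the quoted fatality figures to deaths per 100M vehicle miles.
rates = {
    "Autopilot": 130e6,      # 1 death per 130M miles (Tesla blog figure)
    "United States": 94e6,   # 1 death per 94M miles
    "Worldwide": 60e6,       # 1 death per 60M miles
}
for name, miles_per_death in rates.items():
    per_100m = 100e6 / miles_per_death
    print(f"{name}: {per_100m:.2f} deaths per 100M miles")
```

This works out to roughly 0.77 (Autopilot) versus 1.06 (US) versus 1.67 (worldwide), though as the reply below notes, a single Autopilot fatality over 130M miles is a very small sample to draw a rate from.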
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
Sounds like an excuse by Tesla. Trying to make it sound like a human wouldn't have seen a truck crossing the car's path.

What was the driver doing that he did not notice the giant turning truck?

Even if we accept the dubious "white on white" story for the computer, what was the driver of the Tesla doing to fail to notice the truck at all, such that there was no braking at all?

I'm curious what the exact situation was & how they know that the driver didn't notice the truck...maybe he did & thought the car would take care of it for him? Or the truck just shot across so fast that he didn't have time to react, like if the semi did a rolling stop thinking he could make it across in time. From the blog post, this was the situation:

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S.

So the Tesla was driving down the highway & a semi-truck drove across the highway in front of the Tesla. Assuming the driver was going 65 MPH, it probably would have taken ~170 feet to stop, plus driver reaction time. Autopilot seems to work pretty well in general - video example here:

https://twitter.com/elonmusk/status/721829237741621248
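The ~170-foot figure is in the right ballpark. A back-of-envelope check with the standard braking-distance formula d = v²/(2μg), using an assumed friction coefficient and reaction time rather than any actual crash data:

```python
# Back-of-envelope stopping distance check (assumed values, not crash data).
def braking_distance_ft(speed_mph, friction=0.8):
    """Braking distance from speed: d = v^2 / (2 * mu * g)."""
    g = 32.2                        # gravitational acceleration, ft/s^2
    v = speed_mph * 5280 / 3600     # convert mph -> ft/s
    return v ** 2 / (2 * friction * g)

def reaction_distance_ft(speed_mph, reaction_s=1.5):
    """Distance covered before the driver even touches the brake."""
    return speed_mph * 5280 / 3600 * reaction_s

print(round(braking_distance_ft(65)))   # ~176 ft of braking alone
print(round(reaction_distance_ft(65)))  # ~143 ft during a 1.5 s reaction
```

So at 65 MPH the car covers over 300 feet between seeing the truck and stopping, which is why a vehicle cutting across at close range leaves essentially no options.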

Gizmodo has a bit more information:

http://gizmodo.com/fatal-tesla-crash-proves-full-autonomy-is-the-only-solu-1782923424

According to the Levy Journal police blotter, Brown’s Model S traveled beneath an 18-wheeler’s trailer that was making a left turn from a highway intersection with no stoplight.

Apparently this is what the intersection looks like:

[photo of the intersection; image not preserved]

So #1, it sounds like a crappy situation all around, #2, it was a failure of the Autopilot system to recognize the situation, and #3, it was a failure of the driver to take action (I'm guessing they pulled the data from the car & it showed no human braking activity, which is why they said the driver didn't recognize the situation either). Sad.
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
An interesting point: the driver, Joshua Brown, previously reported how Autopilot actually saved his life. He even uploaded a video of it: https://www.youtube.com/watch?v=9I5rraWJq6E

Oh dang, that's too bad - he has a lot of good videos about the Autopilot system on Youtube:

https://www.youtube.com/user/NexuJosh/videos

He seems like a super conscientious guy; he even has a video showing what the Autopilot system could not handle, so he was obviously familiar with the car's systems:

https://www.youtube.com/watch?v=GaIbu7K90CA

One of his reply comments:

You are clearly doing something wrong. Either you have no idea what it's doing and therefore put it in situations it should never be in, or something is wrong with your car. The autopilot works great and exactly like it should. You should clearly never use it again, because you're the one that is giving it a bad name. It is designed for the highway, and highways with clear markings. You are talking about head-on collisions and such, which is silly because that can't happen on the highway. Only an idiot would use this on anything other than the highway and expect perfect driving. When you drive off the highway you are simply testing things and should be extremely vigilant and know when it will fail. It's pretty easy to know when it will do well and when it will not. Just understand your sensors.

Based on the Tesla post, it sounds like he just got caught in a situation that led to an unavoidable accident. RIP man
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
The truck driver was at fault for turning and not having enough space to complete his turn safely.

Unfortunately due to lighting conditions neither the driver nor the auto-pilot system detected the truck ahead.

Before Autopilot is engaged, the system warns the driver that they are responsible for staying alert and maintaining control of the vehicle. Also, Tesla's Autopilot is just a more advanced version of the cruise control systems already out there in other cars for several years. Tesla added a few additional things and named it Autopilot. I suspect that if the person had been driving a Mercedes with its autonomous cruise control system activated and the same thing had happened, we would never have heard about it.

Tesla is fairly good about reviewing accident data and looking through it to see what changes can be made to the system to avoid future incidents. Just because of a couple of cars suffering under-carriage damage and their battery packs catching fire, they added a titanium shield to the underside of all vehicles. I am sure that Tesla will review this and see what adjustments can be made to the system to improve things. As with all engineering systems, you discover corner-case scenarios like this from time to time. At least Tesla will take more proactive action than GM did with the ignition switch issue.

Yeah, I like Tesla's proactive approach to resolving issues, even if it's only for the sake of maintaining their image as a newbie in the car domain. I don't know if I would buy a car with Autopilot or not (note: not because of this article). It sounds cool, for sure, but then you get comments like this one:

Keef Wivaneff (8 months ago)
Steve G. | OCTOBER 16, 2015
Hi, Baltimore Guy -- I just got home from my first real commute using autopilot, and I am petrified to let it drive me again, for exactly the same reason as you say:

It seems as if it's trying to get me into an accident. I don't trust it. I'm really scared now to let it drive for me.

(1) Yes, I understand it's disconcerting for folks like us who drive more on the left side of the lane, to have the car be more centered -- but that's not the issue. For 80% of the time, it's fine, and I can see the gap between my car and the lane markers when I look in my passenger side mirror. So I'm fine there.

(2) But holy crap, it's not right when I get a scary loud collision warning several times throughout my commute, and have to take control at that point 'cause I'm petrified! It literally is just as you described: it'll "hug" the right lane (when it shouldn't -- the risk is much higher on the right because there are cars and trucks there, and the autopilot is not giving them enough of a safety cushion!), and then seeming for no reason at all, just as another car is there, it will steer into the car on the right. This is crazy, I am so scared I can't trust it anymore after it happened so many times on my commute today. (Each time, I disengaged immediately, took control, and then said "OK maybe that was a fluke" and enabled it again.. only to have it happen again with a different car in a similar situation.)

Your description is exactly right -- it seems to veer towards the danger, and it's always on the right side. I don't get it! Why would it do that? Is it possible there's a bad sensor (e.g., the front-right sonar, or the right edge of the radar being clipped out for some reason) that is causing it to misjudge the situation? If the sensors don't tell the AutoPilot system that you're slowly starting to overtake a 6-ton obstacle travelling at a high rate of speed up ahead to your right, then maybe that would explain why it chooses to put you on an intercept course for that, seemingly avoiding the entirely benign center divide on your left side; such center divide being nowhere near a threat compared to this massive truck I'm about to sideswipe!

Aaaaarrrggggh! I hate to scream on public forums, but my first day actually using it has ruined my trust of the AutoPilot system.. :-( I had it do this once on a city street the previous evening, too, but I was sure it was just a fluke and that I was just not understanding what it was doing. It similarly randomly departed the main lane as I approached an intersection, and it randomly (without warning) sharply turned my car to the right, as if it was trying to kill a pedestrian on the sidewalk -- but the collision or lane departure alert went off (not sure which it was but I know that scared the heck out of me) and I grabbed the wheel and veered back onto the main road.

I'm quite sad to say, based on my experience today, whether it's just because of a bad sensor, or a serious flaw in the design at one of several levels, this issue will surely cause a major accident in the next few days, and it will mean they have to disable the beta and we'll all lose the awesome functionality (on paper).

I have to say, for the time I was not experiencing life-and-death panic as my car drove itself across the lane divider at the worst possible times seemingly for no reason other than maybe a < should have been a > in the code somewhere -- for that time when it wasn't happening, I was quite enjoying having the car drive itself.

But .. I mean, c'mon.

Steve

I really love the EyeSight feature in my wife's car...basically smart cruise control. It's just regular cruise control enhanced with an automatic distance system & auto-braking. Extremely useful for long highway trips, although I'm still alert when I use it for driving & braking. It's come in very handy more than once when someone has cut me off & braked while trying to, say, get over to an exit lane, because it can hit the brakes faster than I can, and sometimes stuff on the road happens in a split second, before you can even realize what is happening. Definitely a feature I want on my next car! Autopilot? Seems a bit too beta for me...
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
Per the Tesla blog post:

1. Autopilot = 1 death per 130 million miles
2. United States = 1 death per 94 million miles
That US statistic includes old cars, cheap cars, poorly maintained cars, compacts, etc., which skew the figure.

A fairer comparison would be against Mercedes S-Class/BMW 7 Series cars sold in the last 2 years. If we cherry-pick stats, there are vehicles that haven't had a single fatality.

http://www.iihs.org/iihs/topics/driver-death-rates
 

Brovane

Diamond Member
Dec 18, 2001
5,490
1,680
136
Yeah, I like Tesla's proactive approach to resolving issues, even if it's only for the sake of maintaining their image as a newbie in the car domain. I don't know if I would buy a car with Autopilot or not (note: not because of this article). It sounds cool, for sure, but then you get comments like this one:



I really love the EyeSight feature in my wife's car...basically smart cruise control. It's just regular cruise control enhanced with an automatic distance system & auto-braking. Extremely useful for long highway trips, although I'm still alert when I use it for driving & braking. It's come in very handy more than once when someone has cut me off & braked while trying to, say, get over to an exit lane, because it can hit the brakes faster than I can, and sometimes stuff on the road happens in a split second, before you can even realize what is happening. Definitely a feature I want on my next car! Autopilot? Seems a bit too beta for me...

The thing about Autopilot is that it is improving literally all the time, and it is learning. This same driver who tried Autopilot in 10/2015 might be surprised to find out how it functions in 6/2016. Not to mention there are dozens of stories about how much people love Autopilot and how much stress it removes from their commute. I mean, as long as we are swapping stories.

Because of over-the-air updates and how much the Model S and X are software driven, the car doesn't stop improving after purchase.

https://www.technologyreview.com/s/601567/tesla-tests-self-driving-functions-with-secret-updates-to-its-customers-cars/

Interesting article that talks about how important the over-the-air data pulls are for Tesla and how they are using this data to improve the car.
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
The funny thing is, after every update there will be people who say they've noticed improvements, but then other people will complain about something that's gotten worse. I suspect people who go directly from an Oct 2015 build to a June 2016 build will barely notice the difference, simply because the hardware is the main limiter.

When it was first enabled, tech-savvy Tesla enthusiasts found no evidence that the Model S was sending any significant amount of data or that there was any learning whatsoever. Any difference between updates could be chalked up to drivers "learning" the limits of Autopilot and differing environmental conditions.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Are people with auto-braking systems becoming dependent on them such that they wait for the car to do something when an emergency stop is required?

Are they hesitating to hit the brake pedal?

Are they losing a bit of reaction time?
 

Pulsar

Diamond Member
Mar 3, 2003
5,225
306
126
This is the second time that a Tesla vehicle has not detected an object because it was at windshield level.
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
Are people with auto-braking systems becoming dependent on them such that they wait for the car to do something when an emergency stop is required?

Are they hesitating to hit the brake pedal?

Are they losing a bit of reaction time?

That's a good question. A lot of the self-driving articles talk about how it's better for the system to take full control of the car because driver intervention could create a negative outcome in a situation where the computer is already taking a calculated response path. I know when I drive my wife's Forester, I'm always ready to hit the brake when I have Eyesight cruise on (granted, so far it's been 100% accurate for stopping properly when activated), but that's just because I work on computers all day & don't trust them :biggrin:
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
The trucker says that the Tesla driver was watching Harry Potter at the time of the crash, possibly on a separate device since supposedly the car's touchscreen doesn't allow video while in motion:

http://www.dailymail.co.uk/news/art...ar-recorded-near-miss-just-month-earlier.html

Sounds like the car went under the trailer, sliced the roof off completely, traveled a few hundred feet, and hit a telephone pole before stopping in a yard. I wonder what the programming is for that...can the car detect that it became a convertible? Does it sense that the driver isn't touching the wheel & safely pull over? The risk this runs is that the car becomes a 4,500-pound unmanned missile on the streets. What happens if you have a heart attack while driving solo? Does the car become a coffin traveling down the road by itself? Creepy to think about.
 

agent00f

Lifer
Jun 9, 2016
12,203
1,242
86
This is the second time that a Tesla vehicle has not detected an object because it was at windshield level.

That's because it's a hardware limitation. Tesla's system in general is nowhere near as advanced as Google's, and not even Google trusts its cars without a driver at the wheel except at low speeds in perfect conditions.

Frankly, Tesla is more of a fashionable design company, like Apple, and simply doesn't warrant the "technology" enthusiasm they get. What I'm getting at is that their key people are more Art Center than MIT.
 

A5

Diamond Member
Jun 9, 2000
4,902
5
81
That's because it's a hardware limitation. Tesla's system in general is nowhere near as advanced as Google's, and not even Google trusts its cars without a driver at the wheel except at low speeds in perfect conditions.

Frankly, Tesla is more of a fashionable design company, like Apple, and simply doesn't warrant the "technology" enthusiasm they get. What I'm getting at is that their key people are more Art Center than MIT.

Apple also has the best ARMv8 implementation on the planet. What a ridiculous line of criticism.
 