Tesla driver killed in auto-pilot collision


HeXen

Diamond Member
Dec 13, 2009
7,832
38
91
People in the comments are quick to criticize this, but really, c'mon - what's the point of a self-driving car if you don't do something else instead of driving?

Agree, I'd want to use that extra time to take a nap, read the paper, eat my lunch or whatever. If I have to pay attention, I might as well just drive it myself. Plus, if the car is about to crash into something, you might just assume it's about to stop on its own at any second, until it's too late by the time you do try to react.
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
So basically they are vulnerable to any T-bone situations? I regularly drive on country roads that have 4-way intersections. People will blow through those regularly without stopping. You have to be on the lookout to see if they are slowing down and use your judgement to decide if you need to be ready to brake.

I assume the auto detect tech wouldn't pick up on that type of situation?

Yes, that is correct. The current iteration of Tesla's Autopilot system is designed for highway use & actively avoiding rear-end collisions.



Lateral-cross detection isn't coming for a couple more years:

Mobileye, the company behind some of the sensor tech involved in the Tesla, said in a statement to Electrek that "today's collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon." It's planning to add detection for that type of incident beginning in 2018, while Tesla again pointed out that the combination of the box truck's design and color resulted in the automatic braking not firing.

Obviously it's a very tricky thing to do right; it's been mentioned that the radar doesn't really work above windshield level because it has to filter out overhead signs, reflections, lights, all that sort of stuff. There's an awful lot of information to process to avoid constant braking from false positives. One of the reasons the automatic braking didn't fire was that it was looking for a standard-size bumper rather than the extra-high ground clearance of a trailer's side; that, coupled with the white color against a bright sky, basically meant the system had no chance, especially with the driver distracted watching a movie. You can see from this accident photo of a Honda Civic that the trailer side is about windshield height:



Autopilot programming is unfortunately going to end up like small airplane rules...a lot of them will be written in blood. You just can't account for absolutely every situation out there...in the case of Tesla's self-driving system, it expects to be driving down a highway with cars facing the same direction, not with semi-trucks perpendicular to it on the road. Still, it seems like this obstacle should have been detected...what if there were a landslide or a hanging overpass sign or something crazy like that, you know?
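Just to illustrate the kind of height gating being described above, here's a toy sketch in Python. This is not Tesla's or Mobileye's actual logic, and the thresholds are invented purely for the example:

Code:
# Toy illustration only: NOT Tesla's or Mobileye's real code.
# The idea: returns well above bumper height are assumed to be overhead
# structures (signs, bridges) and are filtered out so the car doesn't brake
# for every sign it drives under. A high, flat trailer side ends up on the
# wrong side of that filter.

BUMPER_MAX_HEIGHT_M = 1.2     # invented threshold: typical bumper height
OVERHEAD_MIN_HEIGHT_M = 1.5   # invented threshold: anything higher is "overhead"

def should_brake_for(return_height_m: float, closing_speed_mps: float) -> bool:
    """Crude decision: is this return a rear-end obstacle worth braking for?"""
    if return_height_m >= OVERHEAD_MIN_HEIGHT_M:
        return False   # treated as an overhead sign/bridge -> ignored
    if return_height_m > BUMPER_MAX_HEIGHT_M:
        return False   # too high to look like a bumper -> ignored
    return closing_speed_mps > 0

# A trailer side at roughly windshield height gets filtered out even though
# the car can physically drive into it:
print(should_brake_for(return_height_m=1.6, closing_speed_mps=30))  # -> False
print(should_brake_for(return_height_m=0.6, closing_speed_mps=30))  # -> True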
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
Agree, I'd want to use that extra time to take a nap, read the paper, eat my lunch or whatever. If I have to pay attention, I might as well just drive it myself. Plus, if the car is about to crash into something, you might just assume it's about to stop on its own at any second, until it's too late by the time you do try to react.

That's exactly it - they mentioned the accident happened in Florida, which has laws against video in view of the driver, but iirc the driver was from Ohio or someplace where there is no law on that. And again, really, who is going to buy a self-driving car just to keep their hand on the steering wheel & be alert the whole time? Kinda pointless, despite what they make you agree to when you enable the system in the car.
 

agent00f

Lifer
Jun 9, 2016
12,203
1,242
86
Apple also has the best ARMv8 implementation on the planet. What a ridiculous line of criticism.

That's not why your girlfriend buys iphones.

That hasn't been the experience of drivers using the system. No, the hardware isn't the main limiter; it is the software.

If hardware isn't the limitation, then Google wouldn't still be limited by its substantially more sophisticated hw. But it's also plausible, given the replies here, that Tesla fans have easily met expectations.
 

hans007

Lifer
Feb 1, 2000
20,212
17
81
That's not why your girlfriend buys iphones.



If hardware isn't the limitation, then Google wouldn't still be limited by its substantially more sophisticated hw. But it's also plausible, given the replies here, that Tesla fans have easily met expectations.

The Google car has a roof-mounted lidar basically for situations just like what killed this guy. The hardware AND the software are a limitation in the Tesla.
 

Brovane

Diamond Member
Dec 18, 2001
5,490
1,680
136
If hardware isn't the limitation, then Google wouldn't still be limited by its substantially more sophisticated hw. But it's also plausible, given the replies here, that Tesla fans have easily met expectations.

The current system is a Level-2 system, which is basically a driver-aid system and not meant to be fully autonomous. Google is going straight to a fully autonomous Level-4 system. For the current Level-2 system the main limitation is software. If at a future date Tesla wants to move to a Level-4 system, then yes, hardware would be a limitation.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Per the Tesla blog post:

1. Autopilot = 1 death per 130 million miles
2. United States = 1 death per 94 million miles
3. Worldwide = 1 death per 60 million miles

Granted, #2 & #3 are based on 100 years of data collection. Still...the robots are better than us at driving. As much as I'd hate to give up driving, if everybody had to switch & it made us significantly safer, it sounds like a good idea to me. But nobody wants "death by robot", so I understand the resistance for sure.

Too hard to tell yet with such a small sample size from Tesla, only one death registered. However, you have to look deeper than worldwide or nationwide statistics. The reason is Tesla drivers are of a specific socio-economic class, are typically wealthier, likely more tech-oriented, and are driving a modern car. US statistics include some guy driving a rusted-out beater with questionable brakes, etc.
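Just to put a number on how little one data point tells you, here's a quick back-of-the-envelope check in Python (a minimal sketch; it assumes fatalities follow a Poisson process and uses the mileage figures from the Tesla blog post quoted above):

Code:
from scipy.stats import chi2

autopilot_miles = 130e6   # miles, per the Tesla blog post
deaths = 1                # the single recorded fatality

# Exact 95% Poisson confidence interval on the number of deaths
lower = chi2.ppf(0.025, 2 * deaths) / 2           # ~0.025 deaths
upper = chi2.ppf(0.975, 2 * (deaths + 1)) / 2     # ~5.57 deaths

# Express as "one death per X million miles"
print(autopilot_miles / 1e6 / upper)   # ~23 million miles (pessimistic end)
print(autopilot_miles / 1e6 / lower)   # ~5,100 million miles (optimistic end)

The US baseline of 1 per 94 million miles sits comfortably inside that range, so with only one death you genuinely can't say whether the Autopilot figure is better or worse.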

The issue for me here isn't that the driver couldn't have avoided the collision anyway, it's that the system didn't even try because it was blinded. Serious revision of the system is needed.

Also, they really need to drop the name "auto pilot" because that implies something that the system is not capable of.
 

agent00f

Lifer
Jun 9, 2016
12,203
1,242
86
The Google car has a roof-mounted lidar basically for situations just like what killed this guy. The hardware AND the software are a limitation in the Tesla.

Yes, the Google lidar provides complete 3D imaging vs. Tesla's camera up front.

The current system is a Level-2 system, which is basically a driver-aid system and not meant to be fully autonomous. Google is going straight to a fully autonomous Level-4 system. For the current Level-2 system the main limitation is software. If at a future date Tesla wants to move to a Level-4 system, then yes, hardware would be a limitation.

That's a pretty arbitrary distinction when Tesla allows the system to be used more or less autonomously. There are simply cases where Teslas are unsafe by design, and owners will run into them as circumstances dictate.
 

Brovane

Diamond Member
Dec 18, 2001
5,490
1,680
136
That's a pretty arbitrary distinction when Tesla allows the system to be used more or less autonomously. There are simply cases where Teslas are unsafe by design, and owners will run into them as circumstances dictate.

It is basically the distinction that other manufacturers are making with their driver-assist systems. Currently there are other car manufacturers, like Mercedes-Benz, that have systems that closely replicate what Tesla's Autopilot does.

If you think about it, Tesla is engaging in more of an Agile method of development for their driver-assist systems. You basically have a progression of updates that allow the system to get better and better at autonomous driving as more work can be off-loaded to the car. However, Tesla has stressed that the driver still needs to pay attention because they are ultimately responsible. All of this is enabled by the fact that Tesla has two-way communication, so they can download real-time data from the cars and push updates.

Google, on the other hand, has very deep pockets and can afford to take the time to make the jump directly to fully autonomous vehicles. However, even with all the real-time testing that Google is doing, they will still run into engineering corner-case scenarios once they roll out some type of vehicle that is fully autonomous.

Obviously there is an argument to be made for either method of development. However, we cannot expect that either method will avoid some fatalities along the way.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Can you edit the title to be more accurate? You know, something like "driver watching a DVD at speed had the expected thing happen, happened to be using lane keeping and adaptive cruise control"?
 

agent00f

Lifer
Jun 9, 2016
12,203
1,242
86
It is basically the distinction that other manufacturers are making with their driver-assist systems. Currently there are other car manufacturers, like Mercedes-Benz, that have systems that closely replicate what Tesla's Autopilot does.

If you think about it, Tesla is engaging in more of an Agile method of development for their driver-assist systems. You basically have a progression of updates that allow the system to get better and better at autonomous driving as more work can be off-loaded to the car. However, Tesla has stressed that the driver still needs to pay attention because they are ultimately responsible. All of this is enabled by the fact that Tesla has two-way communication, so they can download real-time data from the cars and push updates.

Google, on the other hand, has very deep pockets and can afford to take the time to make the jump directly to fully autonomous vehicles. However, even with all the real-time testing that Google is doing, they will still run into engineering corner-case scenarios once they roll out some type of vehicle that is fully autonomous.

Obviously there is an argument to be made for either method of development. However, we cannot expect that either method will avoid some fatalities along the way.

M-B & such are responsible enough about the limitations of their systems to not allow them to operate hands-free.

Can you edit the title to be more accurate? You know, something like "driver watching a DVD at speed had the expected thing happen, happened to be using lane keeping and adaptive cruise control"?

The driver had a reasonable expectation that the system would work in this situation.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
The driver had a reasonable expectation that the system would work in this situation.

You mean the one that every time you enable it reminds you to keep your hands on the wheel and eyes on the road?

The one that fusses at you when it can tell your hands are off (which it can really only do in turns, because it can only tell your hands are on by the extra force it has to exert to turn the wheel)?

The driver had a reasonable expectation that his driver assists would magically become an autonomous car? I can only assume you've only read news articles and never actually used this feature in an actual car.
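For anyone who hasn't used it, here's a rough idea of why torque-based detection only "sees" your hands when the wheel is actually loaded. This is just a toy sketch in Python with made-up numbers, not the real firmware:

Code:
# Toy sketch with invented thresholds: shows why a torque-only check can't tell
# a hand resting lightly on the wheel from no hand at all on a straight road.

HANDS_ON_TORQUE_NM = 0.3   # invented: extra resistance the steering has to feel

def hands_detected(measured_torque_nm: float) -> bool:
    return abs(measured_torque_nm) >= HANDS_ON_TORQUE_NM

print(hands_detected(0.0))   # straight road, hands off -> False
print(hands_detected(0.05))  # hand resting lightly, no input -> False (still nags you)
print(hands_detected(0.8))   # in a curve, wheel loaded against your grip -> True

A capacitive sensor in the rim would detect touch directly instead of inferring it from steering effort.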
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Was it widely known that lateral crossing vehicles were a problem for Tesla's auto-pilot system?

I didn't know it, and I've read a lot about the Tesla system.

I wonder if owners are really aware of this?

I would guess not, judging by some of the videos on Youtube.

I would have assumed it would see and brake for a truck crossing its path.

Why would you be watching a DVD and trusting the auto-pilot of a vehicle that can't do anything about vehicles crossing your path? It seems like he must have been unaware of the limitation, or he wouldn't have trusted the system?
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
Was it widely known that lateral crossing vehicles were a problem for Tesla's auto-pilot system?
How's that really different from the video shown here, which praised the system for possibly preventing a crash with a left-turning vehicle:

https://www.youtube.com/watch?v=9X-5fKzmy38

If there was such an issue, Tesla should have mentioned something at that time.

Of course, a relatively slow-moving semi-truck with a trailer that covers the entire lane ahead of you is not what I would call a lateral crossing problem.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Was it widely known that lateral crossing vehicles were a problem for Tesla's auto-pilot system?

I didn't know it, and I've read a lot about the Tesla system.

I wonder if owners are really aware of this?

I would guess not, judging by some of the videos on Youtube.

I would have assumed it would see and brake for a truck crossing its path.

Why would you be watching a DVD and trusting the auto-pilot of a vehicle that can't do anything about vehicles crossing your path? It seems like he must have been unaware of the limitation, or he wouldn't have trusted the system?

There is an awful lot it can't do. That's why it is billed as an assist, not an autonomous car. It also can't handle coming up on completely stopped cars well (via radar, they're going to look exactly like cars parked on the side of the road). It can't handle stop signs. It can't handle traffic lights. It can't handle road debris, etc. It doesn't know how to handle the end of a lane when multiple lanes become one. Shall I go on? The list is enormous since it's an assist and not an autonomous car.

I have one. A rational individual cannot mistake it for an autonomous car, nor is it advertised as one. What there are a lot of, though, are irresponsible media stories that don't take this into account (in regards to reviews and such, not this incident).
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
There is an awful lot it can't do. That's why it is billed as an assist, not an autonomous car. It also can't handle coming up on completely stopped cars well (via radar, they're going to look exactly like cars parked on the side of the road). It can't handle stop signs. It can't handle traffic lights. It can't handle road debris, etc. It doesn't know how to handle the end of a lane when multiple lanes become one. Shall I go on? The list is enormous since it's an assist and not an autonomous car.

I have one. A rational individual cannot mistake it for an autonomous car, nor is it advertised as one. What there are a lot of, though, are irresponsible media stories that don't take this into account (in regards to reviews and such, not this incident).

Well, this driver was supposedly very familiar with the system, and demonstrated it on Youtube often...so I don't understand how that fits in with watching Harry Potter on a portable DVD player while flying down the highway, and not even attempting to brake when a truck crosses your path.

He must not really have known all that much about the system?

http://www.nytimes.com/2016/07/02/b...nthusiast-tested-the-limits-of-his-tesla.html

We have a claim in there, from the driver apparently, that the car swerved to the right, to avoid a rear end collision, rather than just braking. I didn't know it would turn the wheel to avoid other vehicles.

I suspect that this driver tested the system one too many times, and waited to see what it would do when the truck turned? And was shocked into immobility when it didn't do anything?
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Mr. Brown was particularly interested in testing the limits of the Autopilot function, documenting how the vehicle would react in blind spots, going around curves and other more challenging situations.

“This section in here is going to be very, very difficult for the car to handle,” he said in one video, posted in October, as his vehicle rounded a curve. “We’re filming this just so you can see scenarios where the car does not do well.”

Mark Vernon, a high school classmate who recalled tinkering with electronics in shop class together, said that his friend showed off the self-driving feature on a recent visit at Mr. Brown’s home.

“He knew the hill that it would give up on, because it couldn’t see far enough,” Mr. Vernon said. “He knew all the limitations that it would find and he really knew how it was supposed to work.”

From the NYT article about the guy being very experienced with the system.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
From the NYT article about the guy being very experienced with the system.

You realize that this just affirms that the guy darwin awarded himself. He of all people should understand that you never stop monitoring this.

This is the internet though, where millions of people who have read an article about a thing know more than people who have it in their lives and use it on a consistent basis, so by all means, keep on telling me how a rational individual could confuse a Tesla for an autonomous car, and how it is somehow a piece of technology's fault when he died from watching a dvd instead of the road.
 

desura

Diamond Member
Mar 22, 2013
4,627
129
101
Watching a movie is seriously tempting fate though. There are also videos of guys sleeping in their Teslas, though that is in stop-and-go traffic.

I don't know how Tesla could make a fix for this one. That is also the height for road signs, and an overly aggressive algorithm would result in stopping...in the middle of the highway.

The best solution probably is for trucks like that to install crash guards on the side. The thing is, the crash guards also improve fuel efficiency by streamlining the airflow, so they wouldn't really cost anything. But the industry is stubborn as always.
 

agent00f

Lifer
Jun 9, 2016
12,203
1,242
86
You mean the one that every time you enable it reminds you to keep your hands on the wheel and eyes on the road?

The one that fusses at you when it can tell your hands are off (which it can really only do in turns, because it can only tell your hands are on by the extra force it has to exert to turn the wheel)?

The driver had a reasonable expectation that his driver assists would magically become an autonomous car? I can only assume you've only read news articles and never actually used this feature in an actual car.

The car clearly works most of the time when it allows you to take a nap, so it's entirely reasonable for most people who aren't tech experts to assume it'll continue to do so.

Also, off the top of my head, using torque sensing instead of (capacitive) proximity sensing seems like a terrible idea.
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
48,518
5,340
136
A rational individual cannot mistake it for an autonomous car, nor is it advertised as one.

That's the thing though. When I've discussed it with my techie friends, they understand the limitations. When I've discussed it with my non-techie friends, they put "expensive electric car" & the word "Autopilot" together and don't understand why a fancy "self-driving" car didn't see a ginormous semi-truck in the way. Tesla is a victim of a misnomer; they should have called it "Smart Cruise" or something like that.

If you take a step back, selling a car with the word Autopilot as a feature & then having it turn some guy's car into a convertible because it couldn't see a 50+ foot truck trailer is kind of ridiculous.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
You realize that this just affirms that the guy darwin awarded himself. He of all people should understand that you never stop monitoring this.

This is the internet though, where millions of people who have read an article about a thing know more than people who have it in their lives and use it on a consistent basis, so by all means, keep on telling me how a rational individual could confuse a Tesla for an autonomous car, and how it is somehow a piece of technology's fault when he died from watching a dvd instead of the road.

Well, with his experience with and knowledge of the vehicle, if he thought he could watch a DVD instead of drive, what does that say? He had a lot of miles with the system.

What are noobs going to be trying? Sleeping in the back seat?

A Model X may have crashed while in Autopilot mode last week.

http://www.freep.com/story/money/ca...-gallery-owner-survives-tesla-crash/86712884/

A Southfield art gallery owner told police his 2016 Tesla Model X was in Autopilot mode when it crashed and rolled over on the Pennsylvania Turnpike last week.
In his crash report, Vukovich stated that Scaglione's car was traveling east near mile marker 160, about 5 p.m. when it hit a guard rail "off the right side of the roadway. It then crossed over the eastbound lanes and hit the concrete median."
After that, the Tesla Model X rolled onto its roof and came to rest in the middle eastbound lane.

Of course, AP could just be a convenient excuse, but it will be easy to tell if it's true.
 
Last edited: