Self driving car kills a pedestrian


dank69

Lifer
Oct 6, 2009
35,597
29,300
136
Who exactly is "we" and how do "they" program "their" ethics? What is "right" and what is "wrong" and do we as individuals have any say besides "well just don't drive then"?

Explain how a car will be better able to make moral choices in objective reality than you or I. No AI of any magnitude in the real world has become a superior moral being. How does that work, in less than vague terms? What effective philosophy will your car have?

These aren't questions I alone am asking, and no, not by crackpots. This is one facet of a larger concern about AIs that control us.

Can we make improvements? Sure. Is automated driving "evil"? Of course not, but I see a lack of appreciation for the complexities involved, beyond just raw computational speed.

When should your car decide that you die, and why is it morally superior to you in making that decision?

Questions that need to be answered.
They might not be able to make moral choices, but humans aren't going to be making many meaningful moral choices either during the few seconds of a traffic accident. Also, many humans will choose wrong. What does seem a given, though, is that autonomous driving will drastically reduce the number of deaths and injuries overall.
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
The one our laws set forth.

Frankly, it'd be foolish to design a vehicle to purposefully swerve out of control to avoid a person, even a child. You'd be putting everyone else around you at risk and still potentially failing at the original objective. OTOH, I can see future generations of this automation being capable of regaining control of the vehicle in situations that would have been impossible for humans to survive. But I digress....

You think it's wrong to have a cold, calculated, and socially agreed-upon answer to no-win scenarios, in exchange for saving 40,000 lives annually? Let's say you cannot change the no-win scenario, but you CAN save those 40,000 people.

That seems like a cop out, a delegation to no one for a decision that is not permitted to be made. I'm saying that improvement can be made but that retaining control to kill a crowd of people is not much of an improvement.
 

Jaskalas

Lifer
Jun 23, 2004
33,576
7,637
136
What does seem a given, though, is that autonomous driving will drastically reduce the number of deaths and injuries overall.

To go off on a flight of fancy with what we can do with the right technology....

Imagine a scenario where a cyclist cuts across the lane of traffic without warning. Today they die. But tomorrow? Imagine a world where your autonomous vehicle is connected to the internet and receives a collision warning from traffic control, because the cyclist is carrying a cell phone, and that GPS track is moving along a predicted path that crosses your road at the same time you'll be there. Your automated vehicle will slow down and/or stop to avoid the collision. A certain death today becomes avoidable tomorrow.
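The kind of check described above can be sketched in a few lines. This is a deliberately crude illustration, not how any real traffic-control system works: all names are hypothetical, and it assumes straight-line, constant-velocity dead reckoning from the last GPS fixes.

```python
from dataclasses import dataclass

@dataclass
class Track:
    # Position (m) and velocity (m/s) in a shared local coordinate frame,
    # e.g. derived from GPS fixes reported to traffic control.
    x: float
    y: float
    vx: float
    vy: float

    def position_at(self, t: float) -> tuple:
        """Dead-reckoned position t seconds from now (constant velocity)."""
        return (self.x + self.vx * t, self.y + self.vy * t)

def collision_predicted(car: Track, cyclist: Track,
                        horizon_s: float = 10.0,
                        step_s: float = 0.1,
                        radius_m: float = 3.0) -> bool:
    """True if the two predicted paths come within radius_m of each other
    at the same moment within the look-ahead horizon."""
    t = 0.0
    while t <= horizon_s:
        cx, cy = car.position_at(t)
        px, py = cyclist.position_at(t)
        if ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5 <= radius_m:
            return True
        t += step_s
    return False

# Car heading east at 15 m/s; cyclist cutting across from the south at 5 m/s.
# Both reach the point (75, 0) around t = 5 s, so a warning would fire.
car = Track(x=0, y=0, vx=15, vy=0)
cyclist = Track(x=75, y=-25, vx=0, vy=5)
print(collision_predicted(car, cyclist))  # → True
```

A real system would have to cope with GPS noise, uncertain intent, and latency, which is exactly why the poster's scenario needs connected infrastructure rather than a lone vehicle guessing.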
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
Who exactly is "we" and how do "they" program "their" ethics? What is "right" and what is "wrong" and do we as individuals have any say besides "well just don't drive then"?

Explain how a car will be better able to make moral choices in objective reality than you or I. No AI of any magnitude in the real world has become a superior moral being. How does that work, in less than vague terms? What effective philosophy will your car have?

These aren't questions I alone am asking, and no, not by crackpots. This is one facet of a larger concern about AIs that control us.

Can we make improvements? Sure. Is automated driving "evil"? Of course not, but I see a lack of appreciation for the complexities involved, beyond just raw computational speed.

When should your car decide that you die, and why is it morally superior to you in making that decision?

Questions that need to be answered.

Interesting questions, but given my experience on the road, I don't think humans make these snap decisions. In fact, I don't think a lot of human drivers even think about what is happening to them on the road; they do as they please and make everyone else acquiesce to them. If the program was told to make the decision that took the least loss of life, I would expect the AI to react much better than a human, who is likely to choose their own life over others'.
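The "least loss of life" rule described above amounts to a plain minimization over candidate maneuvers. A minimal sketch, with hypothetical maneuver names and casualty estimates (in reality these would come from perception and physics models, not fixed numbers):

```python
def choose_maneuver(options):
    """Pick the maneuver whose estimated casualty count is lowest.

    `options` maps a maneuver name to an estimated number of lives lost.
    Ties go to whichever key min() encounters first.
    """
    return min(options, key=options.get)

# Hypothetical no-win scenario: brakes have failed approaching a crowd.
options = {
    "continue_straight": 4,   # estimated casualties in the crowd
    "swerve_into_wall": 1,    # occupant at risk
    "swerve_left_lane": 2,    # oncoming traffic
}
print(choose_maneuver(options))  # → swerve_into_wall
```

The code is trivial; the hard part, as the thread keeps circling back to, is who gets to write the numbers and whether "minimize total casualties" is even the agreed-upon objective.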
 
Reactions: Jaskalas

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
They might not be able to make moral choices, but humans aren't going to be making many meaningful moral choices either during the few seconds of a traffic accident. Also, many humans will choose wrong. What does seem a given, though, is that autonomous driving will drastically reduce the number of deaths and injuries overall.

Humans will not make many purposeful choices that surface to awareness, yet the mind isn't that simplistic. A part fails, a crowd looms, and into a wall the car goes. That was a response, a choice: not mulled over, but made at some level. It seems the answer is to avoid the question.
 

Jaskalas

Lifer
Jun 23, 2004
33,576
7,637
136
Humans will not make many purposeful choices that surface to awareness, yet the mind isn't that simplistic. A part fails, a crowd looms, and into a wall the car goes. That was a response, a choice: not mulled over, but made at some level. It seems the answer is to avoid the question.

Tell me, what is the correct choice in your scenario. The wall, or the crowd?
Whatever your answer, we can make that the law, and the automation can follow that instruction, always. A human won't.

Bonus round! If a person doesn't have to drive, then they can be sealed away behind a much stronger padded barrier. Maybe they'll survive impacts that would otherwise kill them in a traditional driver's seat.
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
Interesting questions, but given my experience on the road, I don't think humans make these snap decisions. In fact, I don't think a lot of human drivers even think about what is happening to them on the road; they do as they please and make everyone else acquiesce to them. If the program was told to make the decision that took the least loss of life, I would expect the AI to react much better than a human, who is likely to choose their own life over others'.

The AI has no self-awareness, but it must make choices. I don't think I'm unique, but I have had a situation arise where I risked my life to avoid killing a pedestrian. Again, I didn't ponder the situation, but I, as an entity, did just that. No one else has? I have a hard time believing that.
 
Reactions: Wolverine607

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
Tell me, what is the correct choice in your scenario. The wall, or the crowd?

Past experience has shown the wall to be my choice. They don't die well and in retrospect, I am glad I took the risk. I can't make that decision for you. Should your car?
 
Reactions: Wolverine607

pmv

Lifer
May 30, 2008
13,277
8,201
136
Yes, that's correct, and generally where the idea of self driving cars is headed.

Yeah, but the last part of that transition is a huge leap. There's a qualitative difference between full autonomy and some sort of driver assistance. And, as with automation of aircraft, that transition zone is quite a dangerous one.
 
Reactions: Wolverine607

pmv

Lifer
May 30, 2008
13,277
8,201
136
And that will come, probably sooner than later. Technology like this advances at an exponential rate. It will be only a few years before autonomous cars are considerably better drivers than even the best-trained human drivers. They have a lot of advantages over humans, after all: better sensors, 360° awareness, no distractions, faster decision making, faster reactions.




The rule of no jaywalking is a logical conclusion based on the fact that a heavy vehicle traveling at speed is very limited in its ability to stop for or avoid a pedestrian who can enter the road at any point with no warning.

If you exclude the alternative, that the heavy vehicle be expected to travel at a lower speed in areas with pedestrians about. Or, indeed, that it avoid those areas entirely.
 

pmv

Lifer
May 30, 2008
13,277
8,201
136
Interesting questions, but given my experience on the road, I don't think humans make these snap decisions. In fact, I don't think a lot of human drivers even think about what is happening to them on the road; they do as they please and make everyone else acquiesce to them. If the program was told to make the decision that took the least loss of life, I would expect the AI to react much better than a human, who is likely to choose their own life over others'.


But who is going to be buying those cars, and whose life would they wish to choose? Don't market forces have anything to do with this?

I mean, look at the VW emissions scandal. The cars' computers were programmed to put the interests of the car buyer, in getting good performance and lower fuel use, ahead of those of anyone outside the car who might want not to have to breathe in toxic air. Why would driving computers be programmed with a different set of priorities in mind, if they have to meet similar market forces?

(Clearly it depends critically on how the law addresses it...but I'm not filled with confidence on that score).
 

ch33zw1z

Lifer
Nov 4, 2004
37,995
18,344
146
Yeah, but the last part of that transition is a huge leap. There's a qualitative difference between full autonomy and some sort of driver assistance. And, as with automation of aircraft, that transition zone is quite a dangerous one.
In my stated scenario, having little driver intervention is enough.

What's the point of self driving cars if not to free the driver from driving?

This is what the goal is.

Comparing cars to aircraft has some similarities, but I'm not sure we can really do a side by side in many aspects.

I agree, it's going to be challenging, life-changing, civilization-changing stuff...
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
The AI has no self-awareness, but it must make choices. I don't think I'm unique, but I have had a situation arise where I risked my life to avoid killing a pedestrian. Again, I didn't ponder the situation, but I, as an entity, did just that. No one else has? I have a hard time believing that.

There is a percentage of the population that would act like that; there's also a percentage who drive with disregard for any life other than their own, and in the worst cases, complete disregard for all life, including their own. I'm very familiar with the latter two: I live 2 miles from the university campus here in San Antonio and I run into maybe 10-15 of these drivers every day on the drive home.

Your question is a bit of an ambiguous one: there isn't a right or wrong decision to make when loss of life is guaranteed. There are only options and consequences, and as humans we decide which consequence is more palatable. From a neutral perspective, minimizing the loss of life would be the most ethical; from a highly subjective perspective, I would choose my life and my family's lives over a random pedestrian's. Is either better than the other? That's going to depend on which party is on the end of either set of consequences.
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
There is a percentage of the population that would act like that; there's also a percentage who drive with disregard for any life other than their own, and in the worst cases, complete disregard for all life, including their own. I'm very familiar with the latter two: I live 2 miles from the university campus here in San Antonio and I run into maybe 10-15 of these drivers every day on the drive home.

Your question is a bit of an ambiguous one: there isn't a right or wrong decision to make when loss of life is guaranteed. There are only options and consequences, and as humans we decide which consequence is more palatable. From a neutral perspective, minimizing the loss of life would be the most ethical; from a highly subjective perspective, I would choose my life and my family's lives over a random pedestrian's. Is either better than the other? That's going to depend on which party is on the end of either set of consequences.

There's a song with the lyric "if you choose not to decide, you still have made a choice," and I submit that in this context it applies. There may be no right or wrong action, or all actions may produce results that may be viewed as right AND wrong, but there is a matter of ownership for actions taken or not taken. You are responsible for your actions because you made them. You were there and you acted. What about when no one is there, in the sense of an action? If you, for whatever reason, participated in an event where life is lost, our criminal justice system provides a process which is our best approximation of justice. If not criminal, there may be monetary compensation for harm inflicted.

So let's say the car harms someone. If the occupant has no control, he bears no responsibility, any more than any non-driving passenger does. Who then, the car? What justice can be had from a machine? The programmers? They designed an imperfect system, so perhaps, but then they had no ability to decide at the moment of impact. The board? I think that unlikely. Stockholders?

Who is ultimately accountable in a material way?

I think you might find that paper I linked to interesting. There's much that goes beyond getting from point A to B and I see no wisdom in pretending otherwise.
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
So let's say the car harms someone. If the occupant has no control, he bears no responsibility, any more than any non-driving passenger does. Who then, the car? What justice can be had from a machine? The programmers? They designed an imperfect system, so perhaps, but then they had no ability to decide at the moment of impact. The board? I think that unlikely. Stockholders?

Who is ultimately accountable in a material way?

I think you might find that paper I linked to interesting. There's much that goes beyond getting from point A to B and I see no wisdom in pretending otherwise.

Accountability remains a serious concern for all forms of automation. It's a far more complex issue than safety, and has implications that spread beyond just the issue at hand. I don't doubt that a GOP-controlled Congress and a conservative SCOTUS would take action to protect corporate interests and shield them from legal repercussions.

The long-term solution is of course to be rid of human-driven cars completely and switch to full automation. That would allow the algorithms in these cars to operate at peak efficiency and, hopefully, ultimately be orders of magnitude safer than human drivers. What to do during the transition is the really difficult question to answer, and it's a question our leaders need to start asking themselves now, not later. The transition to automation needs to be led by those with the public interest in mind, before financial power is allowed to flex its muscles and control the dialogue.
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
Accountability remains a serious concern for all forms of automation. It's a far more complex issue than safety, and has implications that spread beyond just the issue at hand. I don't doubt that a GOP-controlled Congress and a conservative SCOTUS would take action to protect corporate interests and shield them from legal repercussions.

The long-term solution is of course to be rid of human-driven cars completely and switch to full automation. That would allow the algorithms in these cars to operate at peak efficiency and, hopefully, ultimately be orders of magnitude safer than human drivers. What to do during the transition is the really difficult question to answer, and it's a question our leaders need to start asking themselves now, not later. The transition to automation needs to be led by those with the public interest in mind, before financial power is allowed to flex its muscles and control the dialogue.

Did you happen to read the paper I linked to? Might be worth the effort to help understand the topic in more terms than automation.
 

Ichinisan

Lifer
Oct 9, 2002
28,298
1,234
136
It is inevitable that, when the tipping point is reached, roads and pedestrian walkways will be rapidly reconfigured. Pedestrians will be able to cross above or underneath intersections without any impact to traffic. They'll even walk diagonally across intersections where they currently cannot. Win for pedestrians and riders alike.

Safety++
Convenience++
Efficiency++
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
How does a human decide whom to kill when there is no alternative? What does that question really have to do with self-driving cars?

So the car will run into a wall first?

Yeah that has everything to do with the discussion.
 

Ichinisan

Lifer
Oct 9, 2002
28,298
1,234
136
So the car will run into a wall first?

Yeah that has everything to do with the discussion.
It will decide based on a human's programming.

Done.

Soon after the tipping point, the number of times when such a decision will need to be made drops to 0.

You'll have mostly-automated construction of a transportation network that consists of interconnected underground grids.
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
It will decide based on a human's programming.

Done.

Soon after the tipping point, the number of times when such a decision will need to be made drops to 0.

You'll have mostly-automated construction of a transportation network that consists of interconnected underground grids.

You have begged the question. You say that there won't be problems and then reference yourself by saying there won't be problems.

You are implying infallibility, and minimizing the effects on real people. I want to meet this perfect programmer with the infallibility of God.
 

Ichinisan

Lifer
Oct 9, 2002
28,298
1,234
136
You have begged the question. You say that there won't be problems and then reference yourself by saying there won't be problems.

You are implying infallibility, and minimizing the effects on real people. I want to meet this perfect programmer with the infallibility of God.
Not infallible. Inevitable.

It's an annoying discussion. We know there will be those situations. There absolutely will be. Engineers will do their best to account for it.

Progress will happen regardless. That dilemma has been discussed to death and it's tiresome.

Soon after the tipping point and the rapid transformation of our transportation infrastructure, there will be virtually no pedestrian / vehicle interaction during a trip. It will be comparable to subway travel in that regard.

Automation will happen. Safety, efficiency, and convenience will be significantly improved for riders and for pedestrians.
 

Hayabusa Rider

Admin Emeritus & Elite Member
Jan 26, 2000
50,879
4,266
126
Not infallible. Inevitable.

It's an annoying discussion. We know there will be those situations. There absolutely will be. Engineers will do their best to account for it.

Progress will happen regardless. That dilemma has been discussed to death and it's tiresome.

Soon after the tipping point and the rapid transformation of our transportation infrastructure, there will be virtually no pedestrian / vehicle interaction during a trip. It will be comparable to subway travel in that regard.

Automation will happen. Safety, efficiency, and convenience will be significantly improved for riders and for pedestrians.

Well, I hope these programmers have deep pockets, because if something fails and a loved one of mine is injured, I will pursue them and their companies as vigorously as a drunk behind the wheel.

Things are so much simpler when all else is discarded and "well, it's going to happen whether you like it or not" becomes a replacement for serious concerns.

Where is my flying car again?
 
Reactions: Wolverine607