
The Trolley Problem - Mercedes has its own answer regarding self-driving cars

I agree with this decision.

 

If I buy a car with my own money, then it had better damn well put me, the owner, first.

 

Unfortunately, I don't put much weight on lives that don't concern me directly. If a situation like this ever happens (I doubt it; it's too unlikely, and computers are getting faster by the second), my words would resonate with you if you were the driver.

 

Nobody I've met has been heroic enough to willingly offer their life for three others, yet.


Just now, spwath said:

The truck isn't a self-driving car. He ran a red light.

 

Or a big boulder falls off a cliff in front of the car.

 

Or anything else where the car would crash, killing its occupants. These situations would still arise every now and then.

Too hypothetical. Until we get real-world incidents, we won't know how to deal with it.

 

 

 


1 minute ago, Arty said:

Too hypothetical. Until we get real-world incidents, we won't know how to deal with it.

But these are incidents the car will be programmed for. If this happened, what would the car do?

n0ah1897, on 05 Mar 2014 - 2:08 PM, said: "Computers are like girls. It's what's on the inside that matters. I don't know about you, but I like my girls like I like my cases. Just as beautiful on the inside as the outside."


Just now, spwath said:

But these are incidents the car will be programmed for. If this happened, what would the car do?

In this case, there would probably be enough time to stop.

Well, instead of making up data, we can just look back at previous accident history, work from that, and teach it off those incidents.
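
To make that concrete, here's a toy sketch of the "teach it off past incidents" idea. Everything here is invented for illustration (the features, the records, even the choice of scikit-learn); it's a sketch of the approach, not anyone's actual pipeline:

# Toy sketch: fit a classifier on (invented) historical accident
# records so the system learns which situations preceded crashes.
from sklearn.tree import DecisionTreeClassifier

# Each record: [speed_kmh, visibility_m, distance_to_obstacle_m]
past_incidents = [
    [60, 200, 10],  # ended in a crash
    [30, 300, 50],  # no crash
    [90, 100, 20],  # ended in a crash
    [40, 250, 80],  # no crash
]
outcomes = [1, 0, 1, 0]  # 1 = collision occurred

model = DecisionTreeClassifier().fit(past_incidents, outcomes)

# Ask the model about a new situation the car is facing.
print(model.predict([[70, 150, 15]]))  # e.g. [1] -> brake early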

 

 

 


A similar question was mentioned by James May during a cooking show with Richard Hammond.

Dinner on the front seat, you're driving, and a bloke suddenly appears in front of you.

Do you save the bloke or save your dinner? The answer is to always save

Spoiler

Your dinner

 

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


Every time this analogy comes up I want to maximize the number of people who die, so it really doesn't work for me.

-------

Current Rig

-------


1 hour ago, zMeul said:

So, explain to me why, in the case of the Tesla that went under a trailer and killed the owner, Tesla wasn't held responsible for the AI killing the "passenger"?

 

1 hour ago, Bensemus said:

Right now drivers are still responsible, as the cars aren't fully autonomous; they still need oversight. So right now it would be the driver's fault, as they allowed the car to get into that situation.

Already answered that a few posts ago. Tesla also has a clause stating that usage of their beta Autopilot feature still requires human oversight. The guy who died had accepted that and failed to act the way he was required to.

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7GHz MOBO: ASUS ROG Maximus VII Hero  GPU: Asus GTX 780ti Directcu ii SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250 GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM and a WD Black SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad air 128GB, iPod Touch 32GB 3rd Gen and an iPod nano 4GB 3rd Gen. Both the touch and nano are working perfectly as far as I can tell :)

2 minutes ago, Bensemus said:

Already answered that a few posts ago. Tesla also has a clause stating that usage of their beta Autopilot feature still requires human oversight. The guy who died had accepted that and failed to act the way he was required to.

A clause?! For a tech that can and will kill people... please, do tell xD

the "law" didn't went after Tesla because the law regarding autonomous vehicles on public roads doesn't exist


Just now, zMeul said:

A clause?! For a tech that can and will kill people... please, do tell xD

the "law" didn't went after Tesla because the law regarding autonomous vehicles on public roads doesn't exist

Tesla explicitly says the car can't handle everything and will need human oversight. Had he been paying attention, as he knew he should have been, he would have seen the truck the car couldn't.

 

I'm aware that no laws have been passed specifically for autonomous vehicles. We are talking about how we think those laws should work. You are also ignoring the first bit. No self-driving car for sale right now is fully autonomous. They all still require human oversight, so humans are still responsible for what their car does. This applies to Tesla. Once the cars are our equals on the road, they will have to be responsible for their actions. Why would you accept responsibility for injuring or killing someone when you had zero control over the situation? In an autonomous car you are a passenger. The driver, who is responsible for the car, is the code written by the manufacturer.

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7GHz MOBO: ASUS ROG Maximus VII Hero  GPU: Asus GTX 780ti Directcu ii SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250 GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM and a WD Black SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad air 128GB, iPod Touch 32GB 3rd Gen and an iPod nano 4GB 3rd Gen. Both the touch and nano are working perfectly as far as I can tell :)

1 minute ago, Bensemus said:

Tesla explicitly says the car can't handle everything and will need human oversight. Had he been paying attention, as he knew he should have been, he would have seen the truck the car couldn't.

 

I'm aware that no laws have been passed specifically for autonomous vehicles. We are talking about how we think those laws should work. You are also ignoring the first bit. No self-driving car for sale right now is fully autonomous. They all still require human oversight, so humans are still responsible for what their car does. This applies to Tesla. Once the cars are our equals on the road, they will have to be responsible for their actions. Why would you accept responsibility for injuring or killing someone when you had zero control over the situation? In an autonomous car you are a passenger. The driver, who is responsible for the car, is the code written by the manufacturer.

Yeah, if you ask me, no car will ever be 100% autonomous because of this very issue.

Autonomous cars do not belong on the same roads as pedestrians and cars driven by people - ever.


8 hours ago, samcool55 said:

Is it just me, or am I the only one who doesn't think the trolley problem is an issue?

I mean, saving 5 and killing 1 is better than killing 5 and saving 1, no?

I agree, but some people tend to explain it as something something fate something something and don't want to get involved. Apparently being a bystander is easy, but getting involved to achieve the greater good is not.
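
The funny thing is that the purely utilitarian rule itself is trivial to write down; the hard part is everything around it (predicting outcomes, assigning blame). A minimal sketch, with all names and casualty counts made up:

from dataclasses import dataclass

@dataclass
class Maneuver:
    # A hypothetical candidate action and its predicted casualties.
    name: str
    expected_casualties: int

def choose_maneuver(options):
    # The purely utilitarian rule: pick whichever action is
    # predicted to kill or injure the fewest people.
    return min(options, key=lambda m: m.expected_casualties)

# The classic trolley setup: stay on course (5) or swerve (1).
options = [Maneuver("stay", 5), Maneuver("swerve", 1)]
print(choose_maneuver(options).name)  # -> swerve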

 

4 hours ago, Arty said:

Too hypothetical. Until we get real-world incidents, we won't know how to deal with it.

Welcome to philosophy. Enjoy your stay :P


11 hours ago, ThinkWithPortals said:

I would have thought the best option would be the one that would result in the fewest casualties, so it should steer into the one person. It's the best option, from a mathematical and logical viewpoint.

By doing this and intervening, you have murdered the one person. If you do nothing, you have not committed murder.

 

When it comes to self-driving cars, I think the ultimate decision will be made by insurance companies. They will analyse the life insurance payouts for each person and kill the least expensive.

             ☼

ψ ︿_____︿_ψ_   


12 hours ago, Misanthrope said:

Every time this analogy comes up I want to maximize the number of people who die, so it really doesn't work for me.

[image]


17 hours ago, zMeul said:

Yeah, if you ask me, no car will ever be 100% autonomous because of this very issue.

Autonomous cars do not belong on the same roads as pedestrians and cars driven by people - ever.

Why? Even in their current state, all the self-driving programs have proven to be safer than humans. As more cars become autonomous it will be even safer, as they are predictable while humans really aren't. They can't have any of the bad habits humans do, which kill millions of people every year. Humans account for ~90% of car crashes. Why wouldn't we work to remove the most dangerous part of the car? Only Google is testing their cars in the city, but they have yet to hit any pedestrians or cyclists and have only been at fault for one minor collision, where the human driver also admitted to thinking the car was fine. Had both vehicles been autonomous, they could potentially have communicated or predicted better. As a cyclist I would be more comfortable sharing the road with autonomous cars, as they are always paying attention. As a driver I am always nervous around cyclists, as a one-second lapse could cause me to hit them, especially when turning right. An autonomous car can look forwards and backwards at the same time; I can't.

 

Edited: Added some stuff about cyclists.

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7GHz MOBO: ASUS ROG Maximus VII Hero  GPU: Asus GTX 780ti Directcu ii SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250 GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM and a WD Black SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad air 128GB, iPod Touch 32GB 3rd Gen and an iPod nano 4GB 3rd Gen. Both the touch and nano are working perfectly as far as I can tell :)

8 minutes ago, Bensemus said:

Why? Even in their current state, all the self-driving programs have proven to be safer than humans.

Except when they kill people, or should we ignore that fact?!

 

People are erratic; machines are pre-programmed and will act according to their programming. When the random human element interferes, chaos follows and people can die.

And we're back at the trolley problem and Asimov's Laws (more commonly known as the Three Laws of Robotics).


1 minute ago, zMeul said:

Except when they kill people, or should we ignore that fact?!

..... Really?

 

Do we ignore it when other tools kill people? No. We improve the tool. 

If the car is at fault, then it has a problem that needs to be fixed, the same way a human driver at fault needs to fix what they did. Sure, you can't punish the car, but punishing a human doesn't undo what they did, so you aren't losing anything. Insurance covers the damage like normal. The difference is that the car can be improved, and thanks to fleet learning, every car in the fleet learns from that one's mistake. This doesn't really happen with humans.
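
Roughly, fleet learning works like this (a toy sketch, not Tesla's actual pipeline; every name here is hypothetical):

class FleetModel:
    # Toy shared driving model: a set of hazard situations it knows about.
    def __init__(self):
        self.known_hazards = set()
        self.version = 0

    def incorporate_incident(self, hazard):
        # One car's mistake becomes training data for the whole fleet.
        self.known_hazards.add(hazard)
        self.version += 1

class Car:
    def __init__(self, fleet):
        self.fleet = fleet
        self.local_hazards = set()

    def report_incident(self, hazard):
        self.fleet.incorporate_incident(hazard)

    def pull_update(self):
        # Every other car syncs the improved model, so none
        # of them repeats the first car's mistake.
        self.local_hazards = set(self.fleet.known_hazards)

fleet = FleetModel()
car_a, car_b = Car(fleet), Car(fleet)
car_a.report_incident("white trailer against bright sky")
car_b.pull_update()
print(car_b.local_hazards)  # car B now knows about car A's failure case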

 

6 minutes ago, zMeul said:

random human element interferes, chaos follows and people can die

and we're back at the trolley problem

The trolley problem is a poor theoretical problem. When would it realistically come up? If the car was following all the rules and still ended up in that situation, the humans who are injured are to blame, so the car should protect its passenger, as the passenger did nothing wrong. If the car itself caused the situation, it should still protect its passenger, as again the passenger did nothing wrong. The car causes the injuries, insurance covers what it can, and the manufacturer bears the legal responsibility, as they are the ones responsible.

 

Because computers have much better reaction times, they are much less likely to be caught off guard. Of course, just because they can react faster doesn't mean there is a course of action available to them to avoid an accident. They are also hard-coded to follow the rules of the road, so it should be basically impossible for them to end up in a situation where they are at fault.

 

So far that has pretty much been true. The Tesla crash was caused by a truck cutting off the car, and 13 of the 14 accidents Google's cars have been in were the fault of the human drivers around them.

 

And before you say Google's cars have caused a crash, I remind you that they aren't fully autonomous yet. They still require oversight.

 

I don't understand why it has to be 100% or nothing with you and other people against autonomous cars. It reminds me of anti-vaxxers, if vaccines actually did have a chance of causing autism. They would rather risk their kid's life, and potentially others', over a chance that their kid may get autism, a very manageable, potentially negligible, condition.

 

You seem like you would rather keep humans, who have proven over and over again that we are really not that great at driving, than replace us with a computer that, while not perfect, will, based on current stats, drastically reduce car-related fatalities.

 

Even if we assume the car was at fault and caused some deaths, overall the autonomous cars together will have saved lives.

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7GHz MOBO: ASUS ROG Maximus VII Hero  GPU: Asus GTX 780ti Directcu ii SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250 GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM and a WD Black SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad air 128GB, iPod Touch 32GB 3rd Gen and an iPod nano 4GB 3rd Gen. Both the touch and nano are working perfectly as far as I can tell :)

5 minutes ago, Bensemus said:

..... Really?

really

 

And who's blaming the car?!

The ones to be held responsible are the people building and programming them.

AI is only as good as the people who program it, and since humans aren't gods and can't think of every possible scenario with all its variables, the AI can't efficiently assess the situation.

 


23 hours ago, zMeul said:

The Trolley Problem can be presented in many different ways.

For example: the car is heading into a group of kids with no time to stop. What do you do? Steer into an oncoming rig and most likely die but save the kids, or do nothing and most likely kill all the kids?

Kill the kids and crash into the rig.

 

If you're going to create a problem, CREATE A REAL ONE AT THAT!


These ethical questions about autonomous cars and who is at fault are inherently flawed. If a kid jumped into the road ahead of an autonomous car, the car would stop itself with auto braking, and it would tell the car behind it that it was braking so that it wouldn't get rear-ended. If the car didn't have enough time to stop to save the kid, well, that's the kid's fault for running into the road. These cars can't defy the laws of physics and stop instantly, as some people seem to expect of a "100% perfect" system; people still can't run out in front of them. Any time a problem like this arises, the autonomous car couldn't have done anything more to save the person, at which point it's not the car's fault.
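
The physics here is easy to check with a back-of-the-envelope stopping-distance calculation. The reaction times and deceleration below are rough assumptions for illustration, not measured values:

def stopping_distance(speed_kmh, reaction_s, decel_ms2=8.0):
    # Reaction distance plus braking distance v^2 / (2a).
    # A deceleration of ~8 m/s^2 is roughly a hard stop on
    # dry asphalt (an assumption, not a spec).
    v = speed_kmh / 3.6  # km/h -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

# Assumed reaction times: ~1.5 s for a human, ~0.1 s for a computer.
print(f"human:    {stopping_distance(50, 1.5):.1f} m")  # ~32.9 m
print(f"computer: {stopping_distance(50, 0.1):.1f} m")  # ~13.4 m
# Even a perfect computer driver can't save a kid who steps
# out 5 m ahead of a car doing 50 km/h.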

 

If you replace the trolley with an autonomous car deciding to kill 5 or 1, either way it's not the car's fault. The fault lies with the person who tied those people down to the tracks, or with the people who thought it would be a good idea to sleep on the tracks. Kid runs in front of car? Kid's fault. Boulder falls from a cliff and doesn't give the car time to react? There's nothing more the car could have done to prevent death, so it's nobody's fault.

 

If the Mercedes has to choose between hitting a pedestrian or swerving and killing the passenger - hit the pedestrian, because they are the one who did something wrong in that situation.

 

Of course these cars will never be perfect, and accidents like the Tesla death will occur. In such a case, the car would have gone haywire and would be barrelling towards a pedestrian, at which point we can only sit and watch what happens. Since we have lost control of the car at that point, why debate what the car should ethically do if we aren't even sure it would respond in the ethically correct direction? The point here is that these cars will be equal to or better than humans.

._.

AMD A8-5500•••ASUS GTX 660•••MSI MS-7778 Mobo•••2x4GB DDR3 RAM•••CoolerMaster v550 Semi-Modular PSU•••1.5TB HDD•256GB Samsung 850 Evo SSD•••Optical Drive•••Phanteks Enthoo Pro


While the moral standpoint would be to pull the lever, there are two key points that need to be made.

 

Instead of a lever that diverts the trolley to kill one person on another set of tracks, a better analogy would be a lever that kills the person operating it.

 

This raises the question: who is the car looking out for, the bystanders or the passenger? While it pains me to say this, a car's responsibility is to take care of its passengers and prioritize their lives over bystanders'.
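
In code, that occupant-first policy is just a different ordering of priorities from the utilitarian one; a purely illustrative sketch, not Mercedes' actual logic:

def choose_occupant_first(options):
    # Occupant-first policy: discard any maneuver predicted to harm
    # the passengers, then minimize harm to bystanders among the rest.
    safe_for_occupants = [o for o in options if o["occupant_harm"] == 0]
    candidates = safe_for_occupants or options  # fall back if nothing is safe
    return min(candidates, key=lambda o: o["bystander_harm"])

options = [
    {"name": "swerve into barrier", "occupant_harm": 1, "bystander_harm": 0},
    {"name": "brake and stay",      "occupant_harm": 0, "bystander_harm": 1},
]
print(choose_occupant_first(options)["name"])  # -> brake and stay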


On 14/10/2016 at 11:34 AM, zMeul said:

Yeah, if you ask me, no car will ever be 100% autonomous because of this very issue.

Autonomous cars do not belong on the same roads as pedestrians and cars driven by people - ever.

Just like no computer will ever need more than 640K of RAM...

 

I guarantee that ALL cars on highways will, by law, eventually be autonomous for safety reasons. 


For those worried about the safety of autonomous cars:

 

"When used correctly," he wrote, Tesla's Autopilot software "is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability."

 

It doesn't have to be perfect; it just has to be better than a human, which it already is.


1 hour ago, Grinners said:

For those worried about the safety of autonomous cars:

 

"When used correctly," he wrote, Tesla's Autopilot software "is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability."

That's why it killed the owner?

Tesla's AI failed to identify a fucking trailer and drove right under it.


2 hours ago, zMeul said:

That's why it killed the owner?

Tesla's AI failed to identify a fucking trailer and drove right under it.

I don't even understand what you are saying or how you came to that conclusion from my post. 

 

If you refer to the part of my post that you cut out, it might help you understand the point.

 

"It doesn't have to be perfect; it just has to be better than a human, which it already is."


20 minutes ago, Grinners said:

I don't even understand what you are saying or how you came to that conclusion from my post. 

 

If you refer to the part of my post that you cut out, it might help you understand the point.

 

"It doesn't have to be perfect; it just has to be better than a human, which it already is."

Humans can distinguish between an empty road and a fucking trailer - Tesla's AI didn't.

