Uber self-driving car ends up sideways - collision incident

1 hour ago, zMeul said:

I was sure this would pop up

the crash being the other driver's fault doesn't mean the AI driving is safe

 

in traffic, when you see some idiot pushing it, you let him go ahead and distance yourself from him

since self-driving cars are not self-aware, they only follow a set of rules that assumes everyone else is following the same rules .. you know, the law

if that AI car had yielded and let the idiot get where he wanted to go, would the accident have been avoided? very good possibility it would have been

The AI vehicle was going straight along the street and the driver didn't yield as he should have. This happens a lot in the Tampa/Orlando area and accidents happen way too often. Just on my way home from Global Pet Expo, there were 6 accidents on I-75 and nearby streets with two of them being fatal.

Cor Caeruleus Reborn v6

Spoiler

CPU: Intel - Core i7-8700K

CPU Cooler: be quiet! - PURE ROCK 
Thermal Compound: Arctic Silver - 5 High-Density Polysynthetic Silver 3.5g Thermal Paste 
Motherboard: ASRock Z370 Extreme4
Memory: G.Skill TridentZ RGB 2x8GB 3200/14
Storage: Samsung - 850 EVO-Series 500GB 2.5" Solid State Drive 
Storage: Samsung - 960 EVO 500GB M.2-2280 Solid State Drive
Storage: Western Digital - Blue 2TB 3.5" 5400RPM Internal Hard Drive
Storage: Western Digital - BLACK SERIES 3TB 3.5" 7200RPM Internal Hard Drive
Video Card: EVGA - 970 SSC ACX (1080 is in RMA)
Case: Fractal Design - Define R5 w/Window (Black) ATX Mid Tower Case
Power Supply: EVGA - SuperNOVA P2 750W with CableMod blue/black Pro Series
Optical Drive: LG - WH16NS40 Blu-Ray/DVD/CD Writer 
Operating System: Microsoft - Windows 10 Pro OEM 64-bit and Linux Mint Serena
Keyboard: Logitech - G910 Orion Spectrum RGB Wired Gaming Keyboard
Mouse: Logitech - G502 Wired Optical Mouse
Headphones: Logitech - G430 7.1 Channel  Headset
Speakers: Logitech - Z506 155W 5.1ch Speakers

 


4 hours ago, Monkey Dust said:

It's not that safe, a truly safe car would still be the right way up.

 

Looking at the picture I'm not sure how it ended up like that? Was it the other car in the picture that failed to yield? Or was it just collected as collateral damage?

That's the dumbest thing I've heard. If a car is able to keep someone safe, then it is a safe car; it doesn't matter if it tipped over. Just because a car is safe doesn't mean it can defy the laws of physics. How do you suppose they'd make an untippable car? And wouldn't the increased deceleration the car would experience by not tipping over put the driver at more risk? It's easy to say a car is unsafe when you are basing your assessment on a single fact.


1 hour ago, zMeul said:

I was sure this would pop up

the crash being the other driver's fault doesn't mean the AI driving is safe

 

in traffic, when you see some idiot pushing it, you let him go ahead and distance yourself from him

since self-driving cars are not self-aware, they only follow a set of rules that assumes everyone else is following the same rules .. you know, the law

if that AI car had yielded and let the idiot get where he wanted to go, would the accident have been avoided? very good possibility it would have been

I have to disagree. I suppose you haven't seen the self-learning technology Nvidia has shown off? They are teaching the cars to react to other unsafe drivers. If someone is driving like a dick, the car will take note of it and maneuver to put itself in the safest position.
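The "take note of unsafe drivers" idea can be sketched as a toy scoring heuristic. This is purely illustrative; the function names, features, and thresholds here are my own assumptions, not Nvidia's actual system:

```python
def erratic_score(speeds_mps, lane_offsets_m):
    """Combine speed variance and lane-keeping variance into one score."""
    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)
    # Lane weaving is weighted more heavily than speed changes (arbitrary weight).
    return variance(speeds_mps) + 10.0 * variance(lane_offsets_m)

def extra_gap_m(score, threshold=2.0, metres_per_point=5.0):
    """Give more erratic neighbours a bigger safety buffer."""
    return metres_per_point * max(0.0, score - threshold)

# A steady driver vs. one surging and weaving:
steady = erratic_score([27.0, 27.1, 26.9, 27.0], [0.0, 0.05, -0.05, 0.0])
weaving = erratic_score([24.0, 30.0, 22.0, 31.0], [0.5, -0.6, 0.7, -0.5])
```

The weaving driver scores far higher, so the planner would keep extra distance from them, which is exactly the "distance yourself from the idiot" behaviour being discussed.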

Wishing leads to ambition and ambition leads to motivation and motivation leads to me building an illegal rocket ship in my backyard.

 


4 hours ago, divito said:

And while I had my suspicions, after this, I'm convinced zMeul is a troll of the highest order.

yeah sure, based on what?!


2 hours ago, christianled59 said:

I have to disagree. I suppose you haven't seen the self-learning technology Nvidia has shown off? They are teaching the cars to react to other unsafe drivers. If someone is driving like a dick, the car will take note of it and maneuver to put itself in the safest position.

ok, so then why did this incident happen!?


5 hours ago, zMeul said:

in traffic, when you see some idiot pushing it, you let him go ahead and distance yourself from him

since self-driving cars are not self-aware, they only follow a set of rules that assumes everyone else is following the same rules .. you know, the law

if that AI car had yielded and let the idiot get where he wanted to go, would the accident have been avoided? very good possibility it would have been

I find it adorable that you believe you live in a universe where a human driver has never failed to take possible evasive action in response to another human's actions on the road... Cause, like, it happens all the time. If we built a bot to scrape every news article about car crashes where one careless driver wiped out another driver who, for whatever reason, failed or was unable to take evasive action, it would crash this forum ten times over.

 

I mean, my god, there are literally cases where people have crashed their vehicles to avoid squirrels.

 

http://www.ajc.com/news/crime--law/decatur-driver-swerves-avoid-squirrel-hurts-kids/ag19aUmkd4rF1Qzow3LgxL/

 

http://www.ctvnews.ca/canada/b-c-driver-hits-brakes-for-squirrel-causes-four-vehicle-crash-1.2390753

 

http://www.cbs8.com/story/22601869/driver-swerves-to-avoid-squirrel-crashes-into-bay

 

https://www.westplainsdailyquill.net/news/local/article_a7626f9a-5672-11e5-87a1-2bd0d2d6864e.html

 

See, I dunno what universe YOU live in, but I live in a universe where seemingly normal human beings, sometimes transporting their own offspring in their vehicles, crash their vehicle, bringing harm to themselves and others, because they were afraid they'd kill a squirrel.


1 minute ago, AshleyAshes said:

-

how off the rails are you? this much:

humans aren't perfect and that's the problem

the other problem is that AI cannot think for itself and only follows programming done by .. humans


A well-programmed computer with advanced sensors will beat a human at any given sample point. Humans make mistakes, get distracted, and have lapses in judgment; a computer does not.

 

It doesn't matter if there is a mix of autonomous cars and human drivers on the road; the computer-controlled car will be safer. They aren't stupid; they are programmed not to crash. If some idiot cuts an autonomous car off closer than it is physically possible to stop in time, it's going to hit him, but this is true for a human driver as well.

 

Autonomous cars don't just look in front; they have a full view of everything around them and can predict whether a car is going to fail to stop or starts to move. However, as above, enough time is needed to do something about it, and if there isn't, there will be a crash.

 

Autonomous cars are not perfect, the sensors are not perfect, and the programming is not perfect, but they're at the point where they're a heck of a lot better than humans are.
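The "cut off closer than it is physically possible to stop" point is simple kinematics. A minimal sketch, assuming constant deceleration; the 7 m/s² braking figure and 0.1 s machine reaction time are illustrative assumptions, not measured values:

```python
def stopping_distance_m(speed_mps, decel_mps2=7.0, reaction_s=0.1):
    """Distance covered during reaction time plus braking: v*t + v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

def can_avoid(gap_m, speed_mps):
    """True only if the gap to the cut-in vehicle exceeds the stopping distance."""
    return gap_m > stopping_distance_m(speed_mps)
```

At about 100 km/h (27 m/s) the car needs roughly 55 m to stop, so a cut-in at 30 m ends in a collision no matter who, or what, is driving.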


Just now, zMeul said:

the other problem is that AI cannot think for itself

I am fully prepared to argue that when a human being, with their own children in the car, rolls their car into a ravine in an attempt to dodge a squirrel, that the human trait of 'thinking for themselves' is not exactly an advantage.


1 minute ago, AshleyAshes said:

I am fully prepared to argue that when a human being, with their own children in the car, rolls their car into a ravine in an attempt to dodge a squirrel, that the human trait of 'thinking for themselves' is not exactly an advantage.

While I would love to read the argument, I think zMeul is more devoted to hating new technology using small sample sizes as data than to showing skepticism.



12 minutes ago, zMeul said:

humans aren't perfect and that's the problem

the other problem is that AI cannot think for itself and only follows programming done by .. humans

Actually, we are very good at decision making and judgement, extremely good. Where we fail is the ability to do this at full capacity 100% of the time, and that is where a computer surpasses a human; if we could, there would be zero crashes caused by drivers, only ones caused by mechanical error.

 

AI doesn't need to 'think for itself'; why would it? Avoiding a crash is only a math problem: as long as you can see and track everything around you, all that is required to prevent a crash is a giant math-crunching computer (GPU) and probability modeling of what might happen. Once you know what is happening and what might happen, the car can react to it. But if I as a human decide to purposefully drive into the AI car, there isn't anything it can do to prevent me from doing so other than exploding/vaporizing.

 

Edit:

Also most of the high profile crashes of AI cars have been due to limitations in the sensors and not programming error.
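The "math problem" framing can be illustrated with the simplest possible motion model: predict every car's position a few seconds ahead and check the closest approach. A real planner would use probabilistic models over many possible trajectories; this sketch assumes constant velocity and made-up coordinates:

```python
def predict(pos, vel, t):
    """Constant-velocity prediction of a 2-D position t seconds ahead."""
    return (pos[0] + vel[0] * t, pos[1] + vel[1] * t)

def min_separation_m(our_pos, our_vel, their_pos, their_vel,
                     horizon_s=3.0, dt=0.1):
    """Minimum predicted distance between two cars over the horizon."""
    best = float("inf")
    steps = int(round(horizon_s / dt))
    for i in range(steps + 1):
        t = i * dt
        ox, oy = predict(our_pos, our_vel, t)
        tx, ty = predict(their_pos, their_vel, t)
        best = min(best, ((ox - tx) ** 2 + (oy - ty) ** 2) ** 0.5)
    return best
```

If the minimum separation over the horizon drops below a safety margin, the planner brakes or swerves early; as the post says, once you can see and track everything, the rest is number crunching.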


Just now, ARikozuM said:

While I would love to read the argument, I think zMeul is more devoted to hating new technology using small sample sizes as data vs showing skepticism.

I think he's attached to the romantic idea that a 'human can always think outside of the box and improvise' while ignoring that this also means 'improvising their car and children upside down into a ravine because they failed to account for the fact that a 4000 lb vehicle vs a 1 lb squirrel is a victory for the vehicle in all scenarios.'

 

And I get that. We want to think that we are smart and clever and adaptive, and while humans are those things, we're also complete and utter morons. We are reactionary and often consider very little of our available information when making snap decisions, particularly when we are insufficiently trained and experienced in dealing with adverse situations, because we tend to be complacent when all is going well.


I don't understand how the self-driving thing works for Uber.  Their whole business model is that the drivers aren't working for them.  That's how they avoid paying things like insurance.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  The Real Reason Delidding Improves Temperatures  |  "2K" does not mean 2560×1440


2 minutes ago, JoostinOnline said:

I don't understand how the self-driving thing works for Uber.  Their whole business model is that the drivers aren't working for them.  That's how they avoid paying things like insurance.

They still pay insurance on the car, and likely pay more for it since there is no human driver.



1 minute ago, JoostinOnline said:

I don't understand how the self-driving thing works for Uber.  Their whole business model is that the drivers aren't working for them.  That's how they avoid paying things like insurance.

Because a car without a driver doesn't have to be paid ANYTHING; it only has to be provided with gasoline and maintenance. The automated car isn't concerned with paying its rent or having enough money for groceries. It doesn't get tired and go out of service for eight hours.


5 minutes ago, leadeater said:

Edit:

Also most of the high profile crashes of AI cars have been due to limitations in the sensors and not programming error.

I didn't say programming error

a programming error would be when the AI was instructed to do one thing and did something else entirely

 

I have always argued and will continue to argue that AI cars have absolutely no place on human roads

AI on a specially designed transit system would almost eliminate the human factor - AI that all follows the same set of rules


1 hour ago, zMeul said:

ok, so this incident happened because!?

In this incident, the car wasn't using that technology. Soon enough, similar technology will start being used in self-driving cars. My point is the AI can only get safer.

 

No need to be hostile. 



4 minutes ago, zMeul said:

I always argued and will continue to argue that AI cars have absolutely no place on human roads

AI on specially designed transit system will almost eliminate the human factor - AI that follows same set of rules

That's cool; I'm not going to stop you from thinking that, but personally, if I were driving down the road and there was a car waiting to turn, I would trust an AI car far more than a human not to turn in front of me. I'd feel safer knowing the cars around me are AI, not less safe. Not everyone feels the same, and a 15-minute debate online isn't going to change that.


5 minutes ago, AshleyAshes said:

Because a car without a driver doesn't have to be paid ANYTHING; it only has to be provided with gasoline and maintenance. The automated car isn't concerned with paying its rent or having enough money for groceries. It doesn't get tired and go out of service for eight hours.

I understand that.  However, my point is that when they start providing their own vehicles, they are no longer just the middle-man.  They are the provider.



1 minute ago, christianled59 said:

In this incident, the car wasn't using that technology. Soon enough, similar technology will start being used in self-driving cars. My point is the AI can only get safer.

 

No need to be hostile. 

do you know why nVidia limits/controls its AI cars' interaction with human drivers? because they understand the tech is not ready; maybe in the next decade

it's inevitable a crash will happen and they want to limit their exposure


Just now, zMeul said:

do you know why nVidia limits/controls its AI cars' interaction with human drivers? because they understand the tech is not ready; maybe in the next decade

it's inevitable a crash will happen and they want to limit their exposure

I never said the technology was ready. In fact, I said that it will get safer. 

 

 

Regardless of how dangerous self driving cars are today, they are still statistically safer than human drivers. I'm not trying to argue here lol.



9 minutes ago, christianled59 said:

I never said the technology was ready. In fact, I said that it will get safer. 

 

Regardless of how dangerous self driving cars are today, they are still statistically safer than human drivers. I'm not trying to argue here lol.

statistics mean shit, people will die, safer doesn't mean no accidents

tests should not be conducted on public roads

what happened with the Tesla that went under that trailer and killed the driver? Tesla got a slap on the wrist - legislation needs to evolve; there is the NTSB for aircraft-related incidents, so why isn't there a similar body for AI cars?


8 minutes ago, zMeul said:

I didn't say programming error

a programming error would be when the AI was instructed to do one thing and did something else entirely

A programming error is also failing to take something into account, or just plain bad coding. As for the sensors being a large factor in the problem: that makes it a hardware issue, nothing to do with the AI at all.

 

That crash a while ago where the Tesla hit the truck trailer was caused by the front-facing sensor being blinded; the crash would not have happened otherwise. Sun-strike is rather common for humans too and causes many accidents. The other sensor, the radar, detected the trailer as a road sign due to its height and angle from the sensor.

 

Quote

Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.

https://electrek.co/2016/07/01/understanding-fatal-tesla-accident-autopilot-nhtsa-probe/

 

Either your argument is about the AI not being ready, the hardware not being ready, or both. From what I can tell you think it's the AI component, when most of the issues are actually around the sensors and the processing power available to process the information.
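The radar misclassification described above is partly a geometry problem: from a low-mounted sensor, a high trailer edge and an overhead road sign can both appear at a high elevation angle. A heavily simplified sketch; the sensor height, clearance threshold, and function names are illustrative assumptions, not Tesla's actual logic:

```python
import math

def target_bottom_height_m(range_m, elevation_deg, sensor_height_m=0.5):
    """Height of the lowest reflecting edge, from range and elevation angle."""
    return sensor_height_m + range_m * math.tan(math.radians(elevation_deg))

def looks_like_overhead_object(range_m, elevation_deg, clearance_m=4.2):
    """Treat returns whose bottom edge sits above normal clearance as signs/bridges."""
    return target_bottom_height_m(range_m, elevation_deg) > clearance_m
```

A strong return at 20 m range and 12° elevation works out to a reflecting edge around 4.8 m up, above typical clearance, so a filter like this would discard it as an overhead structure rather than braking for it.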


12 minutes ago, zMeul said:

what happened with the Tesla that went under that trailer and killed the driver? Tesla got a slap on the wrist - legislation needs to evolve; there is the NTSB for aircraft-related incidents, so why isn't there a similar body for AI cars?

There is: it's called the Police and the Serious Crash Unit (or whatever it's called locally), plus the NHTSA and the NTSB.

 

Quote

The National Transportation Safety Board (NTSB) is an independent U.S. government investigative agency responsible for civil transportation accident investigation. In this role, the NTSB investigates and reports on aviation accidents and incidents, certain types of highway crashes, ship and marine accidents, pipeline incidents, and railroad accidents.

 

Quote

The National Highway Traffic Safety Administration (NHTSA, pronounced "NITS-uh") is an agency of the Executive Branch of the U.S. government, part of the Department of Transportation. It describes its mission as "Save lives, prevent injuries, reduce vehicle-related crashes."

 


4 minutes ago, leadeater said:

There is: it's called the Police and the Serious Crash Unit (or whatever it's called locally).

utter crap

the Police are not equipped to diagnose AI problems

the NTSB is there to analyze and issue reports based on its findings

 

the NTSB already gets involved in railroad accidents, but not self-driving cars ...

