
It's happened. Self-driving Uber kills pedestrian

ucrbuffalo
On 3/19/2018 at 5:54 PM, dalekphalm said:

My point is that if the autonomous car was approaching an intersection, 50 feet is basically nothing. The reaction time isn't the problem. It's the stopping time.

 

So, what... does that mean every time you approach an intersection, you slow down to 30 km/h (20 mph)? Because, let's be honest, if you're driving the speed limit in most cities, you're already going too fast to prevent injury, should someone unexpectedly step out in front of you.

So far, autonomous vehicles have a significantly better track record when compared with humans. Expecting better of them is not the same thing as expecting them to be infallible. That's impossible, given that they are:

True, it is mostly stopping time, but an autonomous vehicle should be able to brake much harder than a human driver without losing control, thanks to ABS and similar systems. So the likelihood of hitting but not killing should improve due to reduced speed, if not a complete stop.

 

I am also saying that the computer should be able to "observe" the trajectory of each and every pedestrian near the road much better than a human driver: if pedestrian A is traveling at 5 mph toward the intersection, he or she is likely to end up in the path of the car, so the car should slow down preventively, even if it cannot fully predict whether the person will cross. If a pedestrian is "dangerously close" to the intersection, it should slow down as well.
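That "observe the trajectory" idea can be sketched as a toy constant-velocity check (purely illustrative; the function name, thresholds, and numbers are my own assumptions, not how any production AV stack works): project the pedestrian forward and see whether they could be near the car's path by the time the car arrives.

```python
# Hypothetical constant-velocity check: should the car slow down pre-emptively?
# All names and numbers here are illustrative, not from any real AV system.

def should_slow_down(ped_pos_ft, ped_speed_fps, car_dist_ft, car_speed_fps,
                     danger_zone_ft=10.0):
    """Return True if the pedestrian could get within danger_zone_ft of the
    car's path before the car has passed the crossing point."""
    if car_speed_fps <= 0:
        return False
    t_car = car_dist_ft / car_speed_fps    # seconds until the car reaches the crossing point
    ped_travel = ped_speed_fps * t_car     # how far the pedestrian moves in that time
    return ped_pos_ft - ped_travel <= danger_zone_ft

# Pedestrian 30 ft from the car's path, walking 5 mph (~7.3 ft/s) toward it;
# car 150 ft away at 30 mph (44 ft/s): car arrives in ~3.4 s, pedestrian
# covers ~25 ft and ends up within 10 ft of the path, so slow down.
print(should_slow_down(30.0, 7.3, 150.0, 44.0))
```

The same check with the pedestrian 100 ft away returns False, which is the "cannot fully predict" trade-off in the post: the further out you anticipate, the more often you brake for people who never cross.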

 

This is also what humans should do in traffic; it is called "defensive driving".

 

Yes, in Belgium the law says you must travel at a speed at which you could always stop for a pedestrian unexpectedly crossing. What is so crazy about that? That you can't do 70 mph in a city? Too bad. Yes, the speed limit in many Belgian cities is 18 mph (30 km/h, I am not kidding), and sure, that is very slow, but it is proven to reduce traffic deaths. Sure, some adults are stupid. What about children? You can't expect a 3-year-old not to run into the street after a ball. They don't know any better. This is why the speed limit near schools (even in rural areas) in Belgium is 18 mph: you HAVE TO be able to stop.

 

That is why on highways, when the likelihood of encountering pedestrians is basically zero, the speed limit is much higher.

 

On 3/19/2018 at 5:54 PM, dalekphalm said:

We don't have all the facts. We have no idea who is responsible. The driver, the car itself, the pedestrian, or possibly something else entirely.

 

What annoys me the most about this thread is people assuming facts where facts do not exist yet. Should the investigation determine that Uber, its programming, or the vehicle itself is at fault, I'll be right there with you damning them and calling for better safety.

 

But we do not know this. Uber, the driver, and the car, all may be entirely "innocent". Or guilty.

 

Why don't we wait and see the facts before passing judgement?

I'm just pointing out that Uber has a terrible track record on workplace practices, on abiding by the law and regulations, on exploiting its "independent contractor" drivers, etc. And Uber has been lobbying and pushing to test its self-driving cars in real traffic far more aggressively than other companies, yet its technology is not necessarily better than the competition's.

 

On 3/19/2018 at 6:25 PM, dalekphalm said:

I'd like to hear about Norway Belgium (wrong country!), specifically. Because from the way @maartendc stated it, it seems that at any intersection, regardless of design (pre- vs. post-automotive revolution), you must slow down to a crawl, since that's the only possible way you can cross and prevent an injury when a pedestrian or cyclist unexpectedly crosses in front of you, despite it not being safe.

 

Like, seriously, people underestimate how much distance it takes for a car to slow down, let alone stop.

Obviously you've never been to a European city? In most cases you CANNOT drive faster than 30 mph, even if you wanted to. The streets are too narrow, there is too much traffic, etc.

 

Look at the image attached: a typical street in Brussels. You have to slow down every 50 meters to give the right of way at an intersection, and you'd want to be driving slowly anyway, because one of those parked cars might slam their door into you.

[Image: cars on a typical downtown street in Brussels, Belgium]


On 3/19/2018 at 7:55 PM, Dash Lambda said:

If there ever is a manslaughter case involving a self-driving car (cases will happen; I'm just not sure manslaughter will ever fit), it should NEVER go all the way back to the programmer(s) unless they intentionally put something screwy in it. That's just not how it works. You don't blame the guys in the lab, you just don't.

 

Though I do agree about the full autonomy thing. We'll never be able to make 'remove the steering wheel' autonomous cars, because they can't intelligently react to adverse conditions, we can't identify a thought process in them, and they can't find clever solutions to problems.

... That is, if we just try and build autonomous cars. Once we develop general AI, I imagine the nature of that problem will change completely. Whenever that happens.

The problem is that they're not intelligent. They're not considering all the circumstances and everything in their environment; they're following a line and reacting to cues in a very sophisticated and entirely 'dumb' way. No matter how safe or reliable they get, there will always be silly little things, maybe even hard-to-identify things, that they can't do even when working perfectly. When those cause problems, it's a lot harder to accept than when a person screws up.

For instance: Teslas aren't good at seeing stationary objects at highway speeds. Of the three major crashes involving autopilot, one hit a stationary street cleaner on a highway, and one hit a parked fire truck at 50mph.

Hold on now, Tesla Autopilot is only Autonomous Level 2 (“barely” autonomous). Tesla Enhanced Autopilot is Autonomous Level 3 at best. 

 

So yes, there may be scenarios in which a Tesla cannot react properly. 

 

But that is a limitation of that specific autonomous system, not autonomous systems in general. 

 

A Tesla is not designed to be operated independently. The driver is supposed to be able to take control at any time.

 

This is quite different from the systems being tested by the likes of Uber and Google.

 

Though Tesla does claim that it will eventually be able to get its cars all the way up to Level 5. But they definitely don't claim to be there yet.

 

On 3/19/2018 at 8:21 PM, maartendc said:

Obviously you've never been to a European city? In most cases you CANNOT drive faster than 30 mph, even if you wanted to. The streets are too narrow, there is too much traffic, etc.

 

Look at the image attached: a typical street in Brussels. You have to slow down every 50 meters to give the right of way at an intersection, and you'd want to be driving slowly anyway, because one of those parked cars might slam their door into you.

1. Full disclosure, I have not - though I certainly would like to. 

2. Most North American cities, including the one where this accident took place, are not designed like that at all. In fact, it would be incredibly rare to encounter a North American city with streets like that.

 

It's hard to make a comparison when the design and layout are so different.


21 minutes ago, mr moose said:

Not sure what your point is. Are you trying to insinuate that the level of automation is somehow inferior to humans who on the whole have so far proven to be the most inferior of all when it comes to intellectual problems?

It's just the way the information is processed. Humans are adaptable to varying conditions and can find solutions to new problems; autonomous cars only follow instructions.

Computers may very well be able to simulate human-like thought in the future, but all they can do right now is execute rigid logic. So, while autonomous cars may very well be safer than human drivers, it's bad to rely on them because they can't identify their mistakes or consider quite as many parameters.

 

10 minutes ago, dalekphalm said:

-snip-

I never said Teslas had full autonomy, but they do have some level of autonomous system, and it has a silly weakness. More advanced systems have weaknesses of the same nature, though they are a lot safer.

My point is that you always need an operator who's fully aware of the environment to catch the mistakes of a tool that can only consider a fixed set of parameters while performing a critical function.

"Do as I say, not as I do."

-Because you actually care if it makes sense.

Link to comment
Share on other sites

Link to post
Share on other sites

1 hour ago, dalekphalm said:

This is 100% speculation and there is no evidence to support this conclusion. 

 

How can you possibly know that if the driver had been in control, he could have avoided the accident?

 

We don’t even have remotely enough facts yet to make that conclusion (let alone just about any other conclusion that is being thrown around). 

The fact that there was a person behind the wheel could be sufficient evidence; the driver should have been able to at least slow the car enough to avoid a fatal accident if the AI itself didn't do anything. Driverless cars just aren't good enough yet to totally replace a human driver while other, non-AI-controlled vehicles are on the road, when the driverless car can't react the way a human driver would.


8 hours ago, Matu20 said:

Title should be that the pedestrian steps in front of a moving car. Car was innocent.

Oh, so exactly like what car manufacturers did a century ago?

 


I think this only highlights how little care there is for cyclists. If cyclist safety were taken seriously, bikes would 100% have dedicated paths or, at a bare minimum, a physical divider protecting their traffic from car traffic.

 

If pedestrians aren't allowed to walk casually on the road with traffic, I don't see how an equally exposed human body with the equivalent of a dog's weight in metal is suddenly a "vehicle".

 

I live in Arizona and I cycle. That doesn't mean I should be run down by three thousand pounds of metal because I'm on two wheels. This is not so much an Uber or self-driving issue as a lack-of-protection-for-cyclists one. It's not like the car launched off the sidewalk to hunt people down.

 

 


2 hours ago, mr moose said:

Humans make more mistakes on the road causing significantly more damage (over 1 million deaths a year), than all the automated systems of the world combined.    Regardless of how you feel about computers controlling cars, this is one fact that will not change.

 

EDIT: in fact, you can expand or contract this to cover any device that humans use, be it a tunneling machine, a process machine in a factory, or an aircraft. People cause more mistakes, period.

That's my whole point. People are stupid. And automated self driving cars are not ready to react to stupid people.


8 hours ago, maartendc said:

Regardless of the pedestrian or cyclist potentially being "at fault" because of crossing without looking or unexpectedly, the sensors in these self-driving cars should be good enough to anticipate unexpected behavior by humans, such as unexpected crossing, and adjust their speed appropriately. If anything, it should be very easy for these systems to react in milliseconds and slow down significantly to prevent death, if not injury.

 

Not sure about the US, but in Belgium the driver is always at fault when hitting a pedestrian or cyclist (the "weak" road users), because cars should expect unexpected behavior and mitigate their speed accordingly.

 

If we cannot do that, they should just ban them altogether. We should expect BETTER behavior from these things than from human drivers, not the same "unavoidable" accident rate.

 

Not surprising that this happened to Uber. A shitty company that does not take laws or human decency seriously, all in the interest of profits and market share.

Even in the event that a collision is imminent, physics plays a role. Tires only have so much grip (with ABS sacrificing some stopping distance for control), and cars are largely thousands of pounds of moving metal. Depending on factors such as speed, vehicle weight, tire specs, and conditions, stopping distance can and does vary greatly.

 

Even with reflexes in the millisecond range, some minimum amount of time is necessary to bring the vehicle down to a less-lethal speed, unless you propose a mandatory speed limit such that those speeds never occur anywhere pedestrians and cars may co-mingle, of course.
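To put rough numbers on this: total stopping distance splits into reaction distance (speed × reaction time) plus braking distance (v²/2a). A back-of-envelope sketch, where the ~0.8 g deceleration and both reaction times are assumed values for illustration, not measurements:

```python
# Rough stopping-distance estimate: reaction distance + braking distance.
# The 0.8 g deceleration and reaction times below are assumptions.
G_FTPS2 = 32.2  # standard gravity, ft/s^2

def stopping_distance_ft(speed_mph, reaction_s, decel_g=0.8):
    v = speed_mph * 5280 / 3600              # mph -> ft/s
    reaction = v * reaction_s                # distance covered before braking starts
    braking = v * v / (2 * decel_g * G_FTPS2)
    return reaction + braking

# Human at 30 mph with a typical ~1.5 s reaction vs. a computer reacting in ~0.1 s:
print(round(stopping_distance_ft(30, 1.5)))  # ~104 ft
print(round(stopping_distance_ft(30, 0.1)))  # ~42 ft
```

The braking term is identical in both cases (~38 ft); the entire difference comes from the reaction term, which is the point being made above.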

 

Regardless of what the law dictates, I believe it is the responsibility of everyone on the road, pedestrian and driver alike, to respect one another and exercise due diligence. Drivers should keep to a prudent speed in tight, populous areas, while pedestrians take care to ensure the road is safe before crossing. Right of way is a bit of a stupid concept when all users of the road merely need to respect one another. The exception, of course, being highways, where speeds are much too high for anything but motor-driven vehicles.


1 hour ago, Dash Lambda said:

It's just the way the information is processed. Humans are adaptable to varying conditions and can find solutions to new problems; autonomous cars only follow instructions.

Computers may very well be able to simulate human-like thought in the future, but all they can do right now is execute rigid logic. So, while autonomous cars may very well be safer than human drivers, it's bad to rely on them because they can't identify their mistakes or consider quite as many parameters.

 

So you want an inferior transport option because you don't think computers are as good as humans at abstraction.

30 minutes ago, asand1 said:

That's my whole point. People are stupid. And automated self driving cars are not ready to react to stupid people.

So you think that because people are so stupid that AI can't deal with them, you would prefer these stupid people in control of the cars as well?

I can't even begin to work out how that is logical.


25 minutes ago, mr moose said:

So you want an inferior transport option because you don't think computers are as good as humans at abstraction.

Yes, to the second part. Computers aren't just bad at abstraction; they literally can't do it.

I don't see how human oversight makes it inferior, though. What I'm saying is there has to be a driver to at least catch the car's mistakes, because none of our current methods can create an autonomous system that doesn't make certain mistakes a good human driver wouldn't.

The only actual replacement for a human driver would be a true AI, which, again, we don't have yet and it's not certain when we will.

"Do as I say, not as I do."

-Because you actually care if it makes sense.

Link to comment
Share on other sites

Link to post
Share on other sites

10 hours ago, Matu20 said:

Title should be that the pedestrian steps in front of a moving car. Car was innocent.

People don't kill people; cars kill people.


The one good thing about this is that there should be plenty of video and associated data from the car so we'll be able to determine exactly what happened.


Of course it's an Uber... 


1 hour ago, Dash Lambda said:

Yes, to the second part. Computers aren't just bad at abstraction; they literally can't do it.

I don't see how human oversight makes it inferior, though. What I'm saying is there has to be a driver to at least catch the car's mistakes, because none of our current methods can create an autonomous system that doesn't make certain mistakes a good human driver wouldn't.

The only actual replacement for a human driver would be a true AI, which, again, we don't have yet and it's not certain when we will.

By the time a human driver discovers that the situation requires intervention, it is too late; and besides, what you are advocating is still a far more dangerous situation. Computers, even with what you call an inferior ability, are still safer drivers than humans.


There is just one thing that I never got about self-driving cars. Let's say some small kids were playing with a ball by the side of the road; at any moment the ball could go into the road, and one of them after it. If you were driving and saw them, wouldn't you take special care and reduce your speed?

I guess this is an example of something a machine can't anticipate. If they aren't on the road, it's never going to even think of doing anything, or else it would never get anywhere.

You can't program a car to anticipate some crazy lady's or a kid's behavior, to anticipate that they look like they may do something stupid. A human can. And that is always the weak point of the self-driving car.

 

PS: I'm just considering a scenario. I have no idea who's at fault; for me it could be either the car or the woman, no possibilities off the table.

 

7 hours ago, asand1 said:

What I find funny are all the comments of "I hope they don't ban self-driving cars because of this," yet so many want to ban guns because a handful of people use them in a terrible manner.

 

Very simply put, you can't have self-driving cars and pedestrians on the same road. People are unpredictable and will do stupid shit. Human drivers can be stupid too, and people will always introduce a variable that the self-driving cars may not be able to compensate for.

Have you ever had a case where we were in doubt about whether the gun fired at the woman or the woman carelessly got in front of the bullet?


54 minutes ago, asus killer said:

There is just one thing that I never got about self-driving cars. Let's say some small kids were playing with a ball by the side of the road; at any moment the ball could go into the road, and one of them after it. If you were driving and saw them, wouldn't you take special care and reduce your speed?

I guess this is an example of something a machine can't anticipate. If they aren't on the road, it's never going to even think of doing anything, or else it would never get anywhere.

You can't program a car to anticipate some crazy lady's or a kid's behavior, to anticipate that they look like they may do something stupid. A human can. And that is always the weak point of the self-driving car.

 

PS: I'm just considering a scenario. I have no idea who's at fault; for me it could be either the car or the woman, no possibilities off the table.

 

Have you ever had a case where we were in doubt about whether the gun fired at the woman or the woman carelessly got in front of the bullet?

An autonomous car, however, could react far faster and put the brakes on sooner...

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

10 hours ago, Dash Lambda said:

If there ever is a manslaughter case involving a self-driving car (cases will happen; I'm just not sure manslaughter will ever fit), it should NEVER go all the way back to the programmer(s) unless they intentionally put something screwy in it. That's just not how it works. You don't blame the guys in the lab, you just don't.

 

Though I do agree about the full autonomy thing. We'll never be able to make 'remove the steering wheel' autonomous cars, because they can't intelligently react to adverse conditions, we can't identify a thought process in them, and they can't find clever solutions to problems.

 

... That is, if we just try and build autonomous cars. Once we develop general AI, I imagine the nature of that problem will change completely. Whenever that happens.

The problem is that they're not intelligent. They're not considering all the circumstances and everything in their environment; they're following a line and reacting to cues in a very sophisticated and entirely 'dumb' way. No matter how safe or reliable they get, there will always be silly little things, maybe even hard-to-identify things, that they can't do even when working perfectly. When those cause problems, it's a lot harder to accept than when a person screws up.

For instance: Teslas aren't good at seeing stationary objects at highway speeds. Of the three major crashes involving autopilot, one hit a stationary street cleaner on a highway, and one hit a parked fire truck at 50mph.

As I said, product liability isn't something most people have deeply encountered, so few here really appreciate how deep this problem goes. If negligence is found, the entire chain behind the car bears responsibility, and it will be attached to the most responsible group. While I doubt an individual programmer will ever get arrested on manslaughter charges, an entire design team might. (Most likely it would only ever be a civil case, but you never know what could happen. Something like "Driverless Car Kills State AG's Daughter at Crosswalk", and then you can expect some programmers are looking at criminal charges.)

 

Related, but this reminds me: has anyone seen these cars tested on ice and snow?


Damn, with all this sensationalist reporting, I think it's time to start looking for autonomous cars that I can "cross the road" into, so I can finally have that big settlement payday xD

 

On a more serious note, I can't believe the number of critics out there demanding that autonomous cars be perfect. That's completely unrealistic. The standard should be that they perform better than human drivers, which, as anyone who has ever driven anywhere with other vehicles knows, is setting the bar pretty low, but at least it is reasonable. Too many people want to make something good into the enemy here.


11 hours ago, maartendc said:

True, it is mostly stopping time, but an autonomous vehicle should be able to brake much harder than a human driver without losing control, thanks to ABS and similar systems. So the likelihood of hitting but not killing should improve due to reduced speed, if not a complete stop.

 

I am also saying that the computer should be able to "observe" the trajectory of each and every pedestrian near the road much better than a human driver: if pedestrian A is traveling at 5 mph toward the intersection, he or she is likely to end up in the path of the car, so the car should slow down preventively, even if it cannot fully predict whether the person will cross. If a pedestrian is "dangerously close" to the intersection, it should slow down as well.

Have you ever had to slam your brakes on? Like, ever? Because nothing you say makes any sense. You don't lose control like you say; unless road conditions are anything but dry, you still have some control. Also, why bring ABS into this? Whether a computer is driving or not, ABS will always kick in. You act like it has to be calculated.

 

No shit computers can calculate trajectory... HOW DO YOU THINK THESE SYSTEMS DRIVE! It's literally the backbone of the system. Calculating a person brisk-walking at 5 mph is not what this case is all about. Even at 5 mph, though, if someone walks out from the curb between two parked cars within 50-60 ft of a car going 30 mph, they ARE going to get hit. No avoiding it. Period.

 

And even if the system is able to "calculate the trajectory and avoid it"... these are people we are talking about. If it can see the person running, it can either (1) go right (ending up behind the person) or (2) go left (humans like to stop when startled), but it can't do both. So what should the car do? Flipping a coin is probably its best chance, honestly, because its guess is as good as a human's.

 

Computers aren't magic. They can react quicker and fine-tune precise movements to maybe avoid a worse situation, but no matter how much you think you can break physics, cars take time to stop. It's not hard to understand.


5 hours ago, Dabombinable said:

An autonomous car, however, could react far faster and put the brakes on sooner...

That's not the point I was making. Anticipation, not reaction. An autonomous car can't anticipate, only react.


11 hours ago, ElfFriend said:

Oh, so exactly like what car manufacturers did a century ago?

While I appreciate the video, we're not in that era anymore. And while pedestrians have priority at some intersections, randomly stepping into a busy street nowadays without looking or waiting for cars to slow down is, at best, reckless. BTW, I'm not saying this pedestrian deserved it, far from it, but putting the whole blame on autonomous cars and insinuating the pedestrian is completely innocent is somewhat misleading.


7 minutes ago, asus killer said:

That's not the point I was making. Anticipation, not reaction. An autonomous car can't anticipate, only react.

An autonomous car can anticipate that situation the same as any human. You anticipate it because you see little kids playing with something that is out of their control once it leaves their hands, and you know kids (people in general) can be stupid. Those parameters can easily be added into any autonomous system too.


4 minutes ago, asus killer said:

That's not the point I was making. Anticipation, not reaction. An autonomous car can't anticipate, only react.

Anticipation would essentially lead to a car driving 5 mph. If it was always anticipating, it would slow down to stay in full control.


10 minutes ago, mynameisjuan said:

Have you ever had to slam your brakes on? Like, ever? Because nothing you say makes any sense. You don't lose control like you say; unless road conditions are anything but dry, you still have some control. Also, why bring ABS into this? Whether a computer is driving or not, ABS will always kick in. You act like it has to be calculated.

 

No shit computers can calculate trajectory... HOW DO YOU THINK THESE SYSTEMS DRIVE! It's literally the backbone of the system. Calculating a person brisk-walking at 5 mph is not what this case is all about. Even at 5 mph, though, if someone walks out from the curb between two parked cars within 50-60 ft of a car going 30 mph, they ARE going to get hit. No avoiding it. Period.

 

And even if the system is able to "calculate the trajectory and avoid it"... these are people we are talking about. If it can see the person running, it can either (1) go right (ending up behind the person) or (2) go left (humans like to stop when startled), but it can't do both. So what should the car do? Flipping a coin is probably its best chance, honestly, because its guess is as good as a human's.

 

Computers aren't magic. They can react quicker and fine-tune precise movements to maybe avoid a worse situation, but no matter how much you think you can break physics, cars take time to stop. It's not hard to understand.

20-30 feet, maybe; 50-60 feet is a stretch... Assuming the computer starts braking the instant it sees the person, by the time it reaches that 50-foot mark it certainly won't be moving fast enough to instantly kill that person. Maybe a minor injury, or even nothing, as long as the person doesn't end up under the car anyway...

 

Most cars can stop from 60 mph in 110-150 feet. It's the driver's reaction time that is the problem, though.

But the computer should react instantly.

 

I'd love to see what actually happened. Did the car attempt to slow down at all? Did it just keep driving along and completely fail to *see* the person? We don't know...

 

Also, the car should track objects on the side. If it saw a person walking on the sidewalk a few hundred feet ahead, it should still assume that moving object is there by the time it reaches that point, and perhaps adjust speed or prepare to stop should something surprising happen in that area.
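The back-and-forth in this thread about reaction time versus stopping distance can be checked with simple kinematics. A hedged sketch (the 0.8 g deceleration and both reaction times are assumed illustrative values, not data from this incident):

```python
import math

# Impact speed if an obstacle appears a given distance ahead: the car brakes
# at an assumed 0.8 g after a reaction delay. All numbers are illustrative.
G = 32.2  # standard gravity, ft/s^2

def impact_speed_mph(speed_mph, obstacle_ft, reaction_s, decel_g=0.8):
    v0 = speed_mph * 5280 / 3600                  # mph -> ft/s
    braking_room = obstacle_ft - v0 * reaction_s  # distance left once brakes engage
    if braking_room <= 0:
        return speed_mph                          # hit at full speed
    v2 = v0 * v0 - 2 * decel_g * G * braking_room
    return math.sqrt(v2) * 3600 / 5280 if v2 > 0 else 0.0

# Pedestrian appears 55 ft ahead of a car doing 30 mph:
print(impact_speed_mph(30, 55, 0.1))   # computer-like reaction: full stop
print(impact_speed_mph(30, 55, 1.5))   # human-like reaction: hit at full speed
```

Under these assumptions the machine stops entirely while the human never even gets on the brakes, which is consistent with both the "50-60 ft at 30 mph" claim for human drivers and the reply that a computer braking instantly would shed most of that speed.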

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1

Link to comment
Share on other sites

Link to post
Share on other sites

Just now, mynameisjuan said:

Anticipation would essentially lead to a car driving 5 mph. If it was always anticipating, it would slow down to stay in full control.

Dude, that was my point. xD

 

Only a person can see a child playing by the road and decide it's a danger. An unmanned car will never be able to without some insane AI, or without having to go really slow whenever someone is walking on the sidewalk.

This is because the woman could have been distracted, but the car will never anticipate that; it will only react when she enters the road.

