
Researchers trick Tesla into going 85 mph autonomously with tape on a speed limit sign

spartaman64
4 hours ago, rcmaehl said:

@spartaman64

Stolen from Reddit:
I think it's still fair considering that these older Teslas are still all over the roads. IIRC Tesla sold well over 100k cars in 2015 and 2016, and unless they were wrecked they should all still be running on the roads.


6 hours ago, Sauron said:

Can we really count sign vandalism as a problem? I mean, what if it were completely spray painted black? Would a human know how fast they can go in that case?

I mean, they're not going to go 85 mph on a 35 mph road

 


42 minutes ago, gabrielcarvfer said:

They only know how to deal with stuff they've been trained to deal with. Whatever it is that deviates from the norm, *insert your favorite "instructions unclear" meme*.

Yeah, the A in AI is a capital letter. It's not actual intelligence.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


9 hours ago, RonnieOP said:

Tbh I didn't even think they worked off speed limit signs.

My GPS somehow knows the speed limit on every road I'm on, so I figured Teslas would work like that, using satellites or something.

I wish they would ban these autopilot features on public roads. Innocent people shouldn't be at risk because Tesla wants their customers to be their test animals.

How many people have been in accidents over Autopilot now?

Outlaw the shit until it's ironed out.

Current self driving cars are already safer than human drivers though... 


And this is why I classify it as driver assistance rather than autopilot. The human driver should always be aware of what's going on and be ready to take over in these edge-case circumstances. It's great that these problems are being found and the systems are being improved, but ultimately, if a car is going 50 mph over the limit, it's the fault of the squishy thing behind the wheel and nothing else, no matter how fancy the driver aids are.


The speed problem can easily be solved if the car monitors its surroundings and other cars, and also double-checks against the ever-improving GPS data.

Where I see actual problems is a little further down the lane with self-driving, where some pricks could tape the road to make it look like lanes, or put up a fake stop sign that causes the car to stop in the middle of nowhere. There are plenty of countries where doing so is extremely dangerous, especially at night, and it'll take a really, really long time before any AI has the basic common sense of a human being with like 2 weeks of experience.
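The double-checking idea can be sketched in a few lines. To be clear, this is a hypothetical illustration of sanity-checking a camera reading against map data, not Tesla's actual logic; the function name and the 20 mph plausibility threshold are made-up assumptions:

```python
def plausible_limit(sign_reading, map_limit, max_jump=20):
    """Sanity-check a camera-read speed limit against mapped GPS data.

    If the sign reading deviates from the mapped limit by more than
    max_jump mph, distrust the camera and fall back to the lower of
    the two values as the conservative choice.
    """
    if abs(sign_reading - map_limit) > max_jump:
        return min(sign_reading, map_limit)
    return sign_reading

# A tampered "35" misread as "85" on a mapped 35 mph road is rejected:
print(plausible_limit(85, 35))  # -> 35
# A sign that roughly agrees with the map is trusted as-is:
print(plausible_limit(40, 35))  # -> 40
```

The point of falling back to the *minimum* rather than the map value alone is that map data can be stale too; slowing down is the safe failure mode either way.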


53 minutes ago, RedRound2 said:

The speed problem can easily be solved if the car monitors its surroundings and other cars, and also double-checks against the ever-improving GPS data.

Where I see actual problems is a little further down the lane with self-driving, where some pricks could tape the road to make it look like lanes, or put up a fake stop sign that causes the car to stop in the middle of nowhere. There are plenty of countries where doing so is extremely dangerous, especially at night, and it'll take a really, really long time before any AI has the basic common sense of a human being with like 2 weeks of experience.

They already do monitor the surroundings with things like radar. So in most cases you are correct; it will be keeping a safe distance from the car in front.

Onto full self-driving, that is something very different. We cannot rely on GPS speed data, as it is very often out of date or plain incorrect. We cannot rely on signage either, as we can see from that thread. Even dirty signs can cause issues, and in some parts of the world the local morons use signs for target practice.

In the future AI will have to step up to the mark, but it is only as good as the data it receives. This is one of the driving forces behind 5G: every sign, light, junction and car will be repeating fast, live data to help run self-driving. A number of cities around the world are experimenting with this as well as other data transmission systems, but it looks like 5G will be the carrier of all this data. So even temporary road works will be broadcasting. One step further, there are plans for a standard whereby, if a car crashes or breaks down, it too will warn other vehicles in the locality and set speeds appropriately. It all requires changes to various laws, and that is a big hurdle. Some predict the technology is still over 30 years away.
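As a rough illustration of the kind of infrastructure-to-vehicle message being described here: the fields below are entirely hypothetical and do not correspond to any real V2X standard (real message sets like SAE J2735 are far more involved), but they show why a broadcast limit would be immune to sign vandalism:

```python
from dataclasses import dataclass

@dataclass
class RoadsideBroadcast:
    """Hypothetical message a sign, junction, or roadworks beacon
    might repeat over a 5G/V2X link (illustrative fields only)."""
    source_id: str        # which sign/junction/vehicle is broadcasting
    speed_limit_mph: int  # authoritative limit, not readable by camera, not tapeable
    temporary: bool       # True for roadworks or incident zones
    expires_s: int        # how long the advisory stays valid

# A permanent 35 mph zone and a temporary roadworks restriction:
zone = RoadsideBroadcast("sign-0042", 35, temporary=False, expires_s=3600)
works = RoadsideBroadcast("works-007", 20, temporary=True, expires_s=600)
```

The expiry field matters for the crashed/broken-down-car case mentioned above: a stale warning should age out on its own rather than slow traffic forever.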


14 hours ago, Sauron said:

Yes, but it's not always up to date, particularly when there are temporary changes. On the other hand you can always trust signs (if they're wrong you can contest the fine, not so with GPS).

Just down the road from me is an eight-mile stretch of road. Over 20 years ago it changed from 60 mph down to bits at 30 mph and other bits at 40 mph. Still today, no sat nav in any car I have driven sees those speed limits; they all think it is still a 60 mph road, and that includes Google Maps.


This is good research. Shows that these systems are not foolproof (yet) and require further development. Someone with malicious intent could actually try to exploit similar things.


You have speed limits in increments of 5? Why not just go to town and ask people to keep to 37.238900007 mph?


2 hours ago, Phill104 said:

Just down the road from me is an eight-mile stretch of road. Over 20 years ago it changed from 60 mph down to bits at 30 mph and other bits at 40 mph. Still today, no sat nav in any car I have driven sees those speed limits; they all think it is still a 60 mph road, and that includes Google Maps.

Exactly.

11 hours ago, Shreyas1 said:

I mean, they're not going to go 85 mph on a 35 mph road

you underestimate my power - Misc - quickmeme

Jokes aside it depends on the context. If you need a sign saying 35 then the road up until that point probably had a higher speed limit, so if you see a random completely black sign you can't really guess what you're supposed to be doing. You could slow down as a precaution (as I said that would be a good behavior for the autopilot in case of uncertainty) but you couldn't really be blamed for going above 35.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


9 hours ago, LAwLz said:

Current self driving cars are already safer than human drivers though... 

How? That makes no sense.


31 minutes ago, RonnieOP said:

How? That makes no sense.

It does and it doesn't.

They're machines. They have perfect attention. They don't get distracted by phone calls or pour coffee in their lap. They are machines, though: they can't react and develop new concepts on the fly at all.

They're safer as long as they don't run into something they're not prepared for, in which case unusually bad things can happen. The famous example was that Tesla that hit a flatbed. A weird edge situation: the flatbed was parked across the road, but because it was a flatbed it had no sides for the radar to see and slow down for, so the car piled right into it.

So for common situations, safer. Possibly much. For unusual edge situations not planned for by the designers, less safe. Also much.

I suspect statistically they have a better record, because the stupid daily stuff involving broken attention doesn't happen. I wouldn't be at all surprised if the area under the curve for autonomous vehicles is higher than for hand-driven ones. The curve is a totally different shape though.



Yeah, the 737 MAX is also a safer-than-average aircraft to fly, compared to human-only systems. Getting it to fly well with pilots while having a broken management/use process (it has a fatal failure mode instead of a safe failure mode) is the difference.

IF autonomous cars were the only ones on the road, and all drivers were also 100% attentive, then yes, obviously 100% attention plus additional automated safety features is "safer". But 100% automation with everyone else acting like nutters? I'm not sold just yet. (I am sold on automation in general: automated trains don't crash, and neither do most aircraft on autopilot... but the mistakes that happen around them do, and those can be institutionalised, etc.)


This isn't so much "a hack" as it is "clickbait".

What's next, McAfee Advanced Threat Research saying how they can trick a user into wiping out their hard drive by running a few Windows-included commands? Sheesh.


13 hours ago, RonnieOP said:

How? That makes no sense.

What do you mean? For every X km driven, self driving cars had fewer accidents than human driven cars. 

It's just that you hear about every single little accident involving a self driving car on the news, while nobody bothers reporting on human driver accidents because they are so incredibly common. 

Around 100 people PER DAY die in the US alone from human-driven cars.
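The per-distance comparison being made here is simple arithmetic: normalise crashes by exposure. The numbers below are placeholders chosen purely for illustration, not real crash statistics:

```python
def accidents_per_million_miles(accidents, miles_driven):
    """Normalise crash counts by miles of exposure so fleets of very
    different sizes can be compared fairly."""
    return accidents / (miles_driven / 1_000_000)

# Hypothetical figures, for illustration only:
human = accidents_per_million_miles(accidents=2_000, miles_driven=1_000_000_000)
auto = accidents_per_million_miles(accidents=1, miles_driven=1_000_000)

print(human, auto)  # -> 2.0 1.0
```

Comparing raw counts (2,000 vs 1) says nothing, because the human fleet drove a thousand times further; comparing rates per million miles is the apples-to-apples version of the claim.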


For self driving cars, we need really clear signage and road markings. Case in point.

ʕ•ᴥ•ʔ

MacBook Pro 13" (2018) | ThinkPad x230 | iPad Air 2     

~(˘▾˘~)   (~˘▾˘)~


6 hours ago, LAwLz said:

What do you mean? For every X km driven, self driving cars had fewer accidents than human driven cars. 

It's just that you hear about every single little accident involving a self driving car on the news, while nobody bothers reporting on human driver accidents because they are so incredibly common. 

Around 100 people PER DAY die in the US alone from human-driven cars.

Idk, I feel like that's a bit skewed, because usually the driver is watching along with the self-driving, so there are two redundant systems looking out; of course that's safer. And if the self-driving gets into a sticky situation and gives back control and the car crashes, does that count as the driver crashing or the self-driving crashing, I wonder? For someone to say self-driving is safer, IMO it needs to be a car that doesn't have a driver and performs better than a human; otherwise the claim is really "self-driving plus human is safer than human alone". And I think we saw in the Tesla Smart Summon videos that self-driving alone is definitely not safer than human driving alone.
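The redundancy point can be made concrete with basic probability. Assuming, purely for illustration, that the driver and the autopilot miss hazards independently (a strong assumption, and the miss rates below are invented), both must miss for a crash, so the combined miss rate is the product:

```python
# Hypothetical per-hazard miss probabilities, for illustration only.
p_human_miss = 0.01      # attentive driver misses 1% of hazards
p_autopilot_miss = 0.02  # autopilot alone misses 2% of hazards

# Independent redundant checks: a crash needs BOTH to miss.
p_both_miss = p_human_miss * p_autopilot_miss

print(p_both_miss)  # -> 0.0002
```

Under these toy numbers the supervised combination is 50x safer than the human alone even though the autopilot by itself is *worse* than the human, which is exactly why "autopilot + supervising driver" statistics can flatter the autopilot.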

