AI can predict who will be criminals based on facial features

patrick3027

I'm not a rocket scientist, but it's very evident that they're criminals, because their picture is a goddamn mugshot.

 

PS: everyone is a criminal.


13 minutes ago, Ryan_Vickers said:

I just pulled those numbers out of the air so don't worry too much about that :P

All I'm saying is, if they can get it close to 100%, which (if there is a correlation to be found) is possible (in theory), then it wouldn't take a sample the size of the world to do it, and some government might just implement this as a tool.  Scary notion, imo...

The authors claimed 89.51% accuracy on their best model. Your made-up numbers were much less scary.

24 minutes ago, patrickjp93 said:

I don't think it would lead to persecution so much as it would just make surveillance more sensitive and targeted. Obviously, no one should be harassed or accused until they actually commit a crime, but you can't have good proactivity in law enforcement without some guidelines.

Maybe not, but I'm cynical. There is already abuse of power, and a computer model outputting conjectures of criminality about individuals could feed this problem: as the database of persecuted 'criminals' grows, the model would get better and better at identifying them.

I would rather see effort put into improving the accuracy of law enforcement than into increasing the use of inaccurate tools.


Just now, DrMikeNZ said:

The authors claimed 89.51% accuracy on their best model. Your made-up numbers were much less scary.

Maybe not, but I'm cynical. There is already abuse of power, and a computer model outputting conjectures of criminality about individuals could feed this problem: as the database of persecuted 'criminals' grows, the model would get better and better at identifying them.

I would rather see effort put into improving the accuracy of law enforcement than into increasing the use of inaccurate tools.

A 90% accurate tool is better than no tool at all, and perfection is, dare I say, impossible.

 

The model will improve, and having a larger study sample will improve the precision.

 

And I think identifying those at higher risk of committing crimes is valuable, just like concentrating police patrols in poor neighborhoods in inner cities. Sure, it doesn't catch everything, but it will improve law enforcement.



If we put a politician's photo in, will it explode? 

 

 


10 minutes ago, Mooshi said:

If we put a politician's photo in, will it explode? 

Hillary Clinton - Likelihood: 0% ;)



17 minutes ago, patrickjp93 said:

A 90% accurate tool is better than no tool at all, and perfection is, dare I say, impossible.

 

The model will improve, and having a larger study sample will improve the precision.

 

And I think identifying those at higher risk of committing crimes is valuable, just like concentrating police patrols in poor neighborhoods in inner cities. Sure, it doesn't catch everything, but it will improve law enforcement.

Improving precision does not improve accuracy, and there is no guarantee that the model can be improved, particularly as there is no evidence of causality.

 

Imagine a hypothetical country of 4.471 million people, with 80,000 adults charged with crimes per year and a conviction rate of 76%, of which 11% are serious crimes resulting in a prison sentence.

Of the 60,800 convicted criminals, roughly 10% (6,080) would have been missed by the computer model. Additionally, due to the model's roughly 10% false-positive rate, around 440,000 innocent people would have been flagged by the model as potential criminals.

Investigating around 500,000 people because roughly 11% of them statistically account for 90% of convicted criminals is a waste of resources.
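For concreteness, here is a minimal sketch of the base-rate arithmetic behind those figures, assuming the paper's 89.51% accuracy splits into roughly 90% sensitivity and 90% specificity (an assumption on my part; the authors report a single accuracy number):

```python
# Base-rate arithmetic for the hypothetical country above, assuming
# ~90% sensitivity and ~90% specificity (i.e., a 10% false-positive rate).
population = 4_471_000
criminals  = 60_800                        # 76% of 80,000 charged are convicted
innocents  = population - criminals

sensitivity = 0.90                         # assumed fraction of criminals flagged
specificity = 0.90                         # assumed fraction of innocents cleared

true_pos  = criminals * sensitivity        # ~54,720 criminals correctly flagged
false_neg = criminals - true_pos           # ~6,080 criminals missed
false_pos = innocents * (1 - specificity)  # ~441,020 innocents flagged
flagged   = true_pos + false_pos           # ~495,740 people to investigate

precision = true_pos / flagged                                 # ~11%
accuracy  = (true_pos + innocents * specificity) / population  # back to ~90%

print(f"flagged: {flagged:,.0f}, criminals missed: {false_neg:,.0f}")
print(f"innocents flagged: {false_pos:,.0f}")
print(f"precision: {precision:.1%} vs accuracy: {accuracy:.1%}")
```

Note how a model that is 90% 'accurate' still has only about 11% precision once the base rate is accounted for, which is exactly the precision-versus-accuracy distinction drawn above.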


13 minutes ago, DrMikeNZ said:

Improving precision does not improve accuracy, and there is no guarantee that the model can be improved, particularly as there is no evidence of causality.

Imagine a hypothetical country of 4.471 million people, with 80,000 adults charged with crimes per year and a conviction rate of 76%, of which 11% are serious crimes resulting in a prison sentence.

Of the 60,800 convicted criminals, roughly 10% (6,080) would have been missed by the computer model. Additionally, due to the model's roughly 10% false-positive rate, around 440,000 innocent people would have been flagged by the model as potential criminals.

Investigating around 500,000 people because roughly 11% of them statistically account for 90% of convicted criminals is a waste of resources.

That's far less waste than what we have now.



14 hours ago, Ryan_Vickers said:

Well, if it's 100% accurate (never picks an innocent person as a criminal or vice versa) then this would be fantastic, but the obvious problem is: when has anything ever been absolutely 100% correct, every time? The potential for misuse seems all too obvious, but I highly doubt any government would be crazy enough to put this into practice, using it to predict who might commit a crime and arresting them ahead of time... right? :S 

 

No, it has the potential to be quite disastrous even in the best case scenario of 100% accuracy.  For example, poorer neighborhoods tend to have higher populations of minorities, so the authorities could use this to needlessly pursue petty crimes as a way to garner more revenue for the city.  Minorities tend to be of lower income and would be less likely to fight charges in court.  If the fine cannot be paid, that person will end up sitting in jail simply for not having the ability to pay.  This will ultimately defeat the purpose of the algorithm in the first place.

 

Furthermore, if such a program is implemented only in poor areas, statistics will be biased against a particular race or income level of people rather than solely violent individuals who pose a genuine danger.



Great. We have governments using 1984 as a manual, not a warning, and now we have the potential for pre-crime arrests straight out of Minority Report.



10 hours ago, Ryan_Vickers said:

unless it actually turns out to be reliable, in which case, that would be very interesting and would have implications that warrant further investigation :S 

It can't, and any worthy data scientist knows it. Why, you may ask? Because the samples you give to the algorithm are inherently flawed, since you are talking about convicted and non-convicted people. We all know that the biggest criminals of all time have never been convicted for their crimes. (For example: some politicians.)

Based on that, it could only classify criminals by asking 'are they the same type of criminal as those guys?'.

The other argument is that... well... there is more than just your face that determines what's inside someone's head. Because of that, you know perfectly well that the algorithm can't really be accurate through anything other than luck or confirmation bias from the actual police, which would convict innocent people. The model would classify people based only on an image of a face, because that is what it learns from, and therefore it can't find the truth accurately.


Great, one step closer to Psycho-Pass.

How can this not lead to discrimination, which ultimately leads to pushing soon-to-be criminals into becoming actual criminals? 


1 hour ago, laminutederire said:

It can't, and any worthy data scientist knows it. Why, you may ask? Because the samples you give to the algorithm are inherently flawed, since you are talking about convicted and non-convicted people. We all know that the biggest criminals of all time have never been convicted for their crimes. (For example: some politicians.)

Based on that, it could only classify criminals by asking 'are they the same type of criminal as those guys?'.

The other argument is that... well... there is more than just your face that determines what's inside someone's head. Because of that, you know perfectly well that the algorithm can't really be accurate through anything other than luck or confirmation bias from the actual police, which would convict innocent people. The model would classify people based only on an image of a face, because that is what it learns from, and therefore it can't find the truth accurately.

You think there can't be facial similarities between politicians and convicted criminals? You've committed premature pessimisation without any factual backing.

 

Also, the algorithm wouldn't be responsible for conviction. It would be responsible for targeting surveillance and police resources. Obviously no one should be arrested or convicted without a crime being committed, but having police presence reduces crime, period.



35 minutes ago, patrickjp93 said:

You think there can't be facial similarities between politicians and convicted criminals? You've committed premature pessimisation without any factual backing.

 

Also, the algorithm wouldn't be responsible for conviction. It would be responsible for targeting surveillance and police resources. Obviously no one should be arrested or convicted without a crime being committed, but having police presence reduces crime, period.

I think I am entitled to be skeptical. First of all, appearances can lie; you can deliberately try to alter them. Besides, a face doesn't tell you everything about a person as a human being. That's the whole point of talking to people.

What's your factual backing that it can be accurate?

And how was it not backed by anything? I specifically said that it only classifies against convicted criminals, which is a strong bias, because the criminals you really want to catch are those who haven't been convicted. It just contributes to putting pressure on the same class of people while forgetting the rest.

What would happen if it were used ubiquitously? You really think it won't be used to unfairly convict people?


10 minutes ago, laminutederire said:

I think I am entitled to be skeptical. First of all, appearances can lie; you can deliberately try to alter them. Besides, a face doesn't tell you everything about a person as a human being. That's the whole point of talking to people.

What's your factual backing that it can be accurate?

And how was it not backed by anything? I specifically said that it only classifies against convicted criminals, which is a strong bias, because the criminals you really want to catch are those who haven't been convicted. It just contributes to putting pressure on the same class of people while forgetting the rest.

What would happen if it were used ubiquitously? You really think it won't be used to unfairly convict people?

True, but most people don't have the money for that.

 

A master's-level education in augmented reality, game theory, statistics, AI, and visualization algorithms.

 

It's not a strong bias. It's not a bias at all actually.

 

You're conflating your ideas of use with the actual intended use. Further, you forget the people who haven't been caught have visual similarities (dead eyes anyone?) with convicted criminals.

 

No it doesn't.

 

No, because having a predisposition to crime isn't even circumstantial evidence that can be used to put forward a prediction in any court in the U.S. I don't know if the backwards Europeans or Asians want to use it that way, but it can't be used that way in the U.S.



16 minutes ago, patrickjp93 said:

True, but most people don't have the money for that.

 

A master's-level education in augmented reality, game theory, statistics, AI, and visualization algorithms.

 

It's not a strong bias. It's not a bias at all actually.

 

You're conflating your ideas of use with the actual intended use. Further, you forget the people who haven't been caught have visual similarities (dead eyes anyone?) with convicted criminals.

 

No it doesn't.

 

No, because having a predisposition to crime isn't even circumstantial evidence that can be used to put forward a prediction in any court in the U.S. I don't know if the backwards Europeans or Asians want to use it that way, but it can't be used that way in the U.S.

You can fake a facial expression if you know what to do. Money isn't a parameter for that.

 

Okay, then why wouldn't my master's-level education count as factual backing?

 

How is it not a bias? You know perfectly well that convicted criminals are only the ones you caught, and/or the ones who couldn't afford to defend themselves in cases of light crimes. That creates a bias against poorer criminals.

 

Are they criminals because of their facial similarities? I don't think so. At best, the similarity is in their education, which can be more alike within an ethnicity or family, both of which can also induce physical resemblance.

 

So you say, with one of the most unfair justice systems? (Bail fees, for instance, make for an unfair process.)


OK, so let's assume this is 100% perfect. There is still the issue of whether you can or should arrest someone who hasn't done anything wrong YET. They haven't broken any laws yet, so they aren't a criminal yet. It's basically the Minority Report issue: can you arrest someone for something they haven't done yet, or even thought of doing yet? I see no practical use for this even if it's 100% perfect.


Next project, AI predicts which dog is the tastiest based on paw size. 

 

You know what, they probably already have that. 


20 hours ago, mvitkun said:

Followed promptly by Minority Report lol.



6 hours ago, patrickjp93 said:

You think there can't be facial similarities between politicians and convicted criminals? You've committed premature pessimisation without any factual backing.

 

Also, the algorithm wouldn't be responsible for conviction. It would be responsible for targeting surveillance and police resources. Obviously no one should be arrested or convicted without a crime being committed, but having police presence reduces crime, period.

Of course there are facial similarities between politicians and convicted criminals. They both have faces.

As the AI is trained by being fed examples of convicted and non-convicted faces, it will learn to exclude the criminals who never get caught.
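To make that label-bias point concrete, here is a hypothetical toy simulation (the traits and numbers are invented for illustration, not taken from the paper): criminality depends on one trait, but only offenders with a second trait ever get convicted, so a model trained on conviction labels learns to score the careful offenders as 'safe'.

```python
# Hypothetical toy model of label bias: the training label is "convicted",
# not "criminal", so uncaught offenders land in the negative class.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 100_000

offends = rng.random(n) < 0.10   # invented trait: actually commits crimes
sloppy  = rng.random(n) < 0.50   # invented trait: offenders who get caught

convicted = offends & sloppy     # only sloppy offenders get the positive label

X = np.column_stack([offends, sloppy]).astype(float)
clf = LogisticRegression(max_iter=1000).fit(X, convicted)

careful_offender = [[1.0, 0.0]]  # a criminal who is never caught
print(clf.predict_proba(careful_offender)[0, 1])  # near zero: looks "safe"
```

Under these assumptions the classifier reproduces the conviction process rather than criminality itself, which is the objection being raised.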

 

Increasing police presence needs to happen in many areas irrespective of whether people have faces that are similar to criminals. Increase surveillance everywhere to deter crime, and assist the police in catching and convicting criminals. If you focus surveillance, crime could move, rather than reduce.

5 hours ago, patrickjp93 said:

You're conflating your ideas of use with the actual intended use. Further, you forget the people who haven't been caught have visual similarities (dead eyes anyone?) with convicted criminals.

 

No it doesn't.

 

No, because having a predisposition to crime isn't even circumstantial evidence that can be used to put forward a prediction in any court in the U.S. I don't know if the backwards Europeans or Asians want to use it that way, but it can't be used that way in the U.S.

The algorithm doesn't assess a predisposition to crime; at best, it assesses a predisposition to getting caught.

 

Too much faith has been put into a statistical pattern-recognition paper "published" as open access on an online e-print archive, which doesn't appear to have any peer-review process. This study has no intended use.


10 hours ago, laminutederire said:

You can fake a facial expression if you know what to do. Money isn't a parameter for that.

 

Okay, then why wouldn't my master's-level education count as factual backing?

How is it not a bias? You know perfectly well that convicted criminals are only the ones you caught, and/or the ones who couldn't afford to defend themselves in cases of light crimes. That creates a bias against poorer criminals.

Are they criminals because of their facial similarities? I don't think so. At best, the similarity is in their education, which can be more alike within an ethnicity or family, both of which can also induce physical resemblance.

So you say, with one of the most unfair justice systems? (Bail fees, for instance, make for an unfair process.)

And you're going to keep the expression up full time? Further, you think algorithms can't see past the expression to the actual jaw and facial structure?!

 

Because you're full of crap as actual data science has already shown.

 

You seem to think white-collar criminals never get convicted. You're the one who's biased.

 

No, and no one ever claimed otherwise.

 

The U.S. justice system is objectively the fairest in the world: innocent until proven guilty; you can get a bail bond (not a fee) and only lose that money if you end up convicted; more appeals than any other justice system; and a standard of reasonable doubt higher than anywhere else. Is it perfect? No, but it is the best the world has.



4 hours ago, DrMikeNZ said:

Of course there are facial similarities between politicians and convicted criminals. They both have faces.

As the AI is trained by being fed examples of convicted and non-convicted faces, it will learn to exclude the criminals who never get caught.

 

Increasing police presence needs to happen in many areas irrespective of whether people have faces that are similar to criminals. Increase surveillance everywhere to deter crime, and assist the police in catching and convicting criminals. If you focus surveillance, crime could move, rather than reduce.

The algorithm doesn't assess a predisposition to crime; at best, it assesses a predisposition to getting caught.

 

Too much faith has been put into a statistical pattern-recognition paper "published" as open access on an online e-print archive, which doesn't appear to have any peer-review process. This study has no intended use.

Red herring fallacy.

 

No it won't. Rare is the criminal who isn't caught, and all crime types have a large sample pool to draw from.

 

Not true. Already lawful communities tend to maintain their standards irrespective of police presence, so surveillance needs to be targeted and focused, not increased everywhere.

 

Not true. It assesses predisposition to commit crime. Actually read the study. Any good data scientist will tell you you're full of crap. READ IT!

 

It's a master's thesis. It did undergo peer review in the ACM and IEEE circles.



2 hours ago, patrickjp93 said:

Not true. It assesses predisposition to commit crime. Actually read the study. Any good data scientist will tell you you're full of crap. READ IT!

 

It's a master's thesis. It did undergo peer review in the ACM and IEEE circles.

I had read it; it was a great laugh. If it had been submitted to either J ACM or IEEE, it would seem odd for it to have been released outside of their journals, unless it was rejected. But I would not be surprised if it was submitted unsuccessfully to both.

I cannot imagine any university that would accept that as a master's thesis.

