
Apple is (going to) scan your Apple devices

fUnDaMeNtAl_knobhead
1 hour ago, Bombastinator said:

Doesn’t make it right though.  A good hard look needs to be taken at AI procedural legal actions.  Too many mistakes are made.

It's extremely improbable that you'll end up with a hash that matches illegal content by accident. Like... nearly impossible. If you get caught up with something that gets flagged, there's a 99.999999999% chance that you're doing something fucked up, IF this is indeed a hash-based system.

 

That said, I also imagine there would be more confirmation in the algorithm. Is the image the same dimensions? Same color space? Etc. When you factor those variables in on top of a matching hash, you have a match that is a hair away from being statistically absolute.
 

Again though, this is if they’re using hash values.
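
For the curious, here's a minimal Python sketch of what a naive, exact-match hash check would look like, and why an accidental hit is astronomically unlikely. The "known bad" digest and the photo bytes are made up for illustration; this is not Apple's system, just the principle.

```python
# Illustrative only: exact cryptographic hash matching against a
# hypothetical database of known-bad SHA-256 digests.
import hashlib

KNOWN_BAD_HASHES = {
    # hypothetical entry (this happens to be the digest of empty input)
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

photo = b"...photo bytes..."
print(sha256_hex(photo) in KNOWN_BAD_HASHES)  # flag only on an exact match

# Why accidental flags are "nearly impossible": flip a single bit and the
# digest is completely unrelated. A random collision with any one 256-bit
# digest has probability ~2**-256.
tweaked = bytes([photo[0] ^ 1]) + photo[1:]
print(sha256_hex(photo))
print(sha256_hex(tweaked))
```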

MacBook Pro 16 i9-9980HK - Radeon Pro 5500m 8GB - 32GB DDR4 - 2TB NVME

iPhone 12 Mini / Sony WH-1000XM4 / Bose Companion 20


3 hours ago, linux fanboy said:

Apple said the system is automated and is "designed with user privacy in mind," with the system performing scans on the device before images are backed up to iCloud.

I know this is supposedly for a good purpose... but I'm not loving how this is phrased, or how it's stated to work. It's ripe for abuse, and I can see that happening.

How can it be designed for privacy when they blatantly violate privacy to do it? Oxymorons like "military intelligence" come to mind with this kind of stuff.

And it doesn't specify what images: it could be any and all images being backed up to the cloud, whether or not you've been tagged as having something bad on your phone.

"If you ever need anything please don't hesitate to ask someone else first"..... Nirvana
"Whadda ya mean I ain't kind? Just not your kind"..... Megadeth
Speaking of things being "All Inclusive", Hell itself is too.

 


Just now, Caroline said:

They're probably doing it already, just not telling anyone, ever since "smart" crap devices came out. No wonder they're pushing so hard to convince everyone that's the way to go.

 

Corps and agencies probably know when any Android user takes a dump and how big it is. Fitness bands, smartwatches, and IoT BS are becoming widespread just because they're cheaper to manufacture than they were 5 years ago.

 

I wouldn't even talk about what I'm eating tonight over a smartphone, and yet there are actual criminals taking pics and giving out all the details of their crimes on iPhones? Just... dude...

 

If they did, and it could be proved, shit would hit the fan. People have done data retrievals from various companies under some law. One of them was televised by the guy who did "Super Size Me". He got inches of paper from some companies, but from Apple only a few sheets, which took even longer to get and had only data pertaining to the workings of the actual device.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


35 minutes ago, Bombastinator said:

Yeah. I didn't say it was easy. Parenting has famously long been the hardest job anyone ever does. I'm not arguing against being a good parent; I'm just pointing out that what looks like an easy choice may be less simple than it looks.

Agree to disagree. It's an easy choice; the safety of my kids is an easy call. Just because some don't make the right choice doesn't mean it's a complicated one.

Insanity is not the absence of sanity, but the willingness to ignore it for a purpose. Chaos is the result of this choice. I relish in both.


What does Apple do with your iPhone's data after the AI decides you might be involved in child abuse? Brick the phone, read all your data, hand it to the police?

I have an ASUS G14 2021 with Manjaro KDE and I am a professional Linux NoOB and also pretty bad at General Computing.

 

ALSO I DON'T EDIT MY POSTS* NOWADAYS SO NO NEED TO REFRESH BEFORE REPLYING *unless I edit my post


2 hours ago, Bombastinator said:

That is the big question: how open is this to abuse? We don't know. There has been stuff presented in this thread about how abuse could be avoided. Revealing too many details could let pedos use those limits to evade detection, though some details could safely be shared. For example, if they're scanning for known material only by checksum, simply adding a bit to a file would change the checksum and make the scan less effective. They could state they're scanning by checksum but actually do something more elaborate yet still limited, which could still catch people who try that. This announcement has no such details, which makes it seem like a potential problem. I don't know if that's simple omission or what is actually going on.

 

The scary bit is if they record data for things besides kiddie porn, or if they use some sort of visual recognition system that creates false positives.

Therein lies the eternal question: how much intrusion is too much? I agree that going after CP is an absolutely legitimate purpose. But, quite frankly, what pictures I have on my phone is no one's business but mine. We've seen all too often, particularly in the States, how privacy has constantly been chipped away at and undermined by the ever-ambiguous "it's for national security," and we've seen how law enforcement and governments love to use smartphones to target and silence journalists, dissenters, opposition leaders and members, and so on.

 

See, this is why we can't have nice things: assholes, like pedophiles, will always ruin it for the rest of us. Same thing with Bitcoin, come to think of it, and the internet in general.

System Specs: Second-class potato, slightly mouldy


2 hours ago, NotTheFirstDaniel said:

If my interpretation of all the press coverage on this is correct, then the people in this thread are wildly exaggerating what Apple is doing here. But correct me if I'm wrong.

 

Apple is not uploading all of your local photos to Apple servers to identify whether a picture contains potential child abuse. They are "scanning" iCloud Photos (something that isn't even E2E encrypted) on device, by matching hashes of a photo that's about to be uploaded to iCloud Photos against trusted databases that probably contain millions of these hashes of known CSAM. This is not an AI that is going to scan each individual pixel of a photo; it's only comparing hashes (for now, at least). The wording Apple uses makes me believe this will be for iCloud Photos only, so if you don't use iCloud Photos, nothing will change.

 

This is similar to what other companies, like Microsoft, Facebook, and Twitter, have been doing for years. Hell, Google's been doing it, apparently, since 2008.

 

No, this is not an infringement of your "constitutional rights". You signed a license stating you were OK with giving Apple control of the data you upload to iCloud when you signed up for the service. This is the same reason why the FBI can subpoena Apple for iCloud Photos and Backups. It's technically not on "your property" anymore.

 

Now personally, I'm not one to use the "slippery slope" trope, but when Apple comes out and says that this system is going to "grow and evolve," you're forced to wonder a bit. I'm not suggesting a "big brother" scenario, but what if they go after other illegal things, such as possession of pirated content? In my opinion, viewing CSAM is much, much more morally wrong than pirating a movie from 20th Century Fox or a music video from Sony Entertainment, huge multi-billion-dollar companies who would never even notice 1-5% of their viewers pirating content.

 

And with all of this said, at the end of the day, this hurts their privacy stance. Why should I trust Apple and iCloud Photos when they just set a precedent that they are willing to look through your iCloud? That there's a chance (a small chance, but a chance) that you could be done in while being completely innocent? At the end of the day, it just shows that A) most "privacy driven" companies are privacy-driven only up to a point, and B) the best cloud is the cloud you can make at home, especially if you're paying, what, $10/month for a cloud service? By the end of the year, you're already halfway toward an easy NAS setup. Just do it yourself.

I got that particular bit about iCloud from reading the article. 
 

Generally speaking, with data I do not wish for others to have (especially if it has the potential to land me a prison sentence), I do not trust cloud providers, flat out. If I must have a backup, however, I'll encrypt said data before uploading it to a cloud service. Thus, even with a backdoor key for the storage itself, the cloud vendor only gets an encrypted blob. They're free to try to brute-force it if they really feel like it.
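
For anyone who wants to follow the same approach, here's a minimal sketch of encrypt-before-upload using the third-party Python `cryptography` package. The file names are placeholders; the point is that whatever you hand the cloud vendor is already an opaque blob.

```python
# Client-side encryption before a cloud upload: the provider only ever
# sees ciphertext. Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()           # keep this key local; never upload it
cipher = Fernet(key)

with open("backup.tar", "rb") as fh:
    blob = cipher.encrypt(fh.read())  # authenticated encryption (AES-CBC + HMAC)

with open("backup.tar.enc", "wb") as fh:
    fh.write(blob)                    # upload this blob, nothing else

# Later, on your own machine:
# plaintext = Fernet(key).decrypt(blob)
```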
 

The other tidbit that draws attention is where it mentions that photos/hashes are scanned on device before the upload. Even if this only occurs when using iCloud, depending on the implementation (does it scan the source file directly, or is an unencrypted copy made and then scanned?), this toes the line uncomfortably closely. If the content scanning were only carried out on the iCloud servers, I'd have no objection.

My eyes see the past…

My camera lens sees the present…


I feel this 9to5Mac article covers the privacy and security failures that will occur if Apple implements this:

https://9to5mac.com/2021/08/05/scanning-for-child-abuse-images/

 

I understand the intention behind such a system, but it's impossible to implement without false positives and without being hypocritical about user consent and privacy. As far as I'm concerned, even one false positive is too many, and people's lives could be ruined because an algorithm decided to flag an innocent person.

 

Oh, and another thing: the whole iMessage feature scanning incoming photos to warn if they are sexually explicit is peak puritanism. Like, I get it, unsolicited lewds are not cool, but you should be able to opt out. Nothing kills the mood with your SO more than constant warnings about explicit content or having to manually allow a photo through.

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

1 hour ago, linux fanboy said:

What does Apple do with your iPhone's data after the AI decides you might be involved in child abuse? Brick the phone, read all your data, hand it to the police?

And that's the main concern. False positives when dealing with matters like this are extremely dangerous to users. Any precautions or manual review inherently break user privacy and fly in the face of "What happens on iPhone stays on iPhone," a phrase Apple is very proud of. It appears to be meaningless.

Laptop: 2019 16" MacBook Pro i7, 512GB, 5300M 4GB, 16GB DDR4 | Phone: iPhone 13 Pro Max 128GB | Wearables: Apple Watch SE | Car: 2007 Ford Taurus SE | CPU: R7 5700X | Mobo: ASRock B450M Pro4 | RAM: 32GB 3200 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 11 | Storage: 1TB Crucial P3 NVME SSD, 1TB PNY CS900, & 4TB WD Blue HDD | PSU: Be Quiet! Pure Power 11 600W | Display: LG 27GL83A-B 1440p @ 144Hz, Dell S2719DGF 1440p @144Hz | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G305 | Audio: Audio Technica ATH-M50X & Blue Snowball | Server: 2018 Core i3 Mac mini, 128GB SSD, Intel UHD 630, 16GB DDR4 | Storage: OWC Mercury Elite Pro Quad (6TB WD Blue HDD, 12TB Seagate Barracuda, 1TB Crucial SSD, 2TB Seagate Barracuda HDD)

7 minutes ago, DrMacintosh said:

And that's the main concern. False positives when dealing with matters like this are extremely dangerous to users. Any precautions or manual review inherently break user privacy and fly in the face of "What happens on iPhone stays on iPhone," a phrase Apple is very proud of. It appears to be meaningless.

Which implies there is something off with all of this.  I don’t know what.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


18 minutes ago, DrMacintosh said:

Oh, and another thing: the whole iMessage feature scanning incoming photos to warn if they are sexually explicit is peak puritanism. Like, I get it, unsolicited lewds are not cool, but you should be able to opt out. Nothing kills the mood with your SO more than constant warnings about explicit content or having to manually allow a photo through.

I'm not into sexting, but some people are, and more often than not it's between two or more consenting adults having fun. I don't think it's right for Apple to be the hypocritical puritan who tells me what photos I can and cannot send.

There is more that meets the eye
I see the soul that is inside

 

 


5 hours ago, MageTank said:

Man, imagine you send a picture of your dongle to a significant other and Apple flags it as child abuse. Talk about destroying one's ego, lol.

 

I get where people are coming from about the privacy concerns, but if this is done entirely against a remote database, is fully automated, isn't saving results that aren't matches, and no human has access to it, I'd probably be fine with it personally.

 

Those are some pretty big IF's...

 

I am not entirely sure this is the point, though. If it's flagging what is taken and sent outbound, it could potentially stop minors who are being groomed into sending pictures from these devices to people online through various apps. If it alerts the children's parents, that could allow them to intervene before further harm takes place. I don't envy parents having to take care of their kids in a digital world; it sounds like an utter nightmare, honestly.

 

That said, I do agree with your cynicism about Apple's potential for false altruism here. It seems odd to me that they went from a hard stance of "We refuse to give the FBI access to our devices, even for criminal investigations!" to "We totally want to stop crimes and stuff, but we really have to scan your device to do it. We care about stopping crime, you guys, we are super serious about it." I guess Apple is only fine going all-in as long as they hold the winning hand at all times?

The first thought that comes to mind is that this is gonna kill minors sending nudes to each other. I always thought that was a conundrum in itself: when you're young and sending pics to your significant other but you both aren't 18 yet, you probably don't realize you can get in serious trouble just for having those photos. I imagine this is probably what it will most commonly stop, and that's probably a good thing regardless. I am still uncertain about the whole scanning thing, though. It seems pretty invasive to me and somewhat abusable. You're giving up privacy in the name of security, but the question becomes: won't they simply keep asking you to give up more privacy later for some other reason that might not be nearly as justified?


6 hours ago, linux fanboy said:

Summary

Apple is planning to scan your phone, or more accurately your photos and messages, for content related to child abuse, and to notify the user or send it to an Apple employee who will take action.

 

My thoughts

This technology could be helpful, but it raises privacy concerns because Apple is essentially reading everything you send, look at, or read, and it also reduces your phone's performance. I think Apple should make this mandatory for one month or some such period, so it can determine whether a person is likely to abuse a child; then keep it mandatory on their iPhone, or make it optional if they show no signs of child abuse?

 

 

Sources

https://www.cnet.com/tech/services-and-software/apples-plan-to-scan-phones-for-child-abuse-worries-privacy-advocates/

What actually surprises me is that NeuralHash can take distorted images, match them against the hashed file, and add them to the db. AFAIK, implementing a hashing mechanism that can look at the various objects in an image, hash them separately with all their metadata, and combine them into a single hash is no mean feat. (To be clear, running an image match is easy, but doing it with hashes is something pretty new.)
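
To make "hashing that survives distortion" concrete, here's a minimal Python sketch of a classic perceptual hash (a simple average hash, using Pillow). To be clear, this is not NeuralHash, which uses a neural network to build its descriptor; it only shows why resizing or recompressing an image barely moves this kind of fingerprint, unlike a cryptographic hash. File names are made up.

```python
# A toy perceptual hash: 8x8 grayscale thumbnail, one bit per pixel
# (above/below mean brightness). Requires: pip install Pillow
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)  # 64-bit fingerprint for size=8
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")         # number of differing bits

# The same photo re-saved as a JPEG should land within a few bits:
# hamming(average_hash("a.png"), average_hash("a_resaved.jpg"))  # small
```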

 

Secondly, I am concerned about false positives. Apple has done some pretty shitty software implementations in the past (Apple Maps), and I would love to see how they would create a huge dataset for this stuff. Maybe an intern has to look at the fucked-up images all day? Probably not.

 


While I am against these disgusting activities, I really don't think this is the way to go about it. Apple is a corporate entity, after all, and at any time these mechanisms can be weaponised. It is like someone putting a gun to your head and saying, "Trust me, I won't shoot, unless you do this sick stuff." It also gives Apple some of the powers of a moral guardian. What is stopping them from changing their stance tomorrow and saying you can't share images on some topic because it is against their values? Apple may be harping on about privacy today, but what is stopping them from becoming another Google? When it comes to privacy, Big Tech doesn't deserve the benefit of the doubt, even Apple (even though they have a much better track record than the others).

 

On the other hand, stopping these disgusting, sick, and degenerate activities is very important as well. I don't have much stake in this matter, since I don't have an iPhone, I use Google services, and I don't do any of this fucked-up shit.


14 minutes ago, WolframaticAlpha said:

What actually surprises me is that NeuralHash can take distorted images, match them against the hashed file, and add them to the db. AFAIK, implementing a hashing mechanism that can look at the various objects in an image, hash them separately with all their metadata, and combine them into a single hash is no mean feat. (To be clear, running an image match is easy, but doing it with hashes is something pretty new.)

 

Secondly, I am concerned about false positives. Apple has done some pretty shitty software implementations in the past (Apple Maps), and I would love to see how they would create a huge dataset for this stuff. Maybe an intern has to look at the fucked-up images all day? Probably not.

 

Apple Maps seems to work as well as or better than Google Maps in my area... now. There was a time when it briefly did all manner of weird things. I thought that was due to Google messing with things. It was a weird temporary phase though: usable, complete garbage, then usable again. That doesn't speak to code quality, though. If I wanted to pick a bizarre Apple coding choice, I would pick how the OS for 68k Macs was written in a training language (Pascal). So much "WHYY" there.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


3 minutes ago, Bombastinator said:

Apple Maps seems to work as well as or better than Google Maps in my area... now. There was a time when it briefly did all manner of weird things. I thought that was due to Google messing with things. It was a weird temporary phase though: usable, complete garbage, then usable again.

Doesn't matter. With such a system, if you can't eliminate false positives, then you probably shouldn't do it, because you will really be messing up someone's life.

 

And it is virtually impossible to remove false positives.

 

 

On a separate note, don't attack people who are against this move by calling them child molesters. Just because someone likes their privacy doesn't mean they support child molesting.


4 minutes ago, WolframaticAlpha said:

Doesn't matter. With such a system, if you can't eliminate false positives, then you probably shouldn't do it, because you will really be messing up someone's life.

 

And it is virtually impossible to remove false positives.

I dunno how it’s written.  I just follow the blue line.  Seems to work.  I’m home.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


[image: "Won't somebody please think of the children" meme]

 

As a general rule I dismiss the slippery-slope argument outright; however, in this case I do think it's a valid stance.

 

This is how it begins: they go after the most hateful crimes and frame the tech as a way of helping parents out. Once it's established, it slowly expands and encroaches on more and more aspects of our lives until we reach the point where our phones are acting as an enforcement tool for a business operating as a private police force.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


10 minutes ago, WolframaticAlpha said:

While I am against these disgusting activities, I really don't think this is the way to go about it. Apple is a corporate entity, after all, and at any time these mechanisms can be weaponised. It is like someone putting a gun to your head and saying, "Trust me, I won't shoot, unless you do this sick stuff." It also gives Apple some of the powers of a moral guardian. What is stopping them from changing their stance tomorrow and saying you can't share images on some topic because it is against their values? Apple may be harping on about privacy today, but what is stopping them from becoming another Google? When it comes to privacy, Big Tech doesn't deserve the benefit of the doubt, even Apple (even though they have a much better track record than the others).

 

On the other hand, stopping these disgusting, sick, and degenerate activities is very important as well. I don't have much stake in this matter, since I don't have an iPhone, I use Google services, and I don't do any of this fucked-up shit.

There is all this argument that Apple is worse than Google because of this thing. There are two problems with that. Apple may not do this the way the article states, because critical data is missing, but apparently Google DOES do this already and has for years. So the worst this can be is "Apple is just as bad as Google now." It would be incredibly foolish of them to do this. Apple HAS done foolish things before, which makes this harder to pooh-pooh, but there's something off about this whole thing. I want more data before I jump to conclusions.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


People keep saying "it's just an image hash," so "think of the children!" No, no it's not. Static file hashes would change entirely if you just resized the photo by one pixel of width, or recompressed it. We're talking image "height maps," so to speak: an image fingerprint that can technically be reconstructed back into a rough approximation of the image, because image matching is then done by similarity percentage. You can recompress, resize, and flip the image all you want; its "height map" doesn't change.

 

Imagine the shape of an actual map with height mapping applied. You know, those circular lines on maps that show hills and mountains? That's how they fingerprint photos to be immune to resizing, recompressing, or a change of image format. Just as you could recognize a location from a resized or recompressed map based on its height map, you can do the same with other things this way. And it sets a terrible precedent for privacy. And this isn't just what you upload to iCloud: it's all done on your phone, and because they need to match it, they need to send these fingerprints (height maps) to their servers. And since they can't have such imagery on their servers, they'll either be matching them against pre-stored fingerprints, or they'll be reconstructing images from these fingerprints and then deciding whether the result resembles questionable content involving children. Imagine it like reconstructing poor-quality black-and-white photos from that data; they'd still be getting back shapes they shouldn't be getting, because PRIVACY.
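
To put the similarity-percentage matching into code, here's a hedged sketch of threshold matching over 64-bit fingerprints (like the toy average hash posted earlier in the thread). The 90% threshold and every name here are invented for illustration; Apple hasn't published its parameters.

```python
# Fingerprint matching by similarity percentage, not exact equality.
FINGERPRINT_BITS = 64
MATCH_THRESHOLD = 0.90  # hypothetical: flag when >= 90% of bits agree

def similarity(a: int, b: int) -> float:
    differing = bin(a ^ b).count("1")
    return 1.0 - differing / FINGERPRINT_BITS

def is_match(candidate: int, known_fingerprints: list[int]) -> bool:
    return any(similarity(candidate, k) >= MATCH_THRESHOLD
               for k in known_fingerprints)

# A resized/recompressed copy keeps almost every bit (similarity near 1.0);
# an unrelated photo agrees on ~50% of bits by chance, far below threshold.
```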

 

This shit just can't go through, otherwise Apple's entire privacy shtick is a big fat lie, and whoever tries to flag people who are against this privacy invasion as pedo enablers or defenders is a freaking moron. But this is exactly why they are doing it: because "think of the children," and no one will question it or dare to question it, because everyone will look at them weird if they do. To make things worse, Apple already needs to "participate" in PRISM. Now they'll be feeding PRISM with image approximations of everything people snap from hundreds of millions of iPhones.

 

This crap NEEDS to be stopped. But I know for a fact it won't be; it'll go through, no one will question it after a few months, and it'll become the new norm.


While I am all for trying to catch and prosecute these kinds of behaviors... it really comes down to what the expectation of privacy is (and the slippery-slope argument).

 

There have already been cases of people going to court for possession, only for the courts to find out later that the images were of an adult film actress who happened to look underage. There is also the potential for abuse: an invalid hash being put into the system either to target someone specifically (since now you have probable cause to search their phone) or accidentally (like the above case).

 

6 minutes ago, Master Disaster said:

This is how it begins: they go after the most hateful crimes and frame the tech as a way of helping parents out. Once it's established, it slowly expands and encroaches on more and more aspects of our lives until we reach the point where our phones are acting as an enforcement tool for a business operating as a private police force.

The way I see it, today it's looking for the hash; then, to stop cyber-terrorists, it's finding the hashes of exploit source code (on computers); once that's proven effective, Apple might as well expand it to find "illegal" music, and then expand the hashing to read texts.

3735928559 - Beware of the dead beef


4 minutes ago, RejZoR said:

People keep saying "it's just an image hash," so "think of the children!" No, no it's not. Static file hashes would change entirely if you just resized the photo by one pixel of width, or recompressed it. We're talking image "height maps," so to speak: an image fingerprint that can technically be reconstructed back into a rough approximation of the image, because image matching is then done by similarity percentage. You can recompress, resize, and flip the image all you want; its "height map" doesn't change.

 

Imagine the shape of an actual map with height mapping applied. You know, those circular lines on maps that show hills and mountains? That's how they fingerprint photos to be immune to resizing, recompressing, or a change of image format. Just as you could recognize a location from a resized or recompressed map based on its height map, you can do the same with other things this way. And it sets a terrible precedent for privacy. And this isn't just what you upload to iCloud: it's all done on your phone, and because they need to match it, they need to send these fingerprints (height maps) to their servers. And since they can't have such imagery on their servers, they'll either be matching them against pre-stored fingerprints, or they'll be reconstructing images from these fingerprints and then deciding whether the result resembles questionable content involving children. Imagine it like reconstructing poor-quality black-and-white photos from that data; they'd still be getting back shapes they shouldn't be getting, because PRIVACY.

 

This shit just can't go through, otherwise Apple's entire privacy shtick is a big fat lie, and whoever tries to flag people who are against this privacy invasion as pedo enablers or defenders is a freaking moron. But this is exactly why they are doing it: because "think of the children," and no one will question it or dare to question it, because everyone will look at them weird if they do. To make things worse, Apple already needs to "participate" in PRISM. Now they'll be feeding PRISM with image approximations of everything people snap from hundreds of millions of iPhones.

 

This crap NEEDS to be stopped. But I know for a fact it won't be; it'll go through, no one will question it after a few months, and it'll become the new norm.

The question, though, is what they DO with these fingerprints, WHY they are doing it, and what data they keep. This is sort of the same problem as the British closed-circuit police camera system, which is tolerable only because police need a warrant to access it. It is a worrisome claim. So worrisome, and so counter to previous claims, that it makes me wonder. Corporations don't have the limits placed on them that law enforcement does. This has so much potential blowback in it that I wonder if something has been missed.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


8 minutes ago, RejZoR said:

People keep saying "it's just an image hash," so "think of the children!" No, no it's not. Static file hashes would change entirely if you just resized the photo by one pixel of width, or recompressed it. We're talking image "height maps," so to speak: an image fingerprint that can technically be reconstructed back into a rough approximation of the image, because image matching is then done by similarity percentage. You can recompress, resize, and flip the image all you want; its "height map" doesn't change.

 

Imagine the shape of an actual map with height mapping applied. You know, those circular lines on maps that show hills and mountains? That's how they fingerprint photos to be immune to resizing, recompressing, or a change of image format. Just as you could recognize a location from a resized or recompressed map based on its height map, you can do the same with other things this way. And it sets a terrible precedent for privacy. And this isn't just what you upload to iCloud: it's all done on your phone, and because they need to match it, they need to send these fingerprints (height maps) to their servers. And since they can't have such imagery on their servers, they'll either be matching them against pre-stored fingerprints, or they'll be reconstructing images from these fingerprints and then deciding whether the result resembles questionable content involving children. Imagine it like reconstructing poor-quality black-and-white photos from that data; they'd still be getting back shapes they shouldn't be getting, because PRIVACY.

Hashes? Yeah, no, this system is not gonna be using hashes.

 

Apple are framing it as a tool to help parents stop their kids from sending stuff to strangers on the internet. There will be no hash available for a photo someone has literally just taken, so one of its main marketed features simply would not work.

 

This is an AI analysing your data and, as we all know, AIs are infallible and understand context. /s

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Things that can be hashed: Basically everything.
Things that change the hash of an image: Basically everything. 

 

Just saving it in a different format, cutting off one pixel row at the bottom, or changing the white balance breaks that algorithm, if it works the way I think it does.
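
You can see that fragility in a few lines: re-encode the same picture and the cryptographic digest shares nothing with the original. A quick sketch (Pillow plus hashlib; the file names are made up):

```python
# Same picture, different bytes: re-encoding wrecks a plain file hash.
import hashlib
from PIL import Image

Image.open("photo.png").convert("RGB").save("photo_copy.jpg", quality=95)

for path in ("photo.png", "photo_copy.jpg"):
    with open(path, "rb") as fh:
        print(path, hashlib.sha256(fh.read()).hexdigest())
# The two digests are completely unrelated even though the images look
# identical, which is why naive file hashing is trivial to evade.
```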

 

I'm very much "pro" the fight against CP, but that system is too easy to circumvent and too easy to apply to different files. Find illegally downloaded movies or songs? Find images that are shared often in far-right or far-left communities? Even if Apple has a track record of not getting political and keeping privacy high, that technology gives off some weird vibes.


1 minute ago, Laborant said:

Things that can be hashed: Basically everything.
Things that change the hash of an image: Basically everything.

For hashing to be effective it needs an underlying database, which is great for existing known content, but it's pretty useless against kids snapping pics of themselves or adults who are actively committing abuse.

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)

