Apple is (going to) scan your Apple devices

fUnDaMeNtAl_knobhead

@LinusTech Video about this and how Apple just burned years of their image as a privacy-conscious tech company

 

Don't buy Apple M1 computers with 8GB of RAM

6 hours ago, just_dave said:

@LinusTech Video about this and how Apple just burned years of their image as a privacy-conscious tech company

 

Didn’t know you were a writer for LMG. You wanna also dictate how long the video should be, and the release schedule?

13 minutes ago, saltycaramel said:

Didn’t know you were a writer for LMG. You wanna also dictate how long the video should be, and the release schedule?

I'm not sure what your post is supposed to mean, @saltycaramel

____________________________________________________________________________________________________________________________________

 

 

____________________________________________________________________________________________________________________________________

pythonmegapixel

into tech, public transport and architecture // amateur programmer // youtuber // beginner photographer

Thanks for reading all this by the way!

By the way, my desktop is a docked laptop. Get over it. No, seriously, I have an external monitor, keyboard, mouse, headset, ethernet and cooling fans all connected. Using it feels no different to a desktop, it works for several hours if the power goes out, and disconnecting just a few cables gives me something I can take on the go. There's enough power for all the games I play and it even copes with basic (and some not-so-basic) video editing. Give it a go - you might just love it.

12 minutes ago, pythonmegapixel said:

I'm not sure what your post is supposed to mean, @saltycaramel

 

I was joking about the fact that he spoon-fed the whole “hot take” of the video to LTT. Maybe they have a completely different and more nuanced idea about this?

Since we’re at it, I’m gonna give some one-liner ideas to LMG writers for the video as well

 

1) “What is the sound of an on-device scan that falls in a forest if nobody is listening?” - the output of the on-device scan (Apple calls the results “safety vouchers”) sits there doing nothing, as cryptographically unreadable gibberish, until the pictures are uploaded to iCloud anyway. So the “on-device” scan may as well be a server-side scan for all intents and purposes (see the sketch after this list).

 

2) “Waiter, there’s some Apple in my iPhone, remove it at once!” - some people are just now realising that Apple’s local background processes (not phoning home) are already inside their iPhones, doing all sorts of indexing and deep analysis of their pictures. The humanity!

 

3) “BREAKING NEWS: iOS is not open source! More shocking news after the break” - So theoretically, potentially, they could be pressured by governments into doing anything! Anything could happen! Even bypassing data encryption completely or sending a selfie of you to Xi The Pooh every minute! 
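
Since point 1 is the crux, here is a deliberately tiny Python sketch of the voucher idea, with entirely made-up names; this is not Apple's actual protocol (the real system uses NeuralHash plus private set intersection and threshold secret sharing, and hides even the fact of a single match until a threshold is crossed). The point it shows: a voucher sealed on the device stays unreadable to everyone, Apple included, unless the underlying hash is actually in the known-bad database.

```python
import base64
import hashlib

from cryptography.fernet import Fernet, InvalidToken


def voucher_key(image_hash: bytes) -> bytes:
    # Toy stand-in for the private-set-intersection step: the key sealing
    # a voucher is derived from the image's (perceptual) hash itself.
    return base64.urlsafe_b64encode(hashlib.sha256(image_hash).digest())


def make_voucher(image_hash: bytes, payload: bytes) -> bytes:
    # Runs on the device; the result is opaque ciphertext.
    return Fernet(voucher_key(image_hash)).encrypt(payload)


def server_try_open(voucher, known_bad_hashes):
    # The server only knows the database hashes, so it can only derive keys
    # for those; vouchers for any other image remain gibberish forever.
    for h in known_bad_hashes:
        try:
            return Fernet(voucher_key(h)).decrypt(voucher)
        except InvalidToken:
            continue
    return None


db = [hashlib.sha256(b"known bad image").digest()]
match = make_voucher(db[0], b"match metadata")
benign = make_voucher(hashlib.sha256(b"holiday photo").digest(), b"whatever")
assert server_try_open(match, db) == b"match metadata"
assert server_try_open(benign, db) is None  # unreadable, as if never scanned
```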

So what if it's not open source? Everyone praises Android for being open source while ignoring the fat lump of GApps bolted on top of it that does god knows what, because those aren't open source. So, in the end, it's the same shit.

That was my point.

We already knew it’s not open source and that’s ok.

It was already a black box to us.

 

So suddenly giving the “slippery slope” treatment to any tool or feature “just because reasons” is preposterous.

 

We either trusted Apple already or we didn’t. Not much has changed. The so-called on-device scan would need to be modified heavily (since in its current implementation it’s a cryptographic Fort Knox) to constitute an actual backdoor. We either think Apple has also implemented other stuff under the hood or we don’t. But that was true one week ago too.

 

And we either understand that snowboarding down creepy slopes would pose an existential risk for Apple of being caught red-handed, since actions have consequences, or we don’t.

I think one piece of info most people seem to have forgotten or are unaware of is that this only applies to photos uploaded to iCloud Photo Library, not on-device-only photos like the leaks claimed.

So there isn't a precedent being set for scanning photos that exist only on your personal device.

5 minutes ago, saltycaramel said:

That was my point.

We already knew it’s not open source and that’s ok.

It was already a black box to us.

 

So suddenly giving the “slippery slope” treatment to any tool or feature “just because reasons” is preposterous.

 

We either trusted Apple already or we didn’t. Not much has changed. The so-called on-device scan would need to be modified heavily (since in its current implementation it’s a cryptographic Fort Knox) to constitute an actual backdoor. We either think Apple has also implemented other stuff under the hood or we don’t. But that was true one week ago too.

 

And we either understand that snowboarding down creepy slopes would pose an existential risk for Apple of being caught red-handed, since actions have consequences, or we don’t.

There seems to be an attitude in corporate action lately of “I don’t give two hoots what problems I make unavoidable in the future, as long as I can get paid and get out.” Consequences of such thinking include the structure of the DMCA, housing projects under water, and most pollution. I think this is what a lot of people are railing against; this is just potentially the latest example.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

Pixel 4 and GrapheneOS, do as Edward does.

Workstation Laptop: Dell Precision 7540, Xeon E-2276M, 32gb DDR4, Quadro T2000 GPU, 4k display

Wife's Rig: ASRock B550m Riptide, Ryzen 5 5600X, Sapphire Nitro+ RX 6700 XT, 16gb (2x8) 3600mhz V-Color Skywalker RAM, ARESGAME AGS 850w PSU, 1tb WD Black SN750, 500gb Crucial m.2, DIYPC MA01-G case

My Rig: ASRock B450m Pro4, Ryzen 5 3600, ARESGAME River 5 CPU cooler, EVGA RTX 2060 KO, 16gb (2x8) 3600mhz TeamGroup T-Force RAM, ARESGAME AGV750w PSU, 1tb WD Black SN750 NVMe Win 10 boot drive, 3tb Hitachi 7200 RPM HDD, Fractal Design Focus G Mini custom painted.  

NVIDIA GeForce RTX 2060 video card benchmark result - AMD Ryzen 5 3600,ASRock B450M Pro4 (3dmark.com)

Daughter 1 Rig: ASRock B450 Pro4, Ryzen 7 1700 @ 4.2ghz all core 1.4vCore, AMD R9 Fury X w/ Swiftech KOMODO waterblock, Custom Loop 2x240mm + 1x120mm radiators in push/pull, 16gb (2x8) Patriot Viper CL14 2666mhz RAM, Corsair HX850 PSU, 250gb Samsung 960 EVO NVMe Win 10 boot drive, 500gb Samsung 840 EVO SSD, 512GB TeamGroup MP30 M.2 SATA III SSD, SuperTalent 512gb SATA III SSD, CoolerMaster HAF XM Case.

https://www.3dmark.com/3dm/37004594?

Daughter 2 Rig: ASUS B350-PRIME ATX, Ryzen 7 1700, Sapphire Nitro+ R9 Fury Tri-X, 16gb (2x8) 3200mhz V-Color Skywalker, ANTEC Earthwatts 750w PSU, MasterLiquid Lite 120 AIO cooler in Push/Pull config as rear exhaust, 250gb Samsung 850 Evo SSD, Patriot Burst 240gb SSD, Cougar MX330-X Case

 

An iPhone with iCloud Backup disabled (backups could be subpoenaed) and iCloud Photos disabled (if one is paranoid about this new system in the news) is plenty secure.

2 minutes ago, Tristerin said:

Pixel 4 and GrapheneOS, do as Edward does.

Which Edward?


Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

11 hours ago, just_dave said:

@LinusTech Video about this and how Apple just burned years of their image as a privacy-conscious tech company

Given his history of bad takes from hasty responses, I’m certain that if Linus does choose to make a video on the topic, he’ll want to be damn certain of all his facts (technical and otherwise) before getting the script going. The only decisive thing we’ve really got that’s adverse to Apple is the “slippery slope” argument.

 

Aka, have patience. I too would like to hear his composed thoughts on this, but only when he has all the facts. 

My eyes see the past…

My camera lens sees the present…

Apple’s just confirmed to TechCrunch, for the first time in a completely unequivocal manner, that if iCloud Photos is disabled, NeuralHash won’t be running AT ALL.

 

In previous posts I had compared this to a tree falling in a forest when nobody is listening, but it looks like the tree won’t even fall at all. Even better. There’s basically no reason to consider this an “on-device” scan, privacy-wise. Yes, technically it’s on-device, but only if your data is ON ITS WAY to Apple’s servers anyway.
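
For anyone wondering what “NeuralHash” actually computes: it's a perceptual hash, a fingerprint of what a picture looks like rather than of its exact bytes, so recompressing or resizing an image barely changes it. NeuralHash itself is a neural network; the sketch below uses a much older and simpler stand-in, difference hashing (dHash), purely to show the flavor of hash-based matching against a known list.

```python
from PIL import Image  # pip install Pillow


def dhash(path: str, size: int = 8) -> int:
    """Difference hash: shrink to a tiny grayscale image, then record
    whether each pixel is brighter than its right-hand neighbour."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            bits = (bits << 1) | (px[row * (size + 1) + col]
                                  > px[row * (size + 1) + col + 1])
    return bits


def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")


# A re-saved or resized copy lands a small Hamming distance from the
# original, while unrelated pictures land far away; "matching" means
# distance below some small threshold, not byte equality.
# hamming(dhash("photo.png"), dhash("photo_recompressed.jpg"))  # small
```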


Basically, Apple has devised a way to search for known CSAM that’s EQUIVALENT to the server-side searches every other company does, but without decrypting our data when it’s already on their servers.


They are spared costs (bigger human review teams, etc.) and problems with the law (by keeping that crap off their servers).

 

We are spared having our data decrypted once it’s on their servers unless MULTIPLE pics match the known CSAM hashes, which is basically impossible by accident (Apple claims a 1-in-1-trillion chance of that happening).
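
The “MULTIPLE pics” requirement isn't a pinky promise, it's math: by Apple's own description, the material that unseals matched vouchers is protected with threshold secret sharing, so with fewer matches than the threshold the key mathematically cannot be reconstructed. A minimal sketch of that primitive (Shamir secret sharing over a prime field; toy parameters, not Apple's implementation):

```python
import random

P = 2**127 - 1  # a Mersenne prime; any prime larger than the secret works


def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them recover it,
    while threshold-1 shares reveal nothing about it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    poly = lambda x: sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]


def recover(shares) -> int:
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret


key = 123456789          # stands in for the key protecting matched images
shares = make_shares(key, threshold=5, n=10)  # one share per matched pic
assert recover(shares[:5]) == key  # 5 matches: vouchers become readable
assert recover(shares[:4]) != key  # 4 matches: still locked (with
                                   # overwhelming probability)
```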

 

Win-win.

 

The real jackpot for us would be if the theory that this is in preparation for complete E2EE (end-to-end encryption) of iCloud, every part of it bar none, proves to be true.

 

Coincidentally, with the release of iOS 15, new “call a friend” account recovery options will be rolled out, both for when you’re dead and for when you’re still alive… this would fit with the above theory.

On 8/5/2021 at 4:25 PM, IkeaGnome said:

I'm torn. On one hand: why isn't this already a thing?

On the other hand, this opens a whole can of worms I don't want open. "We already search their photos for abuse, why not add this activity, or that activity?"

 

Edit: I'm more worried about what it COULD turn into: them monitoring more later on.

"We see you have looked at pictures of guns twice in the last week, but you live in a place where they are illegal. Let's notify the authorities" or "We see you searched for 'how to avoid police radar scanners', time to turn you in".

I have an Apple fan buddy who pretty regularly watches To Catch a Predator. Would he be flagged?

#Muricaparrotgang

25 minutes ago, JZStudios said:

I have an Apple fan buddy who pretty regularly watches To Catch a Predator. Would he be flagged?

Weird bugs happen, but if he were flagged, so would many thousands of others at the same time. The impression I get is that he would have to save episodes to iCloud for it to even be possible, and it still might not be. Such shows generally do not show actual pornography, but they might show a non-pornographic image of a victim or former victim, which could conceivably still be in the database (that would come down to the contents of the database; I don’t know what is in it).

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

On 8/12/2021 at 5:33 AM, Bombastinator said:

Weird bugs happen, but if he were flagged, so would many thousands of others at the same time. The impression I get is that he would have to save episodes to iCloud for it to even be possible, and it still might not be. Such shows generally do not show actual pornography, but they might show a non-pornographic image of a victim or former victim, which could conceivably still be in the database (that would come down to the contents of the database; I don’t know what is in it).

Yeah, you would need to be taking screenshots/screen recordings and uploading them to iCloud Photos (the scanning takes place as part of the upload code; the phone does not scan images sitting on the phone, just images while they are being prepared for upload), and the tag is attached to the image/video, so if you're not uploading an image/video it is impossible to tag anything (it can't tag items that are not uploaded, since there is no endpoint for the phone to send the tag to; the tag must be sent as part of the offending image's upload to iCloud Photos).
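
To sketch that architectural point in a few lines of Python (entirely hypothetical names, not Apple's code): the voucher is created inside the upload routine and travels with the upload, so a photo that never takes that code path never produces one.

```python
# Hypothetical sketch of the claim above, not Apple's actual code.

local_library = []   # on-device storage
icloud_photos = []   # stands in for Apple's servers


def make_safety_voucher(image: bytes) -> bytes:
    # Placeholder for the NeuralHash + sealed-voucher step.
    return b"sealed:" + image


def save_photo(image: bytes) -> None:
    # Plain on-device save: no hash is computed, no voucher exists anywhere.
    local_library.append(image)


def upload_to_icloud_photos(image: bytes) -> None:
    # The scan lives only in the upload pre-processor, and the voucher's
    # only "endpoint" is riding along with the upload itself.
    voucher = make_safety_voucher(image)
    icloud_photos.append((image, voucher))


save_photo(b"stays on device")             # never scanned, never tagged
upload_to_icloud_photos(b"synced photo")   # tagged only on its way out
```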
 

On 8/11/2021 at 9:29 AM, saltycaramel said:

Basically, Apple has devised a way to search for known CSAM that’s EQUIVALENT to the server-side searches every other company does, but without decrypting our data when it’s already on their servers.

 

The key reason for doing this client-side is that it makes it much easier for Apple to say `no` if law enforcement requests that they scan for other content. Since client-side changes require code changes on users' devices, Apple can use the same legal defence they used in the past. Doing this client-side results in a much less slippery slope than doing it server-side.


 

 

On 8/9/2021 at 9:24 PM, Bombastinator said:

The answers to questions 4 and 5 seem like hedging and assumptions about what Apple will or will not be able to foresee or resist.

What you need to remember is that any change to how this works would require a client-side code change, and that means a software update; this is not something that can be changed through a server-side toggle. Any update down the road that makes changes would be easy for third-party researchers to detect and would become public knowledge very soon. This is why doing this scanning client-side is much better than server-side, where changes (forced or otherwise) would be hidden from third-party observers.

Legally, Apple is in just the same position after this change as before when it comes to not shipping client-side code that scans devices; it all comes down to the government's ability to compel Apple to write new code and sign it (since any change would require new code). So I don't see why adding this type of scanning (to the upload pre-processor of iCloud Photos, not the photo library) changes Apple's legal defence when it comes to not being forced to ship other client-side scanning solutions. Compared to server-side scanning this solution is much better as well, since you need at least 20 matched images before any of them become decryptable (there is some fancy math involved here); that beats any server-side scanning solution, where as soon as a single image gets flagged (possibly a false positive) there is a legal obligation to report. Requiring a threshold number of images before the crypto lock is released makes this a much more private solution than any other cloud photo storage.

 

3 minutes ago, hishnash said:

Yeah, you would need to be taking screenshots/screen recordings and uploading them to iCloud Photos (the scanning takes place as part of the upload code; the phone does not scan images sitting on the phone, just images while they are being prepared for upload), and the tag is attached to the image/video, so if you're not uploading an image/video it is impossible to tag anything (it can't tag items that are not uploaded, since there is no endpoint for the phone to send the tag to; the tag must be sent as part of the offending image's upload to iCloud Photos).
 

The key reason for doing this client-side is that it makes it much easier for Apple to say `no` if law enforcement requests that they scan for other content. Since client-side changes require code changes on users' devices, Apple can use the same legal defence they used in the past. Doing this client-side results in a much less slippery slope than doing it server-side.


 

 

What you need to remember is that any change to how this works would require a client-side code change, and that means a software update; this is not something that can be changed through a server-side toggle. Any update down the road that makes changes would be easy for third-party researchers to detect and would become public knowledge very soon. This is why doing this scanning client-side is much better than server-side, where changes (forced or otherwise) would be hidden from third-party observers.

Legally, Apple is in just the same position after this change as before when it comes to not shipping client-side code that scans devices; it all comes down to the government's ability to compel Apple to write new code and sign it (since any change would require new code). So I don't see why adding this type of scanning (to the upload pre-processor of iCloud Photos, not the photo library) changes Apple's legal defence when it comes to not being forced to ship other client-side scanning solutions. Compared to server-side scanning this solution is much better as well, since you need at least 20 matched images before any of them become decryptable (there is some fancy math involved here); that beats any server-side scanning solution, where as soon as a single image gets flagged (possibly a false positive) there is a legal obligation to report. Requiring a threshold number of images before the crypto lock is released makes this a much more private solution than any other cloud photo storage.

 

I don’t even remember what these were about. In any event, Apple apparently released further clarification after these posts were made, so they’re too old to be relevant.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.
