
Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to Advanced Data Protection in iCloud, Apple has announced that it has abandoned its plans to scan your photos for known CSAM. Apple provided this statement:

 

Quote

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

 

My thoughts

Back in 2021, when Apple announced its intention to use this hashing tool to check photos against known CSAM, people were furious. Despite Apple's attempts to show that the service was "secure enough" and had privacy in mind (despite its end goal of scanning every photo and document you have for potentially incriminating material), there was no good way to spin it. The feature was lambasted by security professionals, politicians, and users around the globe. I'm happy that Apple has decided not to push this feature forward and has instead doubled down on privacy, even though adding full encryption puts more responsibility on the user and abandoning this feature means potential criminals will have to be caught outside of iCloud.

 

Sources

https://www.macrumors.com/2022/12/07/apple-abandons-icloud-csam-detection/


Good, but the damage of proposing it has already been done. Apple will forever be known as having wanted to implement it and only backing down when faced with incredible opposition.

 

Also, we can only take Apple at their word that they have backed down, because iOS isn't open source. So I'll believe it when an independent auditor is able to view iOS's source code and verify that it's gone.



6 hours ago, DrMacintosh said:

In addition to Advanced Data Protection in iCloud, Apple has announced that it has abandoned its plans to scan your photos for known CSAM. Apple provided this statement:

 

 

My thoughts

Back in 2021, when Apple announced its intention to use this hashing tool to check photos against known CSAM, people were furious. Despite Apple's attempts to show that the service was "secure enough" and had privacy in mind (despite its end goal of scanning every photo and document you have for potentially incriminating material), there was no good way to spin it. The feature was lambasted by security professionals, politicians, and users around the globe. I'm happy that Apple has decided not to push this feature forward and has instead doubled down on privacy, even though adding full encryption puts more responsibility on the user and abandoning this feature means potential criminals will have to be caught outside of iCloud.

 

Sources

https://www.macrumors.com/2022/12/07/apple-abandons-icloud-csam-detection/

What will be very funny is how the people who supported Apple on this earlier are going to spin it now.


6 hours ago, AluminiumTech said:

Good but the damage of proposing it has already been done. Apple will forever be known as having wanted to implement it and only backing down when faced with incredible opposition.

Thing is, I respect a company that listens to feedback and makes changes based on it.


Good. It was a terrible idea to begin with and I am happy they scrapped it.

Even if their intentions were good (which I believe they were), the whole "let's treat everyone as a potential criminal and maybe we'll catch a few" mentality that often gets used to push for various things always rubs me the wrong way.

Not to mention this had the potential to be abused if it ended up in the wrong hands. 


8 hours ago, AluminiumTech said:

Also, we can only take Apple at their word that they have backed down, because iOS isn't open source. So I'll believe it when an independent auditor is able to view iOS's source code and verify that it's gone.

We do have to take Apple's word on it, but there's also no good reason to think the company is lying.

 

And the simple reality is that most major consumer platforms are unlikely to ever be truly open source; even Android has proprietary code if your phone has Google apps. We can't all be Richard Stallman, waiting for the FOSS paradise that will never come. So we either put a reasonable amount of trust in companies like Apple, Google and Microsoft, or become a Stallman-like hermit who has less real-world freedom than a Chinese iPhone user.


5 hours ago, LAwLz said:

Good. It was a terrible idea to begin with and I am happy they scrapped it.

Even if their intentions were good (which I believe they were), the whole "let's treat everyone as a potential criminal and maybe we'll catch a few" mentality that often gets used to push for various things always rubs me the wrong way.

Not to mention this had the potential to be abused if it ended up in the wrong hands. 

If you're going to scan for that content, the way Apple did it was the best way to scan. All the other photo services currently scan for this server-side. Scanning client-side is much better for a few key reasons:

* Third parties can audit the list of hashes being used (yes, this was done by security firms while the feature was in beta).
* Worth remembering that this local scanning only scanned content as you were about to upload it. It was not scanning local content that was set not to be synced.
* It is much harder for potential abusers (governments etc.) to manipulate this than a server-side scanning solution.
* Even though iCloud Photos was not end-to-end encrypted at the time, this system was designed for that possibility: it only exposed the full decryption key if enough photos passed a threshold. Each time the phone flagged a photo, it put a small part of your encryption key in the metadata. By design, a single flagged image would not expose your library; only with hundreds of flagged images were there enough parts of your crypto key to open your library up for inspection.


When you compare this to what others (Google, etc.) are already doing in the form of server-side scanning, Apple's approach was much better.
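The threshold mechanism described in the last bullet can be sketched with plain Shamir secret sharing. This is a toy illustration of the idea only: Apple's actual protocol combined private set intersection with threshold secret sharing inside "safety vouchers", and the function names and parameters below are mine, not Apple's.

```python
import random

# A large prime; all share arithmetic happens modulo PRIME.
PRIME = 2**127 - 1

def make_shares(secret: int, threshold: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret
```

With a threshold of, say, 30, no collection of 29 or fewer shares reveals anything about the key; only once the device has emitted a 30th share does reconstruction become possible. (A real implementation would use a cryptographically secure RNG rather than `random`.)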


12 hours ago, AluminiumTech said:

Good but the damage of proposing it has already been done. Apple will forever be known as having wanted to implement it and only backing down when faced with incredible opposition.

 

They backed down very fast, after it had been in beta (an optional feature you turned on) for less than 3 weeks. And it is worth noting that every other cloud photo provider does this currently, without any of the protections that Apple's solution had. If you're upset at the idea of what Apple did, then you should be extremely upset at what Google has been doing server-side for the last 5+ years. Scanning client-side just before you upload the image is much better than scanning server-side after the image is uploaded, since client-side the list of hashes can be audited (and this was done by security companies), and you can't apply new hashes retrospectively to an image after upload. Also, the way it was done meant that only once enough images were flagged by your device (on upload) was enough of your crypto key exposed to let anyone look at those images, so just having a few random images flagged did not result in the police looking at them, as they were still encrypted.
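The auditability argument boils down to this: the client computes a perceptual hash locally and checks it against a fixed, published list before upload. Here is a toy sketch using a simple average hash. Apple's real system used NeuralHash, a neural-network perceptual hash that is far more robust; `average_hash` and `flag_before_upload` are hypothetical names for illustration only.

```python
def average_hash(pixels: list[list[int]]) -> int:
    """Toy average hash: one bit per pixel, set if the pixel is above the mean.

    Real perceptual hashes are designed to survive resizing, recompression,
    and small edits; this only shows the shape of the client-side check.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def flag_before_upload(img_hash: int, known_hashes: set[int]) -> bool:
    # The check runs on-device against a fixed, auditable hash list, and only
    # for images about to be uploaded -- new hashes cannot be applied
    # retroactively to images that were already uploaded.
    return img_hash in known_hashes
```

Because the hash list ships to every device as one fixed blob, auditors can verify exactly what is being matched against, which is not possible with an opaque server-side scanner.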


7 hours ago, WolframaticAlpha said:

What will be very funny is how the people who supported Apple on this earlier are going to spin it now.

I don't think anyone did.

 

It's generally a bad idea to let the device "decide" to do something against its users' wishes, even if the content is deplorable. That's a slippery slope of "well, if Apple can block CEI, then Apple can block all porn, be it real, cartoon, Snapchat filters, AI-generated, etc."

 

If there is a concern about CEI, then enforcement belongs at the public access point. If one adult receives images from someone with their consent, and the machine goes "uh oh, that looks like CEI, better block it", you can see how it starts mistaking images and how people will work around it. There was no reason for the CSAM scanner to exist, even if it was well-intentioned.


17 hours ago, Kisai said:

I don't think anyone did.

 

 

I remember quite a large thread on the topic back when it was news that contained a few who supported it vehemently.  But for whatever reason I cannot find it using the search function.



3 hours ago, mr moose said:

I remember quite a large thread on the topic back when it was news that contained a few who supported it vehemently.  But for whatever reason I cannot find it using the search function.

I found the thread but I didn't really see anyone defending it in the 5 or so pages I went through. 

 

 

There were some people talking about the technical details and saying things like "I want more info before deciding if this is good or bad". That's as close to someone defending it as I could find. Everyone else was against it, with varying degrees of legitimacy.


I doubt anything is killed. I wouldn't be surprised if they pulled a Google and kept developing this thing further under a different name and cover.


 


18 hours ago, LAwLz said:

I found the thread but I didn't really see anyone defending it in the 5 or so pages I went through. 

 

 

There were some people talking about the technical details and saying things like "I want more info before deciding if this is good or bad". That's as close to someone defending it as I could find. Everyone else was against it, with varying degrees of legitimacy.

The two most usual suspects for defending Apple were more toward the end (page 9 or so). But it wasn't as bad as I remember (thank fuck for failing old-man memory; the world may not be as shit as I think it is).



On 12/8/2022 at 9:38 PM, AluminiumTech said:

Apple will forever be known as having wanted to implement it and only backing down when faced with incredible opposition.

You mean like how all companies are viewed? LMG, Microsoft, Apple, Google, ... Air Canada, the list goes on. I don't think there is any company that will immediately back down once faced with any slight bit of opposition. 



On 12/10/2022 at 9:13 PM, LAwLz said:

I found the thread but I didn't really see anyone defending it in the 5 or so pages I went through. 

 

 

There were some people talking about the technical details and saying things like "I want more info before deciding if this is good or bad". That's as close to someone defending it as I could find. Everyone else was against it, with varying degrees of legitimacy.

Oh well, then I must be misremembering it. Apologies.


On 12/9/2022 at 12:40 PM, Kisai said:

I don't think anyone did.

 

It's generally a bad idea to let the device "decide" to do something against its users' wishes, even if the content is deplorable. That's a slippery slope of "well, if Apple can block CEI, then Apple can block all porn, be it real, cartoon, Snapchat filters, AI-generated, etc."

 

If there is a concern about CEI, then enforcement belongs at the public access point. If one adult receives images from someone with their consent, and the machine goes "uh oh, that looks like CEI, better block it", you can see how it starts mistaking images and how people will work around it. There was no reason for the CSAM scanner to exist, even if it was well-intentioned.

I think the biggest issue would be teenagers and nudes. It's an unfortunate truth that most of the CP out there is probably just kids sending each other nudes, and if Apple started to scan for that, I can imagine it would be a nightmare. I mean, do they now send all the teenagers who are sending nudes to each other to jail for CP?


On 12/15/2022 at 2:44 AM, Brooksie359 said:

I think the biggest issue would be teenagers and nudes. It's an unfortunate truth that most of the CP out there is probably just kids sending each other nudes, and if Apple started to scan for that, I can imagine it would be a nightmare. I mean, do they now send all the teenagers who are sending nudes to each other to jail for CP?

Apple's system only detected photos matching the known CSAM hash database, so it would not have flagged those photos.

