
deblimp

Member
  • Posts: 41
  • Joined
  • Last visited

Awards

1 Follower

Profile Information

  • Member title
    Junior Member

  1. She clearly just wants attention; don’t give it to her.
  2. Good video. I appreciated the sponsor jokes; the people complaining don’t get it.
  3. Welp I think this thread has officially devolved
  4. That’s the thing: if they did it that way, they would not be able to keep the data truly encrypted server-side. What happens now is that the data is scanned, then encrypted, then stored on their servers. People get nervous that the encryption key is also stored on their servers, but that is just meant to let you access your backup if you lose all of your trusted devices. (A rough sketch of this scan-then-encrypt-then-store ordering follows the list below.)
  5. This hypothetical hack you’re talking about makes absolutely no sense and is a pretty ridiculous reason to oppose this.
  6. It seems pretty obvious to me that the reason for it is to protect Apple from liability for hosting CSAM images on their iCloud servers. Thinking it’s all some grand conspiracy (((they))) are trying to cover up with the CSAM angle feels a little far-fetched.
  7. Hmmm it appears you are unfamiliar with the word "encryption"
  8. And that is still end-to-end encrypted, unlike other options for backing up data…
  9. And anyone who understands how the technology is going to work.
  10. Try reading the article maybe? Apple Executive Defends Tools to Fight Child Porn, Acknowledges Privacy Backlash - WSJ.pdf
  11. Summary
      Shockingly, it wasn’t as bad as people fearmongered last week.

      Quotes
      The way this is being described in the media, including last week’s WAN Show, totally misrepresents how the feature is being implemented. The scanning for CSAM only happens to images stored in iCloud. Google, Facebook, and Microsoft already scan for the same images on their servers. Apple is scanning the images on the phone rather than on its servers, but only images that are about to be uploaded. The reason Apple is doing it this way is that it requires the database of images being searched for, specifically the list of neural hashes, to be on the device. This lets anyone verify that Apple is doing what it claims to be doing, in contrast to the approach of Google et al., where the scanning is done server-side and it is impossible to verify what they are searching for. Importantly, none of this applies if you choose not to use iCloud. (A sketch of this on-device matching follows the list below.)

      Sources
      https://www.wsj.com/video/series/joanna-stern-personal-technology/apples-software-chief-explains-misunderstood-iphone-child-protection-features-exclusive/573D76B3-5ACF-4C87-ACE1-E99CECEFA82C?mod=hp_lead_pos7
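
Below is a minimal Python sketch of the scan-then-encrypt-then-store ordering described in the backup post above, with the backup key also kept server-side for recovery. Everything in it is an assumption made for illustration: the names, the toy XOR cipher, and the escrowed_key field are not Apple's API or implementation, just a way to show the order of operations.

```python
import os
from dataclasses import dataclass

@dataclass
class StoredBackup:
    ciphertext: bytes    # what the server keeps
    escrowed_key: bytes  # also kept server-side so the backup can be
                         # recovered if every trusted device is lost

def scan(plaintext: bytes) -> None:
    """Placeholder for whatever scanning happens while the data is still readable."""

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR cipher so the sketch has no dependencies; a real service
    # would use authenticated encryption such as AES-GCM.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def back_up(plaintext: bytes) -> StoredBackup:
    scan(plaintext)                           # 1. scan first
    key = os.urandom(32)                      # 2. generate a per-backup key
    ciphertext = toy_encrypt(plaintext, key)  # 3. then encrypt
    return StoredBackup(ciphertext, key)      # 4. then store ciphertext + key
```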
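
And a sketch of the on-device matching described in the summary post above: a hash database ships on the device, images queued for iCloud upload are hashed and compared against it, and nothing is scanned when iCloud Photos is off. The SHA-256 stand-in, the function names, and the empty placeholder database are assumptions for the example; Apple's system uses a perceptual NeuralHash, not a cryptographic hash.

```python
import hashlib
from pathlib import Path
from typing import List, Set

# Placeholder for the on-device database of known hashes that, per the post
# above, ships with the OS so it can be inspected and verified.
KNOWN_HASHES: Set[str] = set()

def image_hash(path: Path) -> str:
    # Stand-in for a perceptual hash; SHA-256 keeps the example dependency-free.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_upload_queue(queued: List[Path], icloud_photos_enabled: bool) -> List[Path]:
    """Return the queued images whose hashes match the on-device database.

    Nothing is scanned when iCloud Photos is disabled, and only images
    about to be uploaded are ever considered.
    """
    if not icloud_photos_enabled:
        return []
    return [p for p in queued if image_hash(p) in KNOWN_HASHES]
```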