[email protected] -2 points 1 year ago

Please use up-to-date sources. (Disclaimer: Apple has revived and cancelled this "feature" enough times that I'm not 100% sure whether it's currently in iOS, but I'm certain enough not to trust any Apple device with any photos.)

The hashing algorithm they used had manually craftable hash collisions. Apple did state it would switch to a different hashing algorithm, but that one likely has similar flaws. Crafted collisions would let anyone get your iPhone at least partially flagged and have your photos sent to Apple for "human verification". Knowing how the algorithm works also lets people evade whatever detection Apple runs.

No iPhone is going to ship with a list of the hashes of all illegal material, which means the hash of every image you view has to be sent to Apple for comparison. Even if you trust them not to run any other tracking or telemetry on your iPhone, this alone lets them track who has viewed any given image, because they only need a copy of that image to compute its hash themselves. That is a very powerful surveillance tool, and it can be used to censor nearly anything.
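To illustrate that last point: once per-image hashes leave the device, whoever runs the server only needs to hash a target image themselves to learn which users have viewed it. This is a hypothetical sketch of that capability - the names (`watchlist`, `report_hash`) and hash values are made up, not Apple's actual protocol or API:

```python
# Hypothetical sketch of server-side hash matching -- not Apple's actual
# system, just what becomes possible once per-image hashes leave the device.

# The operator hashes any image it wants to track (a meme, a protest flyer,
# a leaked document) and adds it to a watchlist. The hash value is made up.
watchlist: dict[int, str] = {
    0xA3F1_5C02_9B7E_44D1: "banned_political_image",
}


def report_hash(user_id: str, image_hash: int, log: list[tuple[str, str]]) -> None:
    """Record which user viewed which watched image."""
    label = watchlist.get(image_hash)
    if label is not None:
        log.append((user_id, label))


# Usage: every hash a device uploads gets checked against the watchlist,
# so the operator learns exactly who has seen a watched image.
seen: list[tuple[str, str]] = []
report_hash("user-123", 0xA3F1_5C02_9B7E_44D1, seen)
print(seen)  # [('user-123', 'banned_political_image')]
```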