this post was submitted on 15 Oct 2024
Wayback Machine back in read-only mode after DDoS, may need further maintenance.

[–] brbposting 2 points 1 day ago (1 children)

I thought of something but I don’t know if it’s a good example.

Here’s the hypothetical:

A criminal backs up a CSAM archive. Maybe the criminal is caught, heck say they’re executed. Pedos can now share the archive forever over encrypted messengers without fear of it being deleted? Not ideal.

[–] [email protected] 1 points 3 hours ago* (last edited 3 hours ago)

Yeah, this is a hard one to navigate, and it's the only thing I've ever found that challenges my philosophy on the freedom of information.

The archive itself isn't causing the abuse, but CSAM is a record of abuse. We restrict its distribution not because distribution or possession is inherently abusive, but because its creation was, and we don't want to support an incentive structure for the creation of more abuse.

i.e. we don't want more pedos abusing more kids with the intention of archival/distribution. So the archive itself isn't the abuse, but the incentive to archive could be.

There are also a lot of ethical questions around CSAM in general that I don't think we're ready to confront. It's a hard topic all around, and nobody wants to seriously address it beyond virtue signalling about how bad it is.

I could potentially see a scenario where archival could benefit society, similar to the known-CSAM hash databases (e.g. NCMEC's) that Apple proposed matching iCloud photos against. If we throw genAI at this stuff to learn about it, we may be able to identify locations, abusers, and victims, track them down, and save people. But it would necessitate the existence of the data to train on.
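For anyone unfamiliar with how that hash-database matching works, here's a deliberately simplified sketch. The hash values and set below are placeholders, and real systems (PhotoDNA, Apple's proposed NeuralHash) use perceptual hashes that survive resizing and re-encoding rather than plain cryptographic hashes, which only catch byte-identical copies:

```python
import hashlib

# Hypothetical database of known-bad hashes (placeholder values only).
# The first entry happens to be the SHA-256 of an empty file, used here
# purely so the demo below has something to match against.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(data: bytes) -> str:
    """Cryptographic hash of the raw bytes (exact-match only)."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """True if this exact file appears in the known-hash set."""
    return sha256_of(data) in KNOWN_HASHES

print(is_known(b""))       # matches the demo entry
print(is_known(b"photo"))  # unknown content, no match
```

The point is that scanners never need to store or look at the material itself, only a set of fingerprints, which is why agencies can distribute the hash lists without distributing the content.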

I could also see potential for using CSAM itself in psychotherapy. Imagine a sci-fi future where pedos are effectively cured by using AI trained on CSAM to expose them to progressively more mature imagery, allowing their attraction to mature with it. We won't really know if something like that is possible if we delete everything. It seems awfully short-sighted to me to delete data, no matter how perverse, because it could have legitimate positive applications we haven't conceived of yet. So to that end, I do hope some three-letter agencies maintain their restricted archives for future applications that could benefit humanity.

All said, I absolutely agree that creating incentives for abusers to abuse is a major issue with immutable archival, and it's definitely something we need to figure out before such an archive actually exists. So thank you for the thought experiment.