this post was submitted on 06 Sep 2024
240 points (96.9% liked)

Technology

[–] [email protected] 2 points 2 months ago (1 children)

That's easy. The movie studios know what post-production went into the scenes and have the documents to prove it. They can easily prove that such clips fall under deepfake laws.

Y'all need to be more cynical. These lobby groups do not make arguments because they believe in them, but because doing so gets them what they want.

[–] [email protected] 2 points 2 months ago (1 children)

I was responding to a comment above. The guy who was arrested in the OP's article was posting clips from movies (so not deepfakes).

That being said, for deepfakes you'd need the original video to prove it was deepfaked. Additionally, you'd then probably need to prove they used a real person to make the deepfake. Nowadays it's easy to make "fake" people using AI. Not sure where the law sits on creating deepfakes of fake people who resemble real people.

[–] [email protected] 2 points 2 months ago

I didn't make my point clear. The original scenes themselves, as released by the studio, may qualify as "deepfakes". A little bit of digital post-processing can be enough to qualify them under the proposed bills. Then sharing them becomes criminal, fair use be damned.