this post was submitted on 29 Sep 2024
471 points (97.2% liked)

Taylor & Francis and Wiley sold out their researchers in bulk; this should be a crime.

Researchers need to be able to consent or refuse to consent, and science needs to be respected more than that.

[–] [email protected] 1 points 2 months ago* (last edited 2 months ago)

> Speaking of fearmongering, you note that:
>
> > an artist getting their style copied
>
> So if I go to an art gallery for inspiration I must declare this in a contract too? This is absurd. But to be fair I’m not surprised. Intellectual property is altogether an absurd notion in the digital age, and insanity like “copyrighting styles” is just the sharpest most obvious edge of it.
>
> I think also the fearmongering about artists is overplayed by people who are not artists.

Setting aside the false equivalence between a person getting inspiration at an art gallery and feeding millions of artworks into a non-human AI for automated, high-speed, legally dubious replication and derivation: copyright is how creative workers protect their livelihoods and stay incentivized to keep creating. Your Twitter experiences are anecdotal; the broader picture looks different:

  1. Chinese illustrator jobs purportedly dropped by 70% in part due to image generators
  2. Lesser-known artists are finding it harder to gain exposure, as visual art venues restrict submissions to already-established artists in order to filter out AI-generated work -- the opposite of democratizing art
  3. Artists have reported using image generators to avoid losing their jobs
  4. Artists' works, including those by Hollie Mengert and Karen Hallion among others, have been used in training data without compensation, attribution, or consent -- such style mimicry has been described as "invasive" (someone can steal your mode of self-expression) and reputationally damaging, even when the mimicry is only "surface-level"

The above four points are taken from the Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society (Jiang et al., 2023, sections 4.1 and 4.2).

Help me understand your viewpoint. Is copyright nonsensical? Are we hypocrites for worrying about how the companies hosting our work use what we produce? There is a lot of liability and a lot of worry here, and I'm having trouble reconciling the two positions: you seem to be implying that this liability and worry are unfounded, but the evidence seems to point elsewhere.

Thanks for talking with me! ^ᴗ^

(Comment 2/2)