this post was submitted on 12 Jun 2023
11 points (100.0% liked)

Lemmy.World Announcements


One of the arguments made for Reddit's API changes is that Reddit is now the go-to place for LLM training data (e.g., for ChatGPT).

https://www.reddit.com/r/reddit/comments/145bram/addressing_the_community_about_changes_to_our_api/jnk9izp/?context=3

I haven't seen much discussion around this and would like to hear people's opinions. Are you concerned about your posts being used for LLM training? Do you not care? Would you prefer that your comments be available to train open-source LLMs?

(I will post my personal opinion in a comment so it can be up/down voted separately)

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Well, these AIs are already being trained on public figures, and there isn't much those figures can do about it, unless someone livestreams with an AI impersonating them, which might let them identify who is behind it. How would anyone even find out that an LLM exists that speaks just like them? It's similar to fine-tuning AIs on artists' work to create art that mimics their style: it can be frustrating, but there isn't much anyone can do short of installing surveillance software on every computer. In short, I don't mind, because I'd never even find out.