this post was submitted on 02 Jul 2023
141 points (96.1% liked)


OpenAI's ChatGPT and Sam Altman are in massive trouble. OpenAI is being sued in the US for illegally using content from the internet to train its LLM, or large language model.

[–] [email protected] 1 points 1 year ago (1 children)

AI is not human. It doesn’t learn like a human. It mathematically uses what it’s seen before to statistically find what comes next.

AI isn’t learning; it’s just regurgitating the content it was fed in different ways.
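
To make “statistically find what comes next” concrete, here’s a toy sketch (a bigram Markov chain in Python, nothing like the scale or architecture of a real LLM, and the corpus is made up): the entire “model” is just counts of which word followed which in the training text, and generation is sampling from those counts.

```python
import random
from collections import Counter, defaultdict

# Toy training corpus (a stand-in for the web-scale text a real LLM is trained on).
corpus = "the cat sat on the mat and the cat saw the dog on the mat".split()

# The entire "model" is just statistics: how often each word followed each other word.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def generate(start, length=8):
    """Repeatedly sample a statistically likely next word from the counts."""
    word, out = start, [start]
    for _ in range(length):
        counts = following.get(word)
        if not counts:
            break
        word = random.choices(list(counts), weights=list(counts.values()))[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat on the mat and the dog"
```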

[–] [email protected] 1 points 1 year ago (1 children)

But is the output original? That’s the real question here. If humans are allowed to learn from information publicly available, why can’t AI?

[–] [email protected] -1 points 1 year ago (3 children)

No, it isn’t original. AI output is just reorganized content that it has already seen.

AI doesn’t learn, and it doesn’t create derivative works. It does nothing more than reshuffle what it’s already seen, to the point that it will frequently use phrases pulled directly from its training data.

[–] [email protected] 2 points 1 year ago (1 children)

You are saying that it isn’t original content because AI can’t be original. I’m saying if the content isn’t distinguishable from original content, and can’t be directly traced to the source, in what way is it not original?

[–] [email protected] 2 points 1 year ago

Because it’s still not creating anything. AI can’t create, it just reorganizes.


[–] [email protected] -1 points 1 year ago (1 children)

I think you hear a lot of college students say the same thing about their original work.

What I need to see is output from an AI and the original content side by side, so I can say “yeah, the AI ripped this off.” If you can’t show that, then the AI is effectively emulating human learning.

[–] [email protected] 1 points 1 year ago

No, it isn’t.

AI is math. That’s it. This over-humanization is scary; it’s crazy that people can’t see the difference. It does not learn like a human.

https://www.vice.com/en/article/m7gznn/ai-spits-out-exact-copies-of-training-images-real-people-logos-researchers-find

https://techcrunch.com/2022/12/13/image-generating-ai-can-copy-and-paste-from-training-data-raising-ip-concerns/amp/

https://www.technologyreview.com/2023/02/03/1067786/ai-models-spit-out-photos-of-real-people-and-copyrighted-images/amp/

It’s a well-established problem. Tech companies have explicitly told employees not to use these services on company hardware or servers. The data isn’t abstracted away from the user, and these models have been shown to output data that was fed into them.
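
If anyone wants the side-by-side check asked for upthread, a crude version looks like this (a Python sketch with made-up strings; the linked researchers used far more rigorous methods against real training sets): pull out every run of N words from the model’s output and see which runs also appear word-for-word in the training text.

```python
def ngrams(text, n):
    """Every n-word sequence in a text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def verbatim_overlap(output, training_text, n=6):
    """n-grams the model output shares word-for-word with the training text."""
    return ngrams(output, n) & ngrams(training_text, n)

# Made-up strings purely to show the check; real studies run this kind of
# comparison against the model's actual training set.
training_text = "to be or not to be that is the question whether tis nobler in the mind"
model_output = "the bot wrote to be or not to be that is the question and then stopped"

print(verbatim_overlap(model_output, training_text))
# e.g. {'to be or not to be', 'to be that is the question', ...}
```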