this post was submitted on 16 Jul 2023
Asklemmy
Large language models (what marketing departments are calling "AI") cannot synthesize new ideas or knowledge.
Don’t know what you are talking about. GPT-4 absolutely can write new stories. What differentiates that from a new idea?
I can't tell whether you're saying I don't know what I'm talking about, or you don't know what I'm talking about.
Doesn’t matter.
When the "AI can't have creativity/new ideas" argument comes up in conversation, I often get the impression it's a defensive reaction rather than a considered conclusion.
Physician, heal thyself, then.
First of all, yes they can, for all practical purposes. Or, alternatively, neither can humans. So the point is academic. There is little difference between the end result from an AI and from a human chosen at random.
Secondly, LLMs aren't really what people are talking about when they talk about AI art.
Not even the AI companies' marketing departments go that far.