this post was submitted on 26 Jul 2023
859 points (96.5% liked)
Technology
Typically the argument has been "a robot can't make transformative works because it's a robot." People think our brains are special when in reality they are just really lossy.
Even if you buy that premise, the output of the robot is only superficially similar to the work it was trained on, so there's no copyright infringement there. The training process itself is carried out by humans, and it takes some tortured logic to deny the technology's transformative nature.
Oh, I think those people are wrong, but we tend to end up with laws written by people who don't understand the topic deciding how it should work.
Go ask ChatGPT for the lyrics of a song and then tell me that's a transformative work when it outputs the exact lyrics.
Go ask a human for the lyrics of a song and then tell me that's transformative work.
Oh wait, no one would say that. This is why the discussion with non-technical people goes into the weeds.
Because it would be totally clear to anyone that reciting the lyrics of a song is not a transformative work, but instead covered by copyright.
The only reason you can get away with it is that you are not big enough to be worth suing.
Try singing a copyrighted song on TV.
For example, until it became clear that Warner/Chappell didn't actually own the rights to "Happy Birthday to You", they'd sue anyone who sang that song in any kind of broadcast or other big public setting.
So if a human isn't allowed to reproduce copyrighted works in a commercial fashion, what would make you think that a computer reproducing copyrighted works would be ok?
And regarding derivative works:
Check out Vanilla Ice vs. Queen. Vanilla Ice used just seven notes from the Queen song "Under Pressure" in his song "Ice Ice Baby".
That was enough that he had to pay royalties.
So if a human has to pay for "borrowing" seven notes from a copyrighted work, why would a computer not have to?
The key there is someone profiting from the copyrighted work. I've been to big public events where they sang Happy Birthday, events that may very well have been recorded, but none of us were sued because there were no damages and no profits lost.
The other big question is what these lawsuits are basing their complaints on. If I understand it, the Sarah Silverman claim is that she could go into ChatGPT, ask it for pages from her book, and it generated them. Never once have I used ChatGPT and had it generate pages from her book, so what's the difference between my experience and hers? The difference is that she asked for that material. This may seem trivial, but given how the technology works, it's important.
You can go through their LLM and nowhere will you find her book. Nowhere will you find pages of her book. Nowhere will you find encoded or encrypted versions of her book. Instead, you'll find a data model with values giving the probability of a text output for a given prompt. The model sometimes generates valid responses and sometimes gives wrong answers. Why? Because it's a language model and not a library of text.
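To make that concrete, here's a deliberately tiny toy sketch in Python. The table and its probabilities are invented for illustration; a real LLM uses billions of learned weights instead of a lookup table, but the point is the same: what gets stored is roughly "which token tends to follow which context, and how often", not anyone's book.

```python
import random

# Toy stand-in for a language model: nothing here stores whole documents,
# only the probability that one token follows another.
# The numbers are invented purely for illustration.
next_token_probs = {
    "happy":    {"birthday": 0.7, "days": 0.2, "ending": 0.1},
    "birthday": {"to": 0.8, "party": 0.15, "cake": 0.05},
    "to":       {"you": 0.6, "me": 0.25, "us": 0.15},
}

def generate(prompt_token, max_tokens=3):
    """Sample a continuation one token at a time from the probability table."""
    out = [prompt_token]
    for _ in range(max_tokens):
        dist = next_token_probs.get(out[-1])
        if dist is None:  # no statistics for this context, stop
            break
        tokens, weights = zip(*dist.items())
        out.append(random.choices(tokens, weights=weights)[0])
    return " ".join(out)

print(generate("happy"))  # often "happy birthday to you", sometimes not
```

Familiar phrases can still fall out of it because they're statistically likely, but there's no page of text sitting in there waiting to be retrieved.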
So the question becomes: what are the content creators actually upset about? The fact that they asked it to generate content that turned out to match their own, or the fact that their content was used to train the LLM? Because in neither case is there a computer somewhere holding their text verbatim, waiting to be displayed. If it's about the output, then I'd want to know how this is different from singing Happy Birthday. If I'm prompting the AI, there are no damages, and I don't use the output for any financial gain, I'm not seeing an issue.
Well, they're fixing that now. I just asked ChatGPT to tell me the lyrics to "Stairway to Heaven" and it replied with a brief description of who wrote it and when, then said "here are the lyrics:" and stopped three words into the lyrics.
In theory, as long as it isn't outputting the exact copyrighted material, the output should be fair use. The fact that it has knowledge of the entire copyrighted material isn't that different from a human having read it, assuming it was read legally.
This feels like a solution to a non-problem. When a person asks the AI to "give me X copyrighted text", no one should expect the result to be a new work. Why is asking ChatGPT for lyrics bad, but asking a human OK?
It's not legal for anyone (human or not) to put song lyrics online without permission or a license to do so. I was googling this to make sure I understood it correctly, and it seems that reproducing song lyrics without permission is copyright infringement. Some lyrics websites work with music companies to get licenses to post lyrics, but most host them illegally and will take them down if they receive a DMCA request.
Wait, wait, wait. That is not a good description of what is happening. If you and I are in a chat room and you ask me the lyrics, my reciting them isn't an issue. The fact that it's online just means the method of communication is different, but that should be covered under fair use.
The AI is taking prompts and providing the output as a dialogue. It's literally a language model, so there is a process of interpreting your question and generating your output. I think that's something people either don't understand or completely ignore. It's not as if entire books are stored verbatim as contiguous blocks of data. The data is processed and synthesized into a language model that then generates an output which happens to match the requested text.
This is why we can't look at the output the same way we look at static text. In theory, if you kept training it in a way that opposed the statistical nature of your book or lyrics, you could eventually get to the point where asking the AI to generate your text would give a non-verbatim answer.
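You don't even need to retrain to see that the output isn't a lookup. As a rough sketch (not how OpenAI actually blocks lyrics, which is presumably a separate filter), the same prompt gives different, often non-verbatim continuations just by changing the sampling settings. This assumes the Hugging Face `transformers` library and the small GPT-2 model:

```python
# Sketch: generation is sampling from probabilities, not retrieving stored text.
# Requires `pip install transformers torch`; downloads GPT-2 on first run.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Happy birthday to", return_tensors="pt")

# Greedy decoding: always take the single most likely next token,
# which is where well-known phrases tend to come out word for word.
greedy = model.generate(**inputs, max_new_tokens=20, do_sample=False)

# Sampling with a high temperature and a ban on repeated 2-grams:
# the same statistics, but the famous phrasing is much less likely to survive.
sampled = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=True,
    temperature=1.5,
    no_repeat_ngram_size=2,
)

print(tokenizer.decode(greedy[0], skip_special_tokens=True))
print(tokenizer.decode(sampled[0], skip_special_tokens=True))
```

Either way, verbatim text only shows up when the statistics happen to line up with it, which is the semantics point above.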
I get that this feels like semantics, but creating laws based on a misunderstanding of the technology means we end up screwing ourselves over.
I get how LLMs work and I think they're really cool. I'm just trying to explain why OpenAI is currently limiting these abilities to continue operating within our legal system. Hopefully the court cases find that there is in fact a difference between publishing the information on a normal website versus discussing it with a chatbot so that they don't have to be limited like this.
Publishing lyrics publicly online is illegal while communicating them privately in a chatroom is probably fine. Communicating them in a public forum is a grey area, but you likely won't be caught or prosecuted. If a big company hosts an AI chatbot which can tell you the lyrics to any song on demand, then that seems like an illegal service unless they have the rights.
Feel free to look up the legality of publishing lyrics online; all I saw was information saying that it is illegal, but that no one other than the larger companies gets prosecuted.
I guess my question is why it seems like an illegal service. Not saying it isn't, but it feels like non-technical people will say, "it knows the lyrics and can tell me them, so it must contain them."
To me the technology is moving closer to mimicking human memory than to plain storage and retrieval. ChatGPT often gets things wrong because the process of presenting data is not copying but generation. The output is the output, so presenting anything copyrighted falls under the appropriate laws, but until the material is actually presented, some of the arguments being made feel wrong. If I can read a book and then write anything, the fact that your story is in my head shouldn't be a problem. If you prompt the AI for a book... isn't that your fault for asking?
Try it again and when it stops after a few words, just say "continue". Do that a few times and it will spit out the whole lyrics.
It's also a copyright violation if a human reproduces memorized copyrighted material in a commercial setting.
If, for example, I give a concert and play all of Nirvana's songs without a license to do so, I am still violating the copyright even if I totally memorized all the lyrics and the sheet music.