I don't like AI, but I hate intellectual property, and the people who want to restrict AI don't seem to understand the implications that has. I am okay with copying, as I think copyright is a load of bollocks. But they aren't even reproducing the content verbatim, are they? They're 'taking inspiration', if you will, transforming it into something completely different. Seems like fair use to me. It's just that people hate AI and hate the companies behind it (and don't get me wrong, rightfully so), but that shouldn't stop us all from thinking critically about intellectual property laws.
I'm the opposite, actually. I like generative AI. But as a creator who shares his work with the public for their (non-commercial) enjoyment, I am not okay with a billion-dollar industry training their models on my content without my permission and then using those models as a money machine.
This law will ensure only giant tech companies have this power. Hobbyists and home players will be shut out.
What are you basing that on?
It doesn't say anything about the right only applying to giant tech companies; it specifically mentions artists as part of the protected content owners.
That's like saying you are just as protected regardless of which side of the moat you stand on.
It's pretty clear that, the way things are shaping up, only the big tech elite will control AI, and they will lord it over us.
The worst thing that could happen with AI is happening: it's falling into the hands of the elites.
I respectfully disagree. I think small-time AI (read: pretty much all the custom models on Hugging Face) will get a giant boost out of this: they can get away with training on "custom" data sets, since they are too small to be held accountable.
However, those models will be worthless at the enterprise level, since their legality can't be accounted for. In other words, once you make big bucks off of AI, you'll have to prove your models were sourced properly. But if you're just creating a model for small-time use, you can get away with a lot.
I am skeptical that this is how it will turn out. I don't really believe there will be a path from $0 to challenging big tech without a roadblock of lawyers shutting you down along the way, with no way out.
I don't think so either, but to me that is the purpose.
Somewhere between small time personal-use ML and commercial exploitation, there should be ethical sourcing of input data, rather than the current method of "scrape all you can find, fuck copyright" that OpenAI & co are getting away with.
I mean, this is exactly the kind of regulation that Microsoft/OpenAI is begging for to cement their position. Then it's just a matter of digesting the surviving competitors until only one remains, similar to the Intel/AMD relationship. Then they can have a 20-year period of stagnation while they progressively screw over customers and suppliers.
I think that's the bad ending. By desperately trying to keep the old model of intellectual property going, they're going to bring about the real AI nightmare: an elite few in control of the technology, with an unconstrained ability to leverage its benefits, further solidifying their lead over everyone else.
The collective knowledge of humanity is not their exclusive property. It also isn't the property of whoever is the latest person to lay a claim to an idea in effective perpetuity.
Why?
Once this passes, OpenAI can't build ChatGPT on the same ("stolen") dataset. How does that cement their position?
Taking someone's creation (without their permission) and turning it into a commercial venture, without giving payment or even attribution, is immoral.
If a creator (in the widest meaning of the word) is fine with their works being used as such - great, go ahead. But otherwise you'll just have to wait before the work becomes public domain (which obviously does not mean publicly available).
Just because intellectual property laws can currently be exploited doesn't mean there is no place for them at all.
That's an opinion you can have, but I can just as well hold mine, which is that restricting any form of copying is unnatural and harmful to society.
Do you believe no one should be able to charge money for their art?
That's right. They can put their art up for sale, but if someone wants to take a free copy nothing should be able to stop them.
That effectively makes all art free. At best it's donation-based.
Yes, that would be best.
That would lead to most art being produced by people who are wealthy enough to afford to produce it for free, wouldn't it?
What incentive would a working person have to work on becoming an artist? It's not like artists are decided at birth or something.
Most people who make art don't make any money from it. Some make a little bit. A small number can afford a living just by making art, and only a fraction of those actually get most of the money that's being earned by artists; and then of course there is a lot of money being paid for art that never reaches the artist. The business as it is isn't working very well for anyone except some big media companies. The complete lack of commercial success hasn't stopped a lot of artists, and it won't stop them in the future. Thank god, because it wouldn't be the first time that, after decades of no commercial success whatsoever, such an outsider is discovered by the masses. Sure, lack of commercial success has stopped others, but that's happening now just as it would without copyright laws.

If donating to artists of one's own free will were the norm, and people knew that it's the main source of income for certain types of artists, then I'm sure a lot of people would do so. And aside from private donations, there could be governments and all sorts of institutions financing art. And if someone still can't make a living, none of that could legitimize copyright in my view. We should strive for a world where everyone who wants to follow up on their creative impulses has the time and opportunity to do so, irrespective of their commercial success. But there should also be unrestricted access to knowledge, ideas, art, etc. Brilliant research, photography or music shouldn't be reserved for those who can afford access. The public domain should be the norm, so that our shared project of human creativity can reach its maximum potential.

Copyright seems to me a rather bizarre level of control over the freedom of others. It's making something public for others to see, but then telling those people they're not allowed to be inspired by it, they can't take a free copy to show others, they can't take the idea and do with it as they please. It's severely limiting us culturally, and it's harming human creativity. At the same time it's hypocritical. Artistic ideas are often completely based on the ideas of others; everyone can see that the output is the result of a collective effort. The Beatles didn't invent pop music, they just made some songs, precisely copying all that came before them, and then added a tiny bit of their own. And that's not a criticism, that's how human creativity functions. That's what people should strive for. To limit copying is to limit humanity at its core.

Again, human creativity is very clearly a collective effort. But despite this fact, when someone gets successful it's suddenly a personal achievement and they're allowed to ask for a lot of money for it. Well, my answer is: yes, they are allowed to ask, and I am very willing to pay, but they shouldn't be allowed to go beyond asking; they shouldn't be allowed to restrict access to something that has been published.
What if there were some sort of model that would pay an artist outright for their contributions, now and into the future? Like crowdsourcing art from your favorite artists.
It might cost a lot if a lot of people want something from them, of course, if demand is high. They might even work out a limited payment scheme where you pay less for limited access to the art.
Sounds a lot like what we have now?
And right now, I have to disagree: most artists create with the hope that they can make big money, a hope which wouldn't exist without artists who do make big money. All artists should be making more money, and even the wealthy artists have people above them who make more money than they do and have nothing to do with art.
We don't need to throw out all of our ideas; we just need to keep increasing visibility into industries and advocating for the artist (or the entry-level worker, or the 9-to-5ers, or anyone else who produces everything a company profits off of but is unfairly compensated for it).
For you to argue AI will help artists is absurd. They've been stolen from, and now the result of that theft is driving them out of work. It's only good for artists if by 'artists' you mean yourself and anyone else who only cares about themselves. The same people who tend to use societal arguments only when it benefits them somehow, which is ironic, isn't it?
True, but you people have had hundreds of years to fix the system and have not.
That is not at all what takes place with A.I.
An A.I. doesn't "learn" like a human does. It aggregates multiple chunks from multiple sources. It's just really really tiny chunks so it's hard to tell sometimes.
That's why you can ask two AIs to write a story based on the same prompt and some of their lines will be exactly the same. It's not taking inspiration; it's literally copying bits and pieces of other works, and it just happens that they both chose that particular bit.
If you do that when writing a paper at university, it's called plagiarism.
Get the fuck out of here with your "A.I. takes inspiration..." It copies, nothing more. It doesn't add anything new to the sum total of the creative zeitgeist, because it's just remixes of things that already exist.
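As an aside, the "it literally copies bits and pieces" claim is something that can be measured rather than just asserted. Here's a minimal sketch (in Python, using made-up example strings rather than any real model output) of how one could quantify how much of a generated text reproduces word-for-word n-grams from a given source:

```python
# Sketch: measure verbatim n-gram overlap between a generated text and a
# source text. The example strings below are invented for illustration.

def ngrams(text, n):
    """Return the set of n-word sequences appearing in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(generated, source, n=5):
    """Fraction of the generated text's n-grams found verbatim in the source."""
    gen = ngrams(generated, n)
    if not gen:
        return 0.0
    return len(gen & ngrams(source, n)) / len(gen)

# Hypothetical usage:
generated = "the quick brown fox jumps over the lazy dog near the river"
source = "a quick brown fox jumps over the lazy dog every morning"
print(f"5-gram overlap: {overlap_ratio(generated, source):.2%}")
```

A high overlap would support the "copying chunks" reading; a low one would point more toward recombination. Either way, it's an empirical question.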
So it does do more than copy? Because, as you said, it remixes.
It sounds like the line you're trying to draw is not only arbitrary, but you yourself can't even stick with it for more than one sentence.
Everything new is some unique combination of things that already exist. The elements it draws from are called sources and influences, and the rules according to which they're remixed are called techniques or structures, e.g. most movies have three acts, and many feature specific techniques like J-cuts.
Heck, even re-arranging elements of just one thing produces a unique and different thing. Or is your favourite song and a remix of it literally the same? Or does the remix have no artistic value, even though someone out there probably likes the remix but not the original?
I think your confusion stems from the fact that you're a top-shelf, grade-A moron.
You're an organic, locally sourced and ethically produced idiot, and you need to learn how basic ML works, what "new" is, and glance at some basic epistemology and metaphysics before you lead us to ruin; you don't even understand what "new" entails, and your reactionary rhetoric is leading us all straight down to cyberpunk dystopias.
Damn, attack the argument, not the person, homie.
Yeah, sorry
Consider YouTube Poop. I'm serious. Every clip in them is sourced from preexisting audio and video, and mixed or distorted into a comedic format. You could make an AI to make YouTube Poops using those same clips and other "poops" as training data. What it outputs might be of lower quality, but in a technical sense it would be made in an identical fashion. And, to the chagrin of Disney, Nintendo, and Viacom, these are considered legally distinct works, because I don't watch Frying Nemo in place of Finding Nemo. So why would it be any different when an AI makes it?
You just reiterate what other anti-ML extremists have said like a sad little parrot. No, LLMs don't just copy. They network information and associations and can output entirely new combinations of them. To do this, they make use of neural networks, which are computational concepts analogous to the way your brain works. If, according to you, LLMs just copy, then that's all that you do as well.
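To make that concrete: at each step a language model scores every token in its vocabulary and samples the next one from the resulting probability distribution, rather than pasting a stored chunk. Here's a toy sketch, with an invented five-word vocabulary and made-up scores standing in for a real model:

```python
# Toy sketch of next-token sampling. The vocabulary and logits are invented;
# a real model has tens of thousands of tokens and computes the scores itself.
import numpy as np

vocab = ["cat", "dog", "robot", "poem", "copyright"]
logits = np.array([2.0, 1.5, 0.3, 0.9, -1.0])  # hypothetical model scores

def softmax(x):
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

def sample_next_token(logits, temperature=1.0):
    """Turn scores into probabilities and draw one token at random."""
    probs = softmax(logits / temperature)
    return np.random.choice(vocab, p=probs)

print(sample_next_token(logits))                    # varies from run to run
print(sample_next_token(logits, temperature=0.1))   # almost always "cat"
```

At low temperature, two models with similar training will often pick the same high-probability words, which is one mundane explanation for identical lines that doesn't require literal chunk-pasting.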
You can do the same thing with the Hardy Boys. You can find the same page word for word in different books. You can also do that with the Bible. The authors were plagiarizing each other.
Do yourself a favor and never ever go into design of infrastructure equipment or eat at a Pizza Hut or get a drink from Starbucks or work for an American car company or be connected to Boeing.
Everyone has this super impressive view of human creativity, and I am waiting to see any of it. As far as I can tell, the less creative you are, the more success you will have. But let me guess: you ride a Segway, wear those shoes with toes, have gone through every recipe of Julia Child's, and compose novels that look like Finnegans Wake got into a car crash with E. E. Cummings and Gravity's Rainbow.
Now leave me alone while I eat the same burger as everyone else and watch reruns of Family Guy in my house that looks like all the other ones on the street.