So we can sue robots but when I ask if we can tax them, and reduce human working hours, I'm the crazy one?
So we can sue robots
... No?
What would you tax exactly? Robots don't earn an income and don't inherently make a profit. You could tax a company or owner who profits off of robots and/or sells their labor.
What would be the legal argument for this? I'm not against it but I don't know how it could be argued.
Legal basis for suing a company that uses another company’s product/creations without approval seems like a fairly straightforward intellectual property issue.
Legal basis for increased taxes on revenue from AI or automation in general could be created the same way any tax is created: through legislation. Income tax didn’t exist before; now it does. Tax breaks for mortgage interest didn’t used to exist; now they do. Levying higher taxes on companies that rely heavily on automated systems and creating a UBI out of the revenue might not exist right now, but it could, if the right people are voted in to pass the right laws.
I'm no expert on law but maybe something about AI unethically taking our jobs away
Universal basic income + AI/robots taking care of all necessary jobs sounds great
That's exactly what Andrew Yang's political platform was. I hope he runs again.
China didn't take your job and neither will AI. Corporations will replace you with whatever costs less.
We can't really legislate against AI because other countries won't. It's also a huge boon for society; we just have to make sure the profits are redistributed and overall work hours are reduced, instead of all the productivity gains going into the pockets of the mega wealthy.
"Massive Trouble"
Step 1 - Scrape everyone's data to make your LLM and make a high profile deal worth $10B
Step 2 - Get sued by everyone whose data you scraped
Step 3 - Settle and everyone in the class will be eligible for $5 credit using ChatGPT-4
Step 4 - Bask in the influx of new data
Step 5 - Profit
I posted on the public internet with the intent and understanding that it would be crawled by systems for all kinds of things. If I don't want content to be grabbed, I don't publish it publicly.
You can't easily have it both ways, IMO. Even with systems that do strong PKI, if you want the world in general to see it, you are giving up a certain amount of control over how the content gets used.
Law doesn't really matter here as much as people would like to try to apply it; this is simply how public content will be used. Go post in a walled garden if you don't want to get scraped, just remember the corollary: your reach, your voice, is limited to the walls of that garden.
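FWIW, if you do want to keep publishing openly but still tell crawlers to stay away, the usual mechanism is robots.txt, and it's purely advisory, which kind of proves my point. A minimal sketch, assuming you run the site yourself and the scrapers you care about identify themselves as OpenAI's GPTBot and Common Crawl's CCBot:

```
# robots.txt at the site root -- advisory only; well-behaved crawlers honor it,
# but nothing technically stops a scraper from ignoring it.

# Assumed: OpenAI's crawler identifies as GPTBot
User-agent: GPTBot
Disallow: /

# Assumed: Common Crawl (a common source of training data) identifies as CCBot
User-agent: CCBot
Disallow: /

# Everyone else can still index the site
User-agent: *
Allow: /
```

And of course that only works for sites you control; anything you post on someone else's platform gets whatever policy that platform decides.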
What you said makes a lot of sense. But here's the catch: it assumes OpenAI checked the licensing for all the stuff they grabbed. And I can guarantee you they didn't.
It's damn near impossible to automatically check the licensing for all the stuff they got, so we know for a fact they got stuff whose licensing does not allow it to be used this way. Microsoft has already been sued over Copilot, and these lawsuits will keep coming. Even assuming they somehow managed to grab only legit material, and that they used excellent legal advisors who assured them it would stand up in court, it's definitely impossible to tell what piece of what goes where after it becomes an LLM token, and also impossible to tell what future lawsuits will decide about it.
Where does that leave OpenAI? With the good ol' "I grabbed something off the internet because I could". Why does that sound familiar? It's something people have been doing since the internet was invented; it's commonly referred to as "piracy". But it's supposed to be wrong and illegal. Well, either it's wrong and illegal for everybody, or it isn't for anybody.
There were court cases around this very thing with Google and the Internet Archive. I suspect their legal team is expecting a similar precedent, with the issue coming down to the individual and how they use the index: for example, using it to make my own unique character (easily done) vs. making an easy and obvious rip-off of a Disney property. The same tests can be applied; the question, IMO, isn't about the index that is built here. I can memorize a lot (some people have actual eidetic memory) and synthesize it too, which is protected, and I can copyright my own mental outputs. The disposition of this type of output vs. mechanical outputs is where I expect things will end up being argued.
I'm not going to say I'm 100% right here (we are in a strange timeline), but there is precedent for what OAI is doing, IMO.
People talk about OpenAI as if it's some utopian saviour that's going to revolutionise society, when in reality it's a large corporation flooding the internet with terrible low-quality content using machine learning models that have existed for years. And the fields it is "automating" are creative ones that specifically require a human touch, like art and writing. Large language models and image generation aren't going to improve anything. They're not "AI" and they never will be. Hopefully when AI does exist and does start automating everything we'll have a better economic system though :D
If this lawsuit is ruled in favor of the plaintiffs, it might lead to lawsuits against those who have collected and used private data more maliciously, from advertisement-targeting services to ALPR services that reveal your driving habits to law enforcement.
So some of the most profitable corporations in the world? In that case this lawsuit isn't going anywhere.
Good. Technology always makes strides before the law can catch up. The issue with this is that multi-million-dollar companies use these gaps in the law to get away with legally gray and morally black actions, all in the name of profits.
Edit: This video is the best way to educate yourself on why AI art and writing are bad when they steal from people, like most AI programs currently do. I know it's long, but it's broken up into chapters if you can't watch the whole thing.
Totally agree. I don't care that my data was used for training, but I do care that it's used for profit in a way that only a company with big budget lawyers can manage
I'm honestly at a loss for why people are so up in arms about OAI using this practice and not Google or Facebook or Microsoft, etc. It really seems we're applying a double standard just because people are a bit pissed at OpenAI for a variety of reasons, or maybe just vaguely mad at the monetary scale of "tech giants".
My 2 cents: I don't think content posted on the open internet (especially content produced by users on a free platform, claimed not by those individuals but by the platforms themselves) should be litigated over when that information isn't even being reproduced, just used in derivative works. I think it's conceptually similar to an individual reading a library of books to become a writer and charging for the content they produce.
I would think a piracy community would be against platforms claiming ownership over user generated content at all.
I once looked outside. Could I be sued for observing a public space?
I once looked at a picture of Spider-Man and Badman, then made a crappy drawing called Biterman.
To jail with me!
It’s wild to see people in the piracy community of all places have an issue with someone benefiting from data they got online for free.
The key difference is that they're making (a lot of) money off of the stolen work, and in a way that's only possible for the already filthy rich.
Wouldn't mind it personally if it were FOSS though, like their name suggests.
FWIW even if it was FOSS I'd still care. For me it's more about intent. If your business model/livelihood relies on stealing from people there's a problem. That's as true on a business level as it is an individual one.
Doesn't mean I have an answer as sometimes it's extremely complex. The easy analogy is how we pirate TV shows and movies. Netflix originally proved this could be mitigated by providing the material cheaply and easily. People don't want to steal (on average).
I find people in general are much more willing to part with their money than the big corps think. I'll even go so far as to say that we enjoy doing so. Just look at Twitch -- tonnes of money are thrown at streamers because it's fun and convenient, or at TikTok vendors selling useless stuff on live streams. We just don't like to be lied to and treated like cash cows.
They're using people's content without authorization, supposedly for an open-information ideology or something like that, yet they're closed source and they're using it to make money. I don't think that should be illegal, but it is certainly a dick move.
The difference is that they are profiting from other people's work and property. I don't profit from watching a movie or playing a game for free; I just save some money.
Many of us are sharing without reward and have strong ethical beliefs regarding for-profit distribution of material versus non-profit sharing.
It really isn't that bonkers. A lot of software thought is about licensing. See the GPL and Creative Commons and all that stuff that's all about how things can be profited from and the responsibilities around it. Benefiting from free data is one thing. Privately profiting at others' expense, or not sharing the capability/advances that came from it, is another. Willing to bet there are GPL violations via the training sets.
Is it even possible to attach licenses to text posts on social media?
I don't see how this is any different than humans copying or being inspired by something. While I hate seeing companies profiting off of the commons while giving nothing of value back, how do you prove that an AI model is using your work in any meaningful or substantial way? What would make me really mad is if this dumb shit leads to even harsher copyright laws. We need less copyright not more.
Hope it goes through and sets a precedent.
Vote Skynet for 2024 Presidential Election, the efficient choice!