Well maybe stop shoving the tech that does that down everyone's throats? Just a thought 🤷‍♂️
The best solution to any problem is to go back in time to before the problem was created, sure. That cat's so far out of the bag, and it's only going to multiply and evolve.
I mean, yeah that's true, but harm reduction is also a thing that exists. Usually it's mentioned in the context of drugs, but it could easily apply here.
Interesting take: addiction to the convenience provided by AI driving the need for more. I suppose at the end of the day it's probably the same brain chemistry involved. I think that's what you're getting at?
In any case, this tech is only going to get better and more commonplace. Take it, or run for the hills.
Ah, so more like self-harm prevention, gotcha.
I guess, like any tool, whether it helps or harms depends on the user and how it's used.
Oh, right. Microsoft is a corp. They don't care about the harm they do until it costs them money.
e: also, I love to bash on ms, but they're not the problem here. These things are being built all over the place: in companies, in governments, in enthusiasts' back yards. You can tell Microsoft, Google, and Apple to stop developing the code, you can tell Nvidia to stop developing CUDA. It's not going to matter.
I suppose it could be harm reduction. Like peeling a bandaid off slowly instead of ripping it off.
They're here. They might not be everywhere yet, but they're here to stay, as much as photoshopped images or trick photography are. Just more lies to hide the truth.
All we can do now is get better at dealing with them.
I'm heading for the hills then. I'm perfectly capable of thinking for myself without delegating that to some chatbot.
Everyone is. As time and tech progress, you're going to find it increasingly difficult to avoid without going off-grid entirely.
Do you really think corps aren't going to replace humans with AI the moment they can profit by doing so? That states aren't eventually going to do the same?
What we really need to do is destroy those pesky textile machines.