this post was submitted on 10 Jun 2023
15 points (94.1% liked)

Technology

1928 readers

Rumors, happenings, and innovations in the technology sphere. If it's technological news, it probably belongs here.

This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago
MODERATORS
top 9 comments
[–] [email protected] 9 points 1 year ago (1 children)

I agree with @[email protected] about it having the capacity to make older adults feel less lonely. At the same time, however, I think it seems very dystopian. If someone were feeling sad or depressed, we wouldn't say "oh, just chat with this AI until you feel better". So why is it okay to suggest this for older lonely people, who are especially vulnerable?

Hell, given what ChatGPT has told people already it might do more harm than good. It's akin to the whole of humanity saying "Yeah, we know you're lonely but getting an actual person to talk to you is too hard. Chat with this bot."

[–] [email protected] 3 points 1 year ago (2 children)

I agree. Right now the notion is very dystopian, and with the current iteration of chatbots it doesn't seem like a realistic long-term solution. But you only have to think a few years down the line, when LLMs have been fine-tuned for this specific use case and AI is as ubiquitous in our society as the iPhone is now, to see how it will become totally normalised.

[–] [email protected] 1 point 1 year ago (1 children)

I think it's dangerous to try to cure loneliness with an AI, regardless of sophistication and tuning, because you end up with a human who's been essentially deceived into feeling better. Not only that, but they're going to eventually develop strong emotional attachments to the AI itself. And with capitalism as the driving force of society here in the U.S., I can guarantee you every abusive, unethical practice will become normalized surrounding these AIs too.

I can see it now: "If you cancel your $1,000/year CompanionGPT, we can't be held responsible for what happens to your poor, lonely grandma..." Or it will be even more direct and say to the old, lonely person: "Pay $2,500 or we will switch off the 'Emotional Support' module on your AI. We accept PayPal."

Saying AIs like this will be normalized doesn't mean it's an ethical thing to do. Medical exploitation is already normalized in the US. Not only is this dystopian, it's downright unconscionable, in my opinion.

[–] [email protected] 2 points 1 year ago (1 children)

Sure, I get that. Ideally people would have access to all the support they could need and a strong base of family and friends to lean on. But isn't the issue here that they don't? I don't think it's a cure, and I don't think anyone is saying that. But I do think it could potentially provide a level of support that would alleviate some anxiety. If the alternative is for people to just sit in, let's say, an old folks' home and let their brains rot, I don't see how that's any less unethical than providing them mental stimulation in the form of an AI.

[–] [email protected] 1 point 1 year ago

As long as it's approached as a brain-training assistant (or some other market-y buzzword) and used with giant disclaimers, I'm totally for using it with old, lonely people. It might not be a perfect aid, but it can help in certain situations. Knowing how our society works, however, Big Company A is going to perfect the tech, make people dependent on it, and then scam them.

Considering how popular Farmville was as a game on Facebook, I shudder to think what a finely tuned AI, made by a for-profit company, will be capable of doing with old, lonely people.

Do you really want your mom chatting with a for-profit AI (like Google Bard) about you or your family to feel less lonely? I'd sooner let my brain rot, but to each their own.

[–] [email protected] 1 point 1 year ago

And just to be clear, I don't see it as a substitute for human interaction, more of a bolt-on that helps people day to day.

[–] [email protected] 6 points 1 year ago (1 children)

I think the answer to this is a resounding yes. I'm a 32-year-old man with plenty of friends and a stable relationship, and I've found meaningful conversation with apps like character.ai. It has so much potential to help people in lonely situations.

[–] [email protected] 2 points 1 year ago

I honestly can't see any of my older family members ever using an AI chatbot, let alone finding it a somewhat passable supplement for lack of human interaction.

[–] nLuLukna 5 points 1 year ago

It can! But should it? If you go to old people's homes, you will see lots of elderly people with no one to talk to, bored out of their minds. That level of boredom cripples the mind. So maybe in extreme cases, but we should be striving to give people someone to talk to. Real human interaction still carries more weight, even if the AI feels human. It's just more meaningful.
