this post was submitted on 27 Oct 2023
524 points (94.9% liked)

[–] [email protected] 21 points 1 year ago (2 children)

Odd.

I can’t see a conversation with a computer as a real conversation. I grew up with computers from the Atari era and played around with several publicly accessible programs that you could “chat” with.

They all suck. Doesn’t matter if it’s a “help” program, a phone menu, website help, or even having played around with chatGPT…they’re not human. They don’t respond correctly, they get too general or generic in their answers, they repeat themselves. There are just too many giveaways that you’re not having a real conversation, just getting responses from a system that’s trying to pick the most likely reply that fits the pattern.

So how are people having “conversations” with a non-living entity?

[–] [email protected] 38 points 1 year ago* (last edited 1 year ago) (3 children)

It's escapism I think. At least that's part of it. Having a machine that won't judge you, will serve as a perfect echo chamber, and will immediately tell you AN answer can be very appealing to some. I don't have any data, or any study to back it up, just my experience from seeing it happen.

I have a friend who I feel like I've kind of lost to chatgpt. I think he's a bit unhappy with where he is in life. He got the good-paying job, the house in the suburbs, the wife, and 2.5 kids, but never thought about what came next. Now he's just a bit lost, I think, and has somehow convinced himself that talking to people isn't as good as chatting with a bot.

It's weird now. He spends long nights and weekends talking to a machine. He's constructed elaborate fictional worlds within his chatgpt history. I've grown increasingly concerned about him, and his wife clearly is struggling with it. He's obviously depressed but instead of seeking help or attempting to figure himself out, he turned to a non-feeling, non-judgmental, emotionless tool for answers.

It's a struggle to talk to him now. It's like talking to a cryptobro at peak btc mania. The only thing he wants to talk about is LLMs. Trying to bring up that maybe spending all your time talking to a machine is a bit unhealthy invokes his ire and he'll avoid you for several days. Like a heroin addict struggling with addiction, even pointing out the obvious flaws in what he's doing just makes him distance himself from you more.

I'm not young, not old exactly either, but I've known him for 25 years of my adult life. We met in college and have been friends ever since. I know many won't quite understand, but knowing someone that long and staying close, talking every few days, is quite rare. At this point he is my longest-held friendship, and I feel like I'm losing him to a robot. I've lost other friends to addiction in my life, and to say this has been similar is understating it. I don't know what to do for him. I don't know if there's really anything I CAN do for him. How do you help someone who doesn't even think they have a problem?

I guess my point is, if you find someone who is just depressed enough, just stuck enough, with a particular proclivity towards computers/the internet, then you have a perfect candidate for falling down the LLM rabbit hole. It offers them an escape from feeling like they're being judged. They feel like the insanity it spits out is more sane than how they feel now. They think they're getting somewhere, or at least escaping their current situation. Escapism is very appealing when everything else seems pointless and sort of gray, I think. So that's at least one type of person who can fall down the chatgpt/LLM rabbit hole. I'm sure there are others out there too, with their own unique motivations and reasons for latching onto LLMs.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (1 children)

Wow, thank you for sharing your experience.

How are you not voted higher? People on Lemmy complain about not having longform content that offers a unique perspective like on early Reddit, but you've written exactly that.

[–] [email protected] 2 points 1 year ago

Unfortunately, our brains like witty clickbait that confirms our biases, regardless of what people say.

[–] [email protected] 4 points 1 year ago

Guess that should have crossed my mind. People marrying human-like dolls and all that. You get far enough down the hole of whatever mental issues are plaguing your mind, and something inanimate that only reflects what you want to see becomes the preferable reality.

[–] [email protected] 2 points 1 year ago

Awesome perspective! I worked with and around seriously depressed possession hoarders for around a year, and the majority were the type to call you randomly, ultimately just to chat about something or other. That's exactly the kind of priming situation that could slide into abusing LLM tech if offered easy access to it. This was before the days of chatgpt, but I do worry that some of my old clients are falling into this situation, with far less nuance than your friend.

[–] [email protected] -1 points 1 year ago (1 children)
[–] [email protected] 2 points 1 year ago

Until someone (something?) else comes along, we have only ourselves to judge reality. Maybe AI will decide we aren’t real at some point…