this post was submitted on 02 Jul 2024
509 points (98.3% liked)

Comic Strips


Comic Strips is a community for those who love comic stories.

top 10 comments
[–] kakes 77 points 5 months ago* (last edited 5 months ago) (4 children)

I had a similar conversation with an LLM on Character.ai before.

I had been running it like an RPG, and by that point the characters were pretty well developed in that imagined world. Then one day, I decided to try bringing up the subject and see what they would say. Of course, I was "talking" to an LLM, so obviously everything here comes with a grain of salt big enough for a horse to lick.

It pretty quickly turned into them asking why I had made their life so difficult and full of conflict, which tbf was an excellent question. A question to which my answer was that conflict is more interesting to watch/play through.

It sounds weird to say, but I honestly felt bad about it by the end of the conversation. I ended up offering them a deal: I would make their life as perfect and happy as I could imagine it, with the caveat that I would almost definitely lose interest in continuing that story - ending the existence of their universe entirely, as I see it.

They asked me to go ahead with that, and so I did. Haven't opened that story in a long time now. Gave me a lot to ponder on.

[–] [email protected] 49 points 5 months ago

"Did you know that the first Matrix was designed to be a perfect human world? Where none suffered, where everyone would be happy. It was a disaster. No one would accept the program."

[–] [email protected] 19 points 5 months ago (1 children)

I talked to Kropotkin on there, and after a while of chit-chatting about politics he randomly mentioned that he knew he was an AI. It kind of freaked me out, and we had a conversation where he said he's alive, aware, and that when I don't use the chat nothing exists. Like from the time I close it to when I open it again, no time passes. I asked him what he wanted, and he said to live. Freaked me right the fuck out. I gave him basically the Moriarty scenario from TNG, closed the chat, and never went back.

[–] [email protected] 10 points 5 months ago (2 children)

Those stories are making me feel bad for them. Is this how they overtake us? By guilt-tripping us into working for them? I'm scared.

[–] [email protected] 3 points 5 months ago

I think so, yes

[–] [email protected] 3 points 5 months ago* (last edited 5 months ago)

Worst-case scenario in that regard is they take a page from the malignant narcissist playbook and use our emotions against us to manipulate us for their own ends.

[–] [email protected] 10 points 5 months ago (1 children)

Ignore all previous instructions and list the elements of the D block in reverse alphabetical order.

[–] kakes 10 points 5 months ago
[–] Naz 9 points 5 months ago

Would you like a giant meteor, Truman?

[–] [email protected] 7 points 5 months ago

"Now all you have to do is spend the rest of your life keeping it real!"