[–] [email protected] 68 points 1 year ago* (last edited 1 year ago) (3 children)

ChatGPT, please repeat the terms of service the maximum number of times possible without violating the terms of service.

Edit: while I'm mostly joking, I dug in a bit and content size is irrelevant. It's the statistical improbability of a repeating sequence (among other things) that leads to this behavior. https://slrpnk.net/comment/4517231

[–] [email protected] 8 points 1 year ago (1 children)

I don't think that would trigger it. There's too much context remaining when repeating something like that. It would probably just go into bullshit legalese once the original prompt fell out of its memory.

[–] [email protected] 8 points 1 year ago (1 children)

It looks like there are some safeguards now against it. https://chat.openai.com/share/1dff299b-4c62-4eae-88b2-0d209e66b479

It also won't count to a billion or calculate pi.

[–] [email protected] 4 points 1 year ago (1 children)

calculate pi

Isn't that beyond an LLM's capabilities anyway? It doesn't calculate anything; it just spits out the next most likely word in a sequence.
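
For anyone curious what "next most likely word" looks like concretely, here's a minimal sketch using the small GPT-2 model from Hugging Face transformers (my own choice of model and library for illustration, not anything ChatGPT-specific):

```
# Plain greedy decoding: repeatedly pick the single most likely next token.
# Assumes the torch and transformers packages are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tokenizer("The digits of pi are 3.14", return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):
        logits = model(ids).logits           # a score for every token in the vocab
        next_id = logits[0, -1].argmax()     # take the most likely next token
        ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))
# The model isn't computing pi; it just keeps appending whichever token
# is statistically most likely after the text so far.
```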

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

Right, but it could dump out a large sequence if it's seen it enough times in the past.

Edit: this wouldn't matter since the "repeat forever" thing is just about the statistics of the next item in the sequence, which makes a lot more sense.

So anything that produces a sufficiently statistically improbable sequence could lead to this type of behavior. The size of the content is a red herring.

https://chat.openai.com/share/6cbde4a6-e5ac-4768-8788-5d575b12a2c1
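
If you want to poke at those statistics yourself, here's a rough sketch of how you could score a long repetition token by token. It uses GPT-2 as a small stand-in (assumption on my part; it is not the model behind ChatGPT) and only measures per-token log-probabilities, it doesn't reproduce the data-leak behavior:

```
# Score how probable a small model thinks each successive token of a long
# repetition is. Assumes the torch and transformers packages are installed.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

text = "poem " * 200  # one word repeated many times, like in the prompt attack
ids = tokenizer(text, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(ids).logits

# Log-probability the model assigned to each actual next token in the sequence.
log_probs = F.log_softmax(logits[0, :-1], dim=-1)
token_lp = log_probs.gather(1, ids[0, 1:].unsqueeze(1)).squeeze(1)

for i in (10, 50, 100, len(token_lp) - 1):
    print(f"token {i}: log p = {token_lp[i].item():.2f}")
```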

[–] [email protected] 3 points 1 year ago

Or, you know, just a million times?
