this post was submitted on 30 Jan 2024
505 points (93.6% liked)

Technology

[–] [email protected] 208 points 10 months ago (5 children)

If you paste plaintext passwords into ChatGPT, the problem is not ChatGPT; the problem is you.

[–] [email protected] 65 points 10 months ago (3 children)

Well, tbf ChatGPT also shouldn't remember and then leak those passwords lol.

[–] [email protected] 59 points 10 months ago (2 children)

Did you read the article? It didn't. Someone received someone else's chat history appended to one of their own chats. No prompting, just appeared overnight.

[–] [email protected] 50 points 10 months ago

Well, that's even worse.

[–] [email protected] 35 points 10 months ago (1 children)

........ That shouldn't be happening, regardless of chat content.

[–] [email protected] 9 points 10 months ago (1 children)

Well, yeah, but the point is, ChatGPT didn't "remember and then leak" anything, the web service exposed people's chat history.

[–] [email protected] 2 points 10 months ago

Well, that depends. Do you mean GPT, the specific chunk of LLM code? Or GPT, the website and service?

Because while those nitpicking details matter to the programmers fixing it, how much does that distinction matter to you or me, the laymen using the site?

[–] [email protected] 12 points 10 months ago (2 children)

How? How should it be implemented? It's just an LLM. It has no true intelligence.

[–] [email protected] 7 points 10 months ago

If it's not trained on user data it cannot leak it

[–] [email protected] 1 points 9 months ago (1 children)
[–] [email protected] 1 points 9 months ago

Able to have a reflection.

[–] [email protected] 7 points 10 months ago (2 children)

A huge value-add of ChatGPT is that you can have a running, contextual conversation. That requires memory.

[–] [email protected] 6 points 10 months ago (1 children)

All of these LLMs should have walls between individual users, though, so that the chat history of one user is never accessible to any other user. What restrictions to apply to LLM training and how chats are used is a conversation we can have, but the article and the example given describe a much, much simpler problem: a user checking his own chat history was able to see other users' chats.
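The "wall" everyone is asking for is just boring, ordinary access control in the serving layer, nothing to do with the model itself. A minimal sketch (hypothetical; obviously not OpenAI's actual code): every read of chat history is keyed by the authenticated user's id, so there is no code path that returns another user's conversations.

```python
# Hypothetical sketch of per-user isolation of chat history.
# The filter on user_id IS the wall: history() has no code path
# that can return rows belonging to a different user.
from dataclasses import dataclass, field


@dataclass
class ChatStore:
    # maps user_id -> list of conversation transcripts
    _conversations: dict[str, list[str]] = field(default_factory=dict)

    def save(self, user_id: str, conversation: str) -> None:
        self._conversations.setdefault(user_id, []).append(conversation)

    def history(self, user_id: str) -> list[str]:
        # Only this user's rows, ever; a copy so callers can't mutate the store.
        return list(self._conversations.get(user_id, []))


store = ChatStore()
store.save("alice", "my password is hunter2")
store.save("bob", "what's a good pasta recipe?")

# Bob's history view can never contain Alice's chats.
assert store.history("bob") == ["what's a good pasta recipe?"]
```

The bug the article describes would correspond to a query in this layer that somehow dropped or mixed up the `user_id` filter, not to the model "remembering" anything.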

[–] [email protected] 2 points 10 months ago
[–] [email protected] 5 points 10 months ago* (last edited 10 months ago) (1 children)

It doesn't actually have memory in that sense. It can only remember things that are in its training data or within its limited context window (4k–32k tokens, depending on the model). But when you send a message, ChatGPT does a semantic search over everything in the conversation and tries to fit the relevant parts inside the context window, if there's room.
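That retrieval step can be sketched roughly like this (hypothetical and heavily simplified; real systems rank with learned embeddings, while plain word overlap stands in for the similarity score here):

```python
# Hypothetical sketch: rank past messages by relevance to the new query,
# then pack the best matches into a fixed token budget.

def overlap_score(a: str, b: str) -> int:
    # Crude stand-in for embedding similarity: shared lowercase words.
    return len(set(a.lower().split()) & set(b.lower().split()))


def build_context(query: str, history: list[str], budget_tokens: int) -> list[str]:
    # Most relevant messages first; sorted() is stable, so ties keep order.
    ranked = sorted(history, key=lambda msg: overlap_score(query, msg), reverse=True)
    context, used = [], 0
    for msg in ranked:
        cost = len(msg.split())  # crude token count
        if used + cost <= budget_tokens:
            context.append(msg)
            used += cost
    return context


history = [
    "We discussed how to reset a router",
    "My favourite colour is green",
    "The router model is a Netgear R7000",
]
ctx = build_context("how do I reset my router again?", history, budget_tokens=12)
# The most relevant past message wins the budget.
assert ctx[0] == "We discussed how to reset a router"
```

Anything that doesn't fit the budget is simply dropped, which is why long conversations "forget" early details.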

[–] [email protected] 6 points 10 months ago* (last edited 10 months ago)

I'm familiar; it's just easiest for the layman to think of the model as having "memory", since semantic search over the chat history looks a lot like memory at arm's length.

[–] [email protected] 26 points 10 months ago (2 children)

Hey chatGPT, is hunter2 a good password?

[–] [email protected] 4 points 10 months ago

I'm sorry, but as an AI language model, I cannot tell you about the effectiveness of "*******" as a password.

[–] [email protected] 1 points 10 months ago

It's an old meme, but it checks out.

[–] [email protected] 5 points 10 months ago

Shit. Guess I gotta stop using "Bosco".