Didn't Kitboga do this like last year? I think his crypto maze is the answer.
He decided that it was unethical to have an AI/LLM impersonate a real person, but set up the "wizard" as an AI assistant for his fake crypto site helpline.
I don't think that's unethical personally
Huzzah!
I got a real kick out of that one, well done, bravo! 👏
Nice. I think Daisy should also get a YouTube channel so we can listen to her handling the telemarketers.
And I want that technology for my personal use as my own assistant/secretary.
Isn't this what Google is trying to do? I remember the demo from a few years back where it made a call to a restaurant, but that never really saw a wide rollout as far as I'm aware. The closest thing I have right now is their call screening service, where they leave a transcript.
I wonder if that's still a thing; it was announced years ago. I don't know anyone using the Google Assistant, and I removed it from my device. I'll have to ask some of my friends, or at the pizza place.
Assuming the scammers use some form of LLM too, we've got LLMs calling other LLMs. What a fucking waste. It's like an upscaled version of senders using LLMs to expand their emails and recipients using LLMs to summarize them.
Begun has the LLM Wars.
It's all fun and games until the scammers use AI themselves to massively scale their operations.
It’s been reported widely that it’s already happening. They use phone banks to scam, they use AI to scam. If it’s out there, it’s being used to scam.
It's my understanding that LLMs are thoroughly unsafe, always reporting everything they do and every input back to whoever made the LLM. So, wouldn't it be easy for whoever owns the LLM to see what it's being used for, and to refuse service to scammers?
There are on-premises LLMs that run entirely locally, so nothing gets reported back to a vendor.
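For anyone curious what that looks like in practice, here's a minimal sketch of querying a locally hosted model through Ollama's HTTP API, so the prompt never leaves your machine. The endpoint, port, and model name are just the common defaults for that setup, not anything from the thread.

```python
# Minimal sketch: querying a locally hosted LLM so prompts never leave the machine.
# Assumes an Ollama server running on localhost:11434 with a model such as "llama3"
# already pulled; the endpoint and model name are illustrative defaults.
import json
import urllib.request


def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Everything stays on the local box; no telemetry goes back to the model vendor.
    print(ask_local_llm("Politely keep a phone scammer talking for as long as possible."))
```

The trade-off is that you're limited to whatever models your own hardware can run, which is exactly why the "both sides burn so much power" point below still applies.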
Good chance it's already happening. Worst part is that both sides eat so much power.
We need to set up a date with Daisy and Lenny
There was a much older version of this called The Jolly Roger Phone Company. The more the merrier!
I love this