submitted 2 months ago by [email protected] to c/[email protected]
[-] DudeImMacGyver 70 points 2 months ago
[-] [email protected] 28 points 2 months ago* (last edited 2 months ago)

I just tried to have Gemini navigate to the nearest Starbucks and the POS found one 8hrs and 38mins away.

Absolute trash.

[-] [email protected] 18 points 2 months ago

Just tried it with Target and again, it's sending me to Raleigh, North Carolina.

[-] [email protected] 12 points 2 months ago

It seems to think you need to leave Alabama but aren’t ready for a state as tolerable as Georgia

[-] [email protected] 7 points 2 months ago

I would totally leave if the "salary to cost of living" ratio wasn't so damn good.

I'd move to Germany or the Netherlands or Sweden or Norway so fast if I could afford it.

[-] Randomocity 5 points 2 months ago

That leads me to believe it thinks you're in North Carolina. Have you given Gemini location access? Are you on a VPN?

[-] [email protected] 5 points 2 months ago

No VPN, it all has proper location access. I even tried it with a local restaurant that I didn't think was a chain, and it found one in Tennessee. I'm like 10 minutes away from where I told it to go.

[-] [email protected] 20 points 2 months ago

Despite that, it delivers its results with much applum!

[-] DudeImMacGyver 4 points 2 months ago

Quality pum

[-] [email protected] 63 points 2 months ago

Some "AI" LLMs resort to light hallucinations. And then ones like this straight-up gaslight you!

[-] [email protected] 50 points 2 months ago

Factual accuracy in LLMs is "an area of active research", i.e. they haven't the foggiest how to make them stop spouting nonsense.

[-] [email protected] 28 points 2 months ago

duckduckgo figured this out quite a while ago: just fucking summarize wikipedia articles and link to the precise section it lifted text from

[-] [email protected] 9 points 2 months ago

We can't fleece investors with that though, needs more "AI".

[-] [email protected] 12 points 2 months ago* (last edited 2 months ago)

Because accuracy requires that you make a reasonable distinction between truth and fiction, and that requires context, meaning, understanding. Hell, even humans aren't that great at this task. This isn't a small problem; I don't think you solve it without creating AGI.

[-] brbposting 9 points 2 months ago

Ha!

You buy this? (Believe it’s incredibly expensive)

[-] [email protected] 7 points 2 months ago

MFer accidentally got "plum" right and didn't even know it...

[-] [email protected] 42 points 2 months ago
[-] [email protected] 22 points 2 months ago
[-] [email protected] 9 points 2 months ago

I barely know 'em!

[-] [email protected] 39 points 2 months ago

Ok, let me try listing words that end in "um" that could be (even tangentially) considered food.

  • Plum
  • Gum
  • Chum
  • Rum
  • Alum
  • Rum, again
  • Sea People

I think that's all of them.

[-] [email protected] 51 points 2 months ago
[-] [email protected] 12 points 2 months ago

Happy belated Mother's Day?

[-] [email protected] 6 points 2 months ago

The Sea Peoples consumed by the Late Bronze Age collapse (or who were a catalyst thereof)?

Or just people at sea eaten by krakens? Cause they definitely count.

[-] [email protected] 5 points 2 months ago

It's a dirty joke.

[-] [email protected] 29 points 2 months ago

Totally reproducible, just with slightly different prompts.

[-] [email protected] 27 points 2 months ago

There's going to be an entire generation of people growing up with this and "learning" this way. It's like every tech company got together and agreed to kill any chance of smart kids.

[-] [email protected] 11 points 2 months ago

Isn't it the opposite? Kids see so many examples of obviously wrong answers they learn to check everything

[-] [email protected] 6 points 2 months ago

How do they know something is obviously wrong when they try to learn it? For "bananum," sure, but for anything at school or college?

[-] [email protected] 6 points 2 months ago

One can hope...

[-] [email protected] 28 points 2 months ago* (last edited 2 months ago)

And yet it doesn’t even list ‘Plum’, or did it think ‘Applum’ was just a variation of a plum?

[-] [email protected] 9 points 2 months ago

Well, plum originally comes from applum which morphed into a plum so yeah.

And that's absolutely not true.

[-] [email protected] 3 points 2 months ago

I will start using this as a factoid

[-] [email protected] 17 points 2 months ago

Strawberrum sounds like it'll be at least 20% abv. I'd like a nice cold glass of that.

[-] [email protected] 14 points 2 months ago

Strawberrum? Barely knew 'em!

[-] [email protected] 13 points 2 months ago

Gemini thought we name food like we name a periodic table

[-] [email protected] 9 points 2 months ago
[-] [email protected] 12 points 2 months ago

A gram of plutonium has enough calories to last you the rest of your life.

[-] [email protected] 12 points 2 months ago

Applum bananum jeans, boots with the fur.

[-] [email protected] 11 points 2 months ago

It's crazy how bad AI gets if you make it list names ending with a certain pattern. I wonder why that is.

[-] [email protected] 11 points 2 months ago

I'm not an expert, but it has something to do with full words vs. partial words. It also can't play Wordle because it doesn't have a proper concept of individual letters in that way; it's trained to handle only full words.
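A rough way to see why: models operate on subword tokens, not letters. Here's a toy greedy longest-match tokenizer over a tiny made-up vocabulary (the vocabulary and the segmentation behavior are hypothetical illustrations, not any real model's tokenizer):

```python
# Toy illustration of why subword tokenization hides individual letters.
# The vocabulary below is invented for this sketch.
VOCAB = {"straw", "berry", "ban", "anum", "apple", "um"}

def tokenize(word, vocab=VOCAB):
    """Greedy longest-match segmentation into subword tokens."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try the longest piece first
            if word[i:j] in vocab:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # fall back to a single character
            i += 1
    return tokens

print(tokenize("strawberry"))  # ['straw', 'berry']
```

The model sees two opaque token IDs, never the letters inside them, so questions like "which words end in -um?" aren't directly answerable from what it actually processes.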

[-] [email protected] 4 points 2 months ago

It can't see what tokens it puts out; you would need additional passes over the output for it to get this right. That's computationally expensive, so I'm pretty sure that didn't happen here.
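The "additional pass" idea above can be sketched as a simple post-hoc filter over generated candidates (the candidate list and function name here are hypothetical, just to show the shape of the check):

```python
def ends_in_um(word):
    """Surface-level check: does the word end in 'um'?"""
    return word.lower().rstrip(".,!? ").endswith("um")

# Hypothetical raw model output, including made-up words
candidates = ["Plum", "Applum", "Bananum", "Gum", "Rum", "Chum", "Grape"]

valid = [w for w in candidates if ends_in_um(w)]
print(valid)  # ['Plum', 'Applum', 'Bananum', 'Gum', 'Rum', 'Chum']
```

Note the limitation: this only verifies the spelling constraint. "Applum" and "Bananum" pass the filter just fine; checking that a word is a real food is a separate, harder problem.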

[-] [email protected] 9 points 2 months ago

AI is truly going to change the world.

[-] [email protected] 7 points 2 months ago

30 years from now: "Haven't they always been called Strawberrums?"

[-] [email protected] 4 points 2 months ago

We've always been at war with Bananum!

[-] [email protected] 6 points 2 months ago

Looks like someone set Google to "Herakles Mode".

[-] [email protected] 5 points 2 months ago

Tomatum.. that's the one

[-] [email protected] 4 points 2 months ago

Ok, I feel like there has been more than enough articles to explain that these things don't understand logic. Seriously. Misunderstanding their capabilities at this point is getting old. It's time to start making stupid painful.

[-] [email protected] 3 points 2 months ago

I cannot reproduce this on Google.

this post was submitted on 13 May 2024
531 points (98.5% liked)