[–] [email protected] 10 points 2 days ago (1 children)

If it were merely a search engine, it would risk not being ai enough. We already have search engines, and no one is gonna invest in that old garbage. So instead, it finds something you might want that’s been predigested for ease of ai consumption (Retrieval), dumps it into the context window alongside your original question (Augmentation), and then bullshits about it (Generation).
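For anyone who hasn't had the pleasure, the whole pipeline boils down to something like this. This is a minimal sketch only: the keyword-overlap retriever, the stub generate() call, and the note contents are all made up for illustration; a real product swaps in an embedding model, a vector store, and an LLM API call.

```python
# Minimal RAG sketch: Retrieval, Augmentation, Generation.
# Everything here is a toy stand-in, not any vendor's actual API.

NOTES = [
    "RAG dumps retrieved text into the context window next to your question.",
    "Search engines already existed, but they were not ai enough.",
]

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Retrieval: rank pre-chunked documents by crude word overlap with the question."""
    q_words = set(question.lower().split())
    return sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )[:k]

def augment(question: str, chunks: list[str]) -> str:
    """Augmentation: dump the retrieved chunks into the prompt alongside the question."""
    context = "\n".join(f"- {c}" for c in chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}\n"

def generate(prompt: str) -> str:
    """Generation: stand-in for the LLM call that bullshits about the context."""
    return f"[model output conditioned on {len(prompt)} characters of prompt]"

if __name__ == "__main__":
    question = "What does RAG do with the context window?"
    print(generate(augment(question, retrieve(question, NOTES))))
```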

Think of it as exactly the same stuff the LLM folk have already tried to sell you: an attempt to work around the limitations of training and data availability by providing “cut and paste as a service”, generating ever more complex prompts for you in the hopes that this time you’ll pay more for it than it costs to run.

[–] [email protected] 6 points 1 day ago

This stuff is getting pushed all the time in Obsidian plugins (note-taking/personal knowledge management software). That kind of drives me crazy, because the whole appeal of the app is that your notes are just plain text you could easily read in Notepad, yet some people are chunking their notes into tiny, confusing bite-sized pieces so they're better formatted for a RAG (wow, that sounds familiar).
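For anyone wondering what that chunking looks like, it's roughly this. A minimal sketch with made-up numbers, not any particular plugin's code or settings:

```python
# Toy note chunker of the kind these plugins run over plain-text notes.
# chunk_size and overlap are arbitrary illustrative values.

def chunk_note(text: str, chunk_size: int = 200, overlap: int = 40) -> list[str]:
    """Split a perfectly readable note into overlapping character windows for a RAG index."""
    step = chunk_size - overlap
    chunks = []
    for start in range(0, max(len(text), 1), step):
        piece = text[start:start + chunk_size]
        if piece:
            chunks.append(piece)
    return chunks

note = "My notes are just plain text I could easily read in notepad. " * 10
print(len(chunk_note(note)), "bite-sized pieces")  # the note comes back as fragments
```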

Even without a RAG, using LLMs for searching is sketchy. I was digging through a lot of obscure Stack Overflow posts yesterday and kept thinking: how could an LLM possibly help with this? It takes less than a second to type in the search terms, and you only have to skim the titles and snippets of the results to tell if you're on the right track. You have the exact same bottleneck of typing and reading, except with ChatGPT or Copilot you also have to pad your query with a bunch of filler and then read all the filler slop in the answer as it streams in a couple thousand times slower than dial-up. Maybe they come out about even on simpler questions you don't have to interrogate, but then why even bother?

I've seen some people say ChatGPT is faster, easier, and more accurate than Stack Overflow, and even two crazy ones who said Stack Overflow is completely obsolete; trying to understand that perspective just causes me psychic damage.