[–] [email protected] 26 points 2 days ago (2 children)

If you think of LLMs as something with actual intelligence, you're going to be very unimpressed. It's just a model that predicts the next word.

This is exactly the problem, though. They don’t have “intelligence” or any actual reasoning, yet they are constantly being used in situations that require reasoning.

[–] [email protected] 1 points 1 day ago (1 children)

What situations are you thinking of that require reasoning?

I've used LLMs to create software I needed but couldn't find online.

[–] [email protected] 1 points 1 day ago (1 children)

Creating software is a great example, actually. Coding absolutely requires reasoning. I’ve tried using code-focused LLMs to write blocks of code, or even some basic YAML files, but the output is often unusable.

It rarely makes syntax errors, but it will do things like reference libraries that haven’t been imported or hallucinate functions that don’t exist. It also constantly misunderstands the assignment and creates something that technically works but doesn’t accomplish the intended task.
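
To make that concrete, here's a made-up sketch of the kind of thing I mean (not output from any specific model; the function names are just placeholders I picked for illustration):

```python
# Made-up illustration of the failure mode above: syntactically valid
# Python that calls a method the requests library does not actually have.
import re
import requests

def fetch_title_llm_style(url):
    response = requests.get(url)
    # Response has no .extract_title() method, so this raises
    # AttributeError at runtime -- the "hallucinated function" case.
    return response.extract_title()

def fetch_title(url):
    # What it actually takes: pull the <title> tag out of the HTML yourself.
    response = requests.get(url)
    match = re.search(r"<title>(.*?)</title>", response.text, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None
```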

[–] [email protected] 1 points 7 hours ago

I think coding is one of the areas where LLMs are most useful for private individuals at this point in time.

It's not yet at the point where you just give it a prompt and it spits out flawless code.

For someone like me, who is decent with computers but has little to no coding experience, it's an absolutely amazing tool/teacher.

[–] sugar_in_your_tea 5 points 2 days ago (1 children)

Maybe if you focus on pro- or anti-AI sources, but if you talk to actual professionals or hobbyists solving real problems, you'll see very different applications. If you go into it looking for problems, you'll find them; likewise, if you go into it looking for use cases, you'll find them.

[–] [email protected] 1 points 1 day ago (1 children)

Personally, I have yet to find a use case. Every single time I try to use an LLM for a task (even tasks they're supposedly good at), I find the results so lacking that I spend more time fixing its mistakes than I would have spent just doing it myself.

[–] Scubus 2 points 1 day ago (1 children)

So you've never used it as a starting point to learn about a new topic? You've never used it to look up a song when you can only remember a small section of lyrics? What about when you want to code a block of code that is simple but monotonous to code yourself? Or to suggest plans for how to create simple structures/inventions?
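
For the "simple but monotonous" case, this is the kind of chore I mean (a hypothetical sketch written for illustration, not actual LLM output; the function name is made up):

```python
# Hypothetical example of a simple-but-monotonous block you might hand
# to an LLM: prefix every file in a folder with its last-modified date.
import os
from datetime import datetime

def prefix_files_with_date(folder):
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        stamp = datetime.fromtimestamp(os.path.getmtime(path)).strftime("%Y-%m-%d")
        os.rename(path, os.path.join(folder, f"{stamp}_{name}"))
```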

Anything with a verifiable answer that you'd ask on a forum can generally be answered by an LLM, because they're largely trained on forums and there's a decent chance the training data included someone asking the question you are currently asking.

Hell, ask ChatGPT what use cases it would recommend for itself; I'm sure it'll have something interesting.

[–] [email protected] 1 points 1 day ago

as a starting point to learn about a new topic

No. I've used several models to "teach" me about subjects I already know a lot about, and they all frequently get many facts wrong. Why would I then trust them to teach me about something I don't know about?

to look up a song when you can only remember a small section of lyrics

No, because traditional search engines do that just fine.

when you want to code a block of code that is simple but monotonous to code yourself

See this comment.

suggest plans for how to create simple structures/inventions

I guess I've never tried this.

Anything with a verifiable answer that you'd ask on a forum can generally be answered by an LLM, because they're largely trained on forums and there's a decent chance the training data included someone asking the question you are currently asking.

Kind of, but here's the thing: it's rarely faster than just using a good traditional search, especially if you know where to look and how to use advanced filtering features. Also (and this is key), verifying the accuracy of an LLM's answer requires about the same amount of work as just not using an LLM in the first place, so I default to skipping the middleman.

Lastly, I haven't even touched on the privacy nightmare that these systems pose if you're not running local models.