this post was submitted on 19 May 2025
1568 points (98.0% liked)

Microblog Memes

8185 readers
3162 users here now

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot, tweet, etc. in the description of posts.

founded 2 years ago
[–] [email protected] 1 points 1 month ago

Well, this just looks like criteria for a financially successful person.

[–] [email protected] 0 points 1 month ago* (last edited 1 month ago)

Dumb take because inaccuracies and lies are not unique to LLMs.

half of what you’ll learn in medical school will be shown to be either dead wrong or out of date within five years of your graduation.

https://retractionwatch.com/2011/07/11/so-how-often-does-medical-consensus-turn-out-to-be-wrong/ and that's 2011, it's even worse now.

Real studying is knowing that no source is perfect, but being able to craft a true picture of the world using the most efficient tools at hand. And like it or not, LLMs are objectively pretty good already.

[–] [email protected] 0 points 1 month ago

In terms of grade school, essays and projects were of marginal or nil educational value, and they won't be missed.

Until the last 20 years, 100% of the grade in medicine was determined by exams.

[–] [email protected] 0 points 1 month ago* (last edited 1 month ago) (2 children)

This is fair if you're just copy-pasting answers, but what if you use the AI to teach yourself concepts and learn things? There are plenty of ways to avoid hallucinations and data poisoning and to obtain scientifically accurate information from LLMs. Should that be off the table as well?
