[–] [email protected] 3 points 9 months ago* (last edited 9 months ago) (1 children)

Uh yeah, I'm not sure. I've tried summarizing with AI tools, and there's the bot here on Lemmy that summarizes stuff... I never liked any of it. It's really a mixed bag, ranging from pretty okay summaries to ones that entirely miss the point of the original article to ones bordering on false information. I think we're far from there yet. It is a common use-case for AI, though. Maybe in 1-2 years I can stop being afraid of misinformation being fed to me, but currently the incorrectness of the information still outweighs any potential benefit. And the more complicated a text gets (which is exactly when you'd want a summary in the first place), the more biased and skewed the results get. So I don't see that happening in the very near future. But we should definitely keep doing the research and pushing it forward.

Tagging and organizing is something I'd like an AI for.

[–] [email protected] 2 points 9 months ago (1 children)

Imagine spending hours writing and editing something with care, only for an LLM to “summarize” it, completely missing any nuance or sarcasm, stripping out the creative bits and the humor, while also making the wrong point altogether. To top it off, anyone unwilling to read your story (their time is valuable, after all, but not yours, apparently) will now repeat the LLM’s interpretation to anyone they like, whether it’s accurate or not.

It’s an abysmal direction to take for misinformation, and an even more abysmal one for writers. Good content becomes irrelevant, and people become less and less willing to pay for a writer’s time and expertise. Why not write with an LLM if a large percentage of your readers are going to summarize the piece with an LLM anyway? Just need more eyeballs to justify our Google Ads spending.

Built into a “private” browser or not, it’s just another nail in the coffin of a web built by and for humans.

[–] [email protected] 2 points 9 months ago

I think you're completely right with that assessment. Journalism used to be a reputable profession, and explaining things, processing raw information into something the reader can actually consume, was deemed important. Especially getting it right. There's a whole process to it if you do it professionally. And curating content, deciding what is significant and deserves an audience, is equally important.

Doing away with all of that is like replacing your New York Times with your 5-year-old and whatever she picked up from watching the news.