this post was submitted on 20 Jul 2023
355 points (97.6% liked)

Programmer Humor


Post funny things about programming here! (Or just rant about your favourite programming language.)

founded 5 years ago
top 20 comments
[–] tiny_electron 46 points 1 year ago (2 children)

It is like compression... but backwards ^^

[–] [email protected] 33 points 1 year ago
[–] [email protected] 12 points 1 year ago (2 children)

now that you mention it...

the real question is how long before we just have automated agents sending corporate emails to one another without any human in the loop 🤣

[–] [email protected] 13 points 1 year ago

It's 100% already happening.

[–] [email protected] 5 points 1 year ago

Minus 10 years.

[–] [email protected] 24 points 1 year ago (2 children)

Most corporate communications are unnecessarily fluffy to begin with because it makes it look like more work was done. Most of the time I don't even understand why I'm explaining something and it feels like the only requirement is to have words on a page.

[–] [email protected] 11 points 1 year ago (1 children)

Sometimes the only requirement IS to have words on a page. Think about a disaster recovery plan, for example. Now, you probably don't want an LLM to write your disaster recovery plan, but it's a perfect example of something where the main value is that you wrote it down, and now you can certify that you have one.

[–] [email protected] 6 points 1 year ago (1 children)

I just asked GPT to create a disaster recovery plan for a ransomware attack, and the information it gave actually wasn't wrong or bad. But it's also very generic, and it will rarely, if ever, get the specifics of your applications right or tell you where to click.

[–] [email protected] 2 points 1 year ago

Right. Again, though, I don't recommend having an LLM do that particular chore for you.

[–] [email protected] 0 points 1 year ago

There's a whole book on the subject of bullshit jobs, incidentally: https://en.wikipedia.org/wiki/Bullshit_Jobs

[–] [email protected] 9 points 1 year ago (1 children)

beware! soon, it will be able to turn that long email into a meeting!

[–] [email protected] 3 points 1 year ago (1 children)

And another GPT will participate in it for me. Good.

[–] [email protected] 5 points 1 year ago

"Didja hear, Jeff had a heart attack."

"Wait... Jeff was a real person this entire time?"

[–] [email protected] 6 points 1 year ago

Something is wrong, why do AIs get to spend all their time writing and painting while we have to go to work every day?

[–] [email protected] 6 points 1 year ago (1 children)

This is a legitimate use case for LLM, though.

Not everyone can communicate clearly. Not everyone can summarize well. So the panel on the right is great for the people on the other end, who must read your poorly-communicated thoughts.

At the same time, some things must look like you put careful thought and time into your words. Hence, the panel on the left.

And if people on both sides are using the tool to do this, who's really hurt by that?

[–] [email protected] 9 points 1 year ago (1 children)

Yes, but there is a real risk here that either the expansion adds false details or the summary gets something wrong, and the summary is the bigger worry.

[–] [email protected] 1 points 1 year ago (1 children)

I don't disagree, but most business emails aren't quite that strict.

[–] [email protected] 2 points 1 year ago

It's not about formality. It's about the introduction of error. The less strict the communication, the more likely such errors are to slip in.

[–] [email protected] 3 points 1 year ago (1 children)

The AI arms race has begun!

Isn't this kinda thing happening already in the recruitment industry?

[–] [email protected] 1 points 1 year ago

pretty sure stuff like resume screening is done using machine learning nowadays