this post was submitted on 08 May 2025
73 points (98.7% liked)

Futurology

2722 readers

founded 2 years ago
top 10 comments
[–] [email protected] 24 points 1 month ago (1 children)

So a pretty solid representation of most corp execs then.

[–] [email protected] 4 points 1 month ago

It appears to be human nature; "leadership" just gets away with it because that's how a power structure works.

Little people get fucked; real people make money no matter what.

[–] [email protected] 12 points 1 month ago

Amoral bullshit machine? No wonder senior management types love these things.

[–] [email protected] 10 points 1 month ago

Unequivocal proof that it’s trained on human behavior

[–] [email protected] 7 points 1 month ago

Congratulations! We can now start replacing finance officers with AI!

/s

[–] [email protected] 4 points 1 month ago (1 children)

Assigning a lot of morality and intention to word generating algorithms. "Oh no! The thing made to generate text to a prompt generated text to a prompt!"

[–] merc 1 points 1 week ago

Especially because the data it has been trained on is probably not typical for a CFO in the real world. In the real world it's a lot of boring day-to-day stuff that isn't worth writing up. The stuff worth writing up is exciting thrillers where a CFO steals money, or news reports where a CFO embezzles.

Imagine prompting it with "It was a dark and stormy night" and expecting it to complete that with "so David got out his umbrella, went home, ate dinner in front of the TV, then went to bed." It's probably what most Davids would actually do, but it's not what the training data is going to associate with dark and stormy nights.

[–] [email protected] 3 points 4 weeks ago

I mean - yeah? What complete dipshit wouldn't expect it to do that?

[–] [email protected] 2 points 1 month ago

-"You're hired!"

[–] [email protected] 1 points 1 month ago

I read that they trained it on a Trump dataset though.