mii

joined 10 months ago
[–] [email protected] 9 points 8 months ago (3 children)

A friend saw this on LinkedIn.

[–] [email protected] 6 points 8 months ago* (last edited 8 months ago) (1 children)

In case you didn’t know, Sam “the Man” Altman is deadass the coolest motherfucker around. With world leaders on speed dial and balls of steel, he’s here to kick ass and drink milkshakes.

Within a day of his ousting, Altman said he received 10 to 20 texts from presidents and prime ministers around the world. At the time, it felt "very normal," and he responded to the messages and thanked the leaders without feeling fazed.

Archive link.

[–] [email protected] 6 points 8 months ago* (last edited 8 months ago)

nice. I've got one of the others after I found it a while back, but it looks like yours has a narrower scope?

Yeah, that one actually includes mine and covers more bases. When I started my list, I think they only had a uBlock-based blocker, which was too aggressive for me (and does not work properly on mobile), and there were many small uBlacklist lists which I just combined into one.

I think these days most of them are similar anyway.

[–] [email protected] 16 points 8 months ago* (last edited 2 months ago) (3 children)

It’s already too late for a lot of places, imo. DeviantArt for example is overrun by LLM-generated sludge and no amount of cleanup will undo that; and that site has been a staple of amateur and up-and-coming artists for decades. The same seems to be happening to Pixiv (which is big in Japan), too. Search engines are also full of generated SEO spam and it’s getting worse, with image search being close to useless unless you implement some sort of blocklist. Blocklists for that use case luckily already exist and aren’t bad (shameless self-plug), but it’s still a manual step you have to take and won’t help my grandma who’s looking for cookie recipes.

The silver lining might be that a growing number of people are willing to try decentralized solutions. I’ve seen more non-techies come over to Lemmy, Mastodon and Misskey as a result, but it’s still sad to see, especially because this will ultimately lead to tons of older content becoming either lost or needles in a shitstack you can’t ever hope to recover.

[–] [email protected] 11 points 8 months ago (1 children)

He’s either trying to generate new critihype by making Clippy intelligent again (“It learns just like those pesky hoomans do!”), or slither his way out of that lawsuit by claiming it couldn’t have stolen original ideas when there have never been any original ideas in the first place.

I’m still trying to figure out what’s stupider.

[–] [email protected] 18 points 8 months ago* (last edited 8 months ago) (8 children)

Creativity is a lie. You heard it here first.

[–] [email protected] 14 points 8 months ago (1 children)

I know this is like super low-hanging fruit, but Reddit’s singularity forum (AGI hype-optimists on crack) is discussing the current chapter in the OpenAI telenovela and can’t decide whether Ilya and Jan Leike leaving is good, because no more lobotomizing the Basilisk, or bad, because no more lobotomizing the Basilisk.

Yep, there’s no scenario here where OpenAI is doing the right thing, if they thought they were the only ones who could save us they wouldn’t dismantle their alignment team, if AI is dangerous, they’re killing us all, if it’s not, they’re just greedy and/or trying to conquer the earth.

vs.

to be honest the whole concept of alignment sounds so fucked up. basically playing god but to create a being that is your lobotomized slave…. I just dont see how it can end well

Of course, we also have the Kurzweil fanboys chiming in:

Our only hope is that we become AGI ourselves. Use the tech to upgrade ourselves.

But don’t worry, there are quiet voices of reason in the comments, too:

Honestly feel like these clowns fabricate the drama in order to over hype themselves

Gee, maybe …

no ,,, they’re understating the drama in order to seem rational & worthy of investment ,, they’re serious that the world is ending ,, unfortunately they think they have more time than they do so they’re not helping very much really

Yeah, never mind. I think I might need to lobotomize myself now after reading that thread.

[–] [email protected] 15 points 8 months ago

Is increasing the amount of long-term health risks code for showering even less?

[–] [email protected] 17 points 8 months ago (1 children)

And we can do all of that by just scaling up autocomplete which is basically already AGI (if you squint).

How come the goal posts for AGI are always the best of what people can do?

I can't diagnose anyone, yet I have GI.

But it shouldn't surprise me that their benchmark of intelligence is basically that something can put together somewhat coherent-sounding technobabble while being unable to do something my five-year-old kindergartner can.

Yup, basically AGI.

[–] [email protected] 14 points 8 months ago* (last edited 8 months ago) (1 children)

Don't forget about how the presentations for Google Gemini and OpenAI Sora and Devin were all ~~faked~~ embellished too, but no one talks about that anymore either.

I'm still waiting for even one actual use-case for sexy Clippy that isn't generating SEO spam while speedrunning climate change.

[–] [email protected] 7 points 9 months ago (1 children)

I swear to god, starting a nature vs. nurture debate in a place leaning even slightly libertarian just breaks my brain because I never understand what point they’re even trying to make.

Half of them seem to argue that this is proof rich people stay rich and poor people stay poor (although I fail to see from which side they’re coming), while the other half uses it as a thinly veiled excuse to be racist without mentioning race.

[–] [email protected] 10 points 9 months ago* (last edited 9 months ago) (1 children)

Oh look, another company’s “we won’t data scrape against our users’ wishes to feed spicy autocomplete” mask has just crumbled. I am so surprised.

Let’s see how many of my comments I can delete or overwrite with garbage before they ban me.
