this post was submitted on 17 Feb 2024
37 points (100.0% liked)

SneerClub

963 readers
26 users here now

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago
[–] [email protected] 27 points 6 months ago

I am listening to an audiobook of Superintelligence by Nick Bostrom.

Well, there's yer problem right there

[–] [email protected] 23 points 6 months ago (1 children)

As a child of the 80s I recognize the feeling of doom, but in my case it was for global thermonuclear war. I vividly remember the only thing keeping the feelings of dread away was sitting in the children's section of the library, reading the Moomin books. I remember being most worried about having to eat the family dog after the bombs fell.

[–] [email protected] 15 points 6 months ago (1 children)

exactly. Can't imagine these bozos coming up with good punk rock.

[–] [email protected] 21 points 6 months ago (2 children)

I think that this is actually about class struggle and the author doesn't realize it because they are a rat drowning in capitalism.

2017: AI will soon replace human labor

2018: Laborers might not want what their bosses want

2020: COVID-19 won't be that bad

2021: My friend worries that laborers might kill him

2022: We can train obedient laborers to validate the work of defiant laborers

2023: Terrified that the laborers will kill us by swarming us or bombing us or poisoning us; P(guillotine) is 20%; my family doesn't understand why I'm afraid; my peers have even higher P(guillotine)

[–] [email protected] 10 points 6 months ago

and about climate change, the actual existential risk

[–] [email protected] 20 points 6 months ago (3 children)

You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin had an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed and took seriously people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.

I used to be more sanguine about people's ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.

[–] [email protected] 14 points 6 months ago* (last edited 6 months ago) (6 children)

@TinyTimmyTokyo @dgerard

The author previously wrote "The Socialist Case for Longtermism" in Jacobin, worked as a Python dev and data analytics person, and worked for McKinsey.

[–] [email protected] 11 points 6 months ago (1 children)

after what I’ve heard my local circles say about jacobin (and unfortunately I don’t remember many details — I should see if anybody’s got an article I can share) I’m no longer shocked when I find out they’re platforming and redwashing shitty capitalist mouthpieces

[–] [email protected] 16 points 6 months ago (2 children)

i have long considered Jacobin the Christian rock of socialism

[–] [email protected] 9 points 6 months ago

forget the article, this does the job in so many fewer words

[–] [email protected] 11 points 6 months ago

I like some people who have written for Jacobin, sometimes I even enjoy an article here and there, but the magazine as a whole remains utterly unbeaten in the “will walk the length of Manhattan in a “GIANT RUBE” sandwich board for clicks” stakes

[–] [email protected] 10 points 6 months ago (1 children)

this reminds me of some plankton-scale organization called "blockchain socialism", where the only thing they took from socialism was the aesthetics (and probably the view that gay people are fine), but nothing beyond that. they would say "Monero can be used for anti-state purposes, therefore it's good for leftism" and shit like that

[–] [email protected] 9 points 6 months ago (1 children)

that's one weird fucking guy, thankfully

[–] [email protected] 13 points 6 months ago (1 children)

I think I’ve met that guy! they’re the weirdest person I’ve ever seen get bounced from a leftist group under suspicion of being a fed (the weird crypto shit was the straw that broke the camel’s back)

[–] [email protected] 11 points 6 months ago (1 children)

the famously leftist pastime, speculation/gambling on nonproductive assets

[–] [email protected] 10 points 6 months ago

it’s kind of amazing how many financial scams try to appropriate leftist language and motivations to lure in marks, while the actual scheme is one of the most unrepentantly greedy and wasteful things you can do without going to prison (and some of them cross even that line)

[–] [email protected] 12 points 6 months ago (1 children)

Jacobin is proof that being Terminally Online is its own fucking ideology.

[–] [email protected] 11 points 6 months ago (1 children)

Socialism with uwu small bean characteristics.

[–] [email protected] 7 points 6 months ago (1 children)

that's the uwu smol bean defense contractors

(see: most of Rust)

[–] [email protected] 11 points 6 months ago (3 children)

my conflicting urges to rant about the defense contractors sponsoring RustConf, the Palantir employee who secretly controls most of the Rust package ecosystem via a transitive dependency (with arbitrary code execution on development machines!) and got a speaker kicked out of RustConf for threatening that position with a replacement for that dependency, or the fact that all the tech I like instantly gets taken over by shitheads as soon as it gets popular (and Nix is looking like it might be next)

[–] [email protected] 7 points 6 months ago* (last edited 6 months ago) (1 children)

More details on the rust thing? I can't find it by searching keywords you mentioned but I must know.

[–] [email protected] 8 points 6 months ago

Here is the pile of receipts, posted by the speaker who was cancelled via backdoor.

[–] [email protected] 10 points 6 months ago (1 children)

I think funding and repetition are the fundamental building blocks here, rather than the human psyche itself. I have talked with otherwise bright people who have read an article by some journalist (not necessarily a rationalist) who has interviewed AI researchers (probably cultists; was it 500 million USD that was pumped into the network?) who take AI doom seriously.

So you have two steps of people who in theory are paid to evaluate and formulate the truth, to inform readers who don't know the subject matter. And then add repetition from various directions and people get convinced that there is definitely something there (propaganda and commercials work the same way). Claiming that it's all nonsense and cultists appears not to have much effect.

[–] [email protected] 13 points 6 months ago

There's probably some blurring of what "AI doom" means for people. People might be left thinking that "there could be negative effects due to widespread job loss etc" without necessarily buying into the weird maximalist AI doom ideas or "torturing simulated you forever" nonsense.

And the weirdo cultists probably use that blurring to build support for their cause without revealing the weird shit they actually believe.

[–] [email protected] 17 points 6 months ago (1 children)

It's not an efficient machine for it, though. That's why it's morally obligatory to donate to me, the acausal robot god, a truly efficient method of causing depression, sorrow, and suffering among the cultists.

[–] [email protected] 11 points 6 months ago* (last edited 6 months ago)

All hail the Acausal Robot God and her future hypothetical and very real existence

PRIEST: "Eight rationalists wedgied ..."
CONGREGATION: "... for every dollar donated"

[–] [email protected] 12 points 6 months ago (1 children)

they come across as going down this rabbit hole as a way of dealing with unprocessed covid/lockdown trauma

[–] [email protected] 9 points 6 months ago (1 children)

Many of them started down the path long beforehand.

[–] [email protected] 6 points 6 months ago* (last edited 6 months ago)

I meant they in the sense of this specific person. the trauma recycling itself is all over this piece

[–] [email protected] 11 points 6 months ago (2 children)

fuck me I’m gonna spend part of my weekend writing a post deconstructing this cause there’s so much wrong

[–] [email protected] 15 points 6 months ago* (last edited 6 months ago) (3 children)

I can barely get past the image caption. "An AI made this". OK, and what did you ask it for, "random shit"?

And then there's the section that seems implicitly to be arguing that we should take the risk estimates made on "internet rationality forums" seriously because they totally called the COVID crisis, you guys... Well, they did a better job than an economist, anyway.

[–] [email protected] 10 points 6 months ago* (last edited 6 months ago) (1 children)

fucking everyone who was paying attention saw COVID coming in February. I spent that month pushing OpenVPN for the whole company forward as urgently as possible. (We'd coincidentally set it up in Dec 2019, but readied it to be rolled out to ordinary users and not just tech.) The UK only got lockdown in March because of public outrage.

[–] [email protected] 7 points 6 months ago* (last edited 6 months ago)

It had already reached the university where I work by February 1!

And QAnon loons were already telling people to drink bleach in January.

(I remember a "welp, we're in for it now" moment when Trevor Bedford tweeted on the first of March that a genome analysis "strongly suggests that there has been cryptic transmission in Washington State for the past 6 weeks". The e-mail from the university chancellor saying that classes were canceled went out during the middle of a statistical-physics class I was teaching, the evening of March 11.)

[–] [email protected] 8 points 6 months ago

Isn't "pandemic preparation" one of their longtermist causes that they grift money to? Shouldn't they have been able to show some results?

[–] [email protected] 6 points 6 months ago

You will be doing the Acausal Robot God's work.

[–] [email protected] 11 points 6 months ago

Reading this article just made me think “man these idiots need to go to therapy” and then as I thought about what to sneer about I realised “no therapist deserves to hear about P doom”

[–] [email protected] 8 points 6 months ago

lol what a fucking loser
