this post was submitted on 27 Jul 2023
1317 points (97.3% liked)

[–] PrincessLeiasCat 206 points 1 year ago (3 children)

While the media posted by the influencer has been removed, numerous text interactions from his followers with the deleted posts are still on the platform. Some of those posts mention that a child depicted in the photos was as young as one and a half years old.

Source

To make matters worse, the image appears to have been on the platform for several days before being removed. Lucre even described the image in detail in a separate tweet, noting that it had been taken from a video. The video in question involved the abuse of three children, one of whom was reportedly strangled to death after the filming.

Source

The CSAM was left up for FOUR DAYS (July 22-26) before he was even suspended. Then they let him "delete" it...and reinstated him. People commented during those 4 days, DESCRIBING THE IMAGES.

What the FUCK. Please tell me this is worth a visit from the FBI, removal from the App Store, some massive GDPR violation, fucking something. How is this story not bigger news?

To put this in perspective: 4 graphic CSAM images posted by an account with 500,000 followers were left up on Twitter for 4 days. The person who posted them was suspended for less than a day.

[–] [email protected] 94 points 1 year ago (2 children)

Content warning: I deliberately avoid providing much more detail than "it was clearly CSAM", but I do mention the overall tweet contents and pretext.

I remember this from when it happened and unfortunately did see the text portion and thumbnail from the original tweet.

He did it under the pretext of reporting on the arrest of a person involved in the video and in large-scale CSAM production. It started in standard news-report style, listing the name, age, and arrest details of someone taken into custody. Initially it looked like the usual alt-right tweet along the lines of "look at how paedophilia is rampant and the world is sinful!"

The guy describes himself as "chief trumpster", a "breaker of narratives", and a journalist. He claimed the details of the CSAM were provided by the Dutch police. He then described the title and the detailed events of a CSAM video in the tweet. Unfortunately for me, those details were below the tweet fold, so I had no idea where it was going until I expanded it.

The tweet's image attachment, or link-unfurl thumbnail, was a frame from the video itself: an otherwise-SFW image of the adult abuser being discussed. Unfortunately, I didn't realise what the thumbnail was until after I had expanded the tweet text. I actually thought it was an OpenGraph error at first.

Even in the context of "reporting shocking content", the tweet was way over the line and went from 0 to 100 in a few words. I did not need the info on the CSAM; nobody except the police and the courts does. The video title alone was over the line.

Musk phrasing this as another "I was told" decision is just him knowingly deflecting responsibility.

[–] PrincessLeiasCat 21 points 1 year ago (1 children)

Thank you for the additional context. I’m sorry you had to be exposed to that.

[–] [email protected] 21 points 1 year ago

Thanks, it was partly my fault for taking a curiosity tour of Twitter to see whether there was a noticeable right-wing shift from a few months earlier.

Hopefully I can at least prevent someone from going to look for the context themselves and finding the full tweet, because it was awful even as text.

I assumed there would be "they took it out of context!" apologia when it was inevitably reported on, but the actual context didn't improve anything or absolve anyone of responsibility.

My heart goes out to the human moderators at Twitter who had to see more of it and didn't have the choice to bail before learning more. And obviously also to the victims of one of the most heinous acts I've ever heard about.

[–] [email protected] 8 points 1 year ago* (last edited 1 year ago) (1 children)

Thanks for this. Seeing this explained, the whole thing makes much more sense. Not in a good way, but now I understand. Edit: fixed typo

[–] [email protected] 7 points 1 year ago (1 children)

For what it's worth, I had no idea if it was the absolute worst journalistic judgement I have ever seen, or a way for him to find more CSAM, or some bizarre combo. That is something for the FBI to find out. But I do know the decision to unban him is beyond wild, even for someone trying to bankrupt a social media company. Text almost never makes me physically recoil.

[–] [email protected] 1 points 1 year ago

Yeah, even if we give them the benefit of the doubt that it was a bad attempt at journalism, there's no reason to defend them. Also bizarre that Elon would even get involved in it.

[–] ikapoz 22 points 1 year ago (2 children)

Do I want to know what CSAM stands for?

[–] [email protected] 34 points 1 year ago (2 children)

Child sexual abuse material, unfortunately.

[–] ikapoz 20 points 1 year ago (1 children)

Yep. Didn’t want to know that.

[–] [email protected] 35 points 1 year ago (1 children)

Unfortunately, it’s a term everyone should know. It replaces the label "child porn": while the material is universally recognized as horrible, it’s not "porn." It’s evidence of child sexual abuse. Hence "child sexual abuse material."

[–] [email protected] 9 points 1 year ago

Right. Porn implies acting and consent.

[–] [email protected] -1 points 1 year ago

Is it pronounced KaZAM!?