PotatoKat

joined 1 year ago
[–] [email protected] 1 points 2 months ago

Here you go, a simplified version of the meme that I'm 100% sure addresses all of your complaints /s

https://lemmy.world/pictrs/image/a611d025-3163-427d-bf32-3a6b37f716a3.png

[–] [email protected] 10 points 2 months ago (1 children)

In what way are both sides unhappy?

[–] [email protected] 1 points 2 months ago

Except her pre-transition fastest 1000 free was faster than the record for female 1000 free.

To add on to that: her pre-transition time was ~24 seconds slower than the male record, and post-transition her 1000 free was about 32 seconds slower than the female record. So if anything she was performing better in her category before she transitioned.

[–] [email protected] 1 points 2 months ago

With Terrence Howard making 1x1=2 popular on Rogan, they might make it there.

[–] [email protected] 13 points 2 months ago (2 children)

Also, the 4th movie shows that something is alive if it's considered a toy/played with. So Sid did consider his creations toys that he played with.

[–] [email protected] -1 points 2 months ago

Oh my god your mor pathetick then i thoght. Corecting grammer online and folloing my replys

[–] [email protected] -2 points 2 months ago (3 children)
[–] [email protected] 2 points 2 months ago (1 children)

Bloodborne 60fps

[–] [email protected] 1 points 2 months ago (1 children)

It's the next logical step for the pearl clutchers and amounts to "thought crime."

I seriously doubt they would create any more surveillance for that than there already is for real CSAM.

The Geek Squad worker could still report these people, and it would be the prosecution's job to prove that they were acquired or created in an illegal way.

That would just make it harder to prosecute people for CSAM, since they would all claim their material was just AI. That would end up helping child abusers get away with it.

Possession itself isn't the problem; the problem is how they're produced.

I think the production of generated CSAM is unethical because it still involves photos of children without their consent.

No, because that increases demand for child abuse. Those pictures are created through the abuse of children, and getting access to them encourages more child abuse to produce more content.

There is evidence to suggest that viewing CSAM increases child-seeking behavior, so viewing generated CSAM would most likely have the same, or at least a similar, result. That would mean that even just having access to the materials would increase the likelihood of child abuse.

https://www.theguardian.com/global-development/2022/mar/01/online-sexual-abuse-viewers-contacting-children-directly-study

The survey was self-reported, so the reality is probably higher than the 42% cited in the study.

I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children.

The best legal avenue for non-offending pedophiles is to find a psychologist who can help them work through their desires, not to give them something that will make them want to offend even more.

[–] [email protected] 3 points 2 months ago (1 children)

I tried Nobara on my lappy and it just did not work with my GPU (GTX 960M). No matter what I tried and installed, it just wouldn't work. I switched over to Pop!_OS and it's been working like a charm since. So YMMV with whatever OS you try; don't be afraid to switch to another if one isn't doing it for you.

[–] [email protected] 6 points 2 months ago

Sadly, it probably won't have Mick Gordon doing the music again after what happened with Eternal.
