ChunkMcHorkle

joined 1 year ago
[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

> That’s not how this works.

So you can't name one either.

> If you’re running an instance, it’s your responsibility to find an appropriate tool.

That's exactly what the admin here did. What's more, he was forced to do it by the lack of "appropriate" or even adequate tools.

Hence the straightforward question you failed to answer.

I don't run an instance; CSAM is but one of many reasons why. But I have been paying attention to the discussions about the flood of it here, and to how nearly impossible it is to prevent or block it on a federated instance when you are starting from scratch.

But for reasons I cannot begin to fathom, and with an intense interest in seeing this anti-CSAM tool remain unused, you are blithely sailing past all of that, demanding a tool you personally could not even name, one that obviously does not exist in acceptable form, or it would already have been gladly implemented.

Glad he's ignoring you and carrying on. I think I'll do the same.

[–] [email protected] 4 points 1 year ago (5 children)

> For abuse detection, you need to use a service that has been vetted by an actual lawyer.

Name one. One that exists and already works on Fediverse instances.

[–] [email protected] 3 points 1 year ago

> The crime happened in the past when the children were abused.

That's true. You could look at it that way, stop right there, and remain absolutely correct. Or you could look at it from the viewpoint of the victim as a human being: as long as that picture exists, they are victimized anew by every use of it, even if the act itself happened decades ago.

Not trying to pile on, but anyone who has suffered that kind of violation as a child suffers for life to some extent. Many kill themselves, and even more cannot escape addiction, because the addiction is the only safe mental haven where life itself is bearable. More still have PTSD and other mental difficulties that are beyond the understanding of those whose childhood development was not shattered that way, or worse, for whom that kind of abuse was a regular occurrence growing up.

So to me, adding a visual record of that violating act to the public domain, where anyone can find it and use it for sick pleasure, is an extension of the original violation and not very different from it.

The visual records are a sick gift that never stops giving, worse still if the victim knows the pictures or videos are out there somewhere.

I am well aware not everyone sees it this way, but an extra bit of understanding for the victims would not go amiss. Imagine being an adult browsing the web, thinking it's all in the past and maybe you're safe now, and stumbling across a picture of yourself being raped at the age of five, or worse still, having a friend, family member, spouse, or child stumble across it.

So, speaking only for myself, I think CSAM is a moral crime every time it is accessed, one of the most hellish that can be committed against another human being, regardless of the specifics of the law.

I don't have a problem with much else that people share, but goddamn I do have a problem with that.