this post was submitted on 20 Jun 2023
World News

cross-posted from: https://vlemmy.net/post/153082

Disclaimer: No images are used in the article.

top 25 comments
[–] fraval 18 points 1 year ago (5 children)

I'm not saying this is a good thing, but I'd rather it be generated by AI than the real thing... Still fucked up, though.

[–] CitizenKong 7 points 1 year ago (1 children)

The really sick aspect of this is that someone probably fed the AI thousands of real child porn images to generate the fake ones.

[–] Protegee9850 -1 points 1 year ago (2 children)

That’s a hell of a leap and seems to be based in ignorance of the technology.

[–] LufyCZ 2 points 1 year ago (1 children)

Your comment is based in ignorance of the technology. To have AI spit out images of a specific type, you first have to feed it images of that type.

[–] Protegee9850 7 points 1 year ago* (last edited 1 year ago)

Again, you’re obviously ignorant of how this stuff actually works. That is simply not the case. Otherwise the training set would need to contain images of every type you hope to generate, which is impossible and obviously isn’t the case - a quick look at some of the crazier things people have generated disproves it. Training the model on nude and clothed images of adults and clothed images of children - as others have pointed out - would allow you to generate nude images of children. Could a model have been fine-tuned with CSAM? Yes, but it’s certainly not a given, and probably not necessary.

The stable diffusion sub has somewhat migrated over to the fediverse. You can find more information about how this stuff actually works beyond your introductory understanding of the concept there.

[–] ToastyWaffle 1 points 1 year ago

How do you think AI machine learning works? It's all based on Large Language Models aka a shit ton of real data.

[–] [email protected] 4 points 1 year ago (1 children)

...how did they train the model though?

[–] mido 10 points 1 year ago* (last edited 1 year ago)

Not (necessarily) with real naked children. With kids with clothes, adults with clothes, and naked adults.

It's not hard for the AI to transpose from a clothed child to a naked one; it's basically the same thing as switching your gender or making you look old.

[–] [email protected] 1 points 1 year ago

It's equally illegal, at least in my country.

[–] MercuryUprising -1 points 1 year ago (1 children)

It's still fucked up, because it starts with shit like this before moving to real-world encounters. Over time the predator's brain will see it as a new normal and will want to escalate.

[–] Noedel 6 points 1 year ago (1 children)

Do you know this or do you think this? I'm relieved I don't have much insight into the mind of pedos... But couldn't it be the other way around too?

[–] SterlingVapor 6 points 1 year ago

Access to pornography lowers incidence of sexual assault

So it very much looks to be the other way around

[–] trachemys -2 points 1 year ago (2 children)

You’ll still go to jail, even if it is fake, even if it is a cartoon character.

[–] phx 6 points 1 year ago

Depends on the country. Some only criminalize depictions where children were exploited or harmed, so the cartoon stuff might get a pass despite being nasty. With AI images, I'd imagine it might be hard to prove they aren't real children, and at that point they might be treated like a robbery with a fake weapon or selling fake drugs (still chargeable as the real thing in most places).

[–] Aiastarei 6 points 1 year ago (1 children)
[–] trachemys 3 points 1 year ago

The Bart Simpson porn case was in the UK, I think.

[–] stanleytweedle 12 points 1 year ago (1 children)

"Roughly 80 percent of respondents" to a poll posted in a dark web forum with 3,000 members said that "they had used or intended to use AI tools to create child sexual abuse images,"

wtf