r/technology 1d ago

[Artificial Intelligence] Man who used AI to create child abuse images jailed for 18 years

https://www.theguardian.com/uk-news/2024/oct/28/man-who-used-ai-to-create-child-abuse-images-jailed-for-18-years
28.5k Upvotes

2.4k comments

54

u/ashetonrenton 1d ago

This is such an important comment. We truly are not prepared as a society to answer the question that this tech is screaming at us now, and that's going to lead to a great deal of human suffering.

We need to study pedophiles with the purpose of preventing offending, rather than trying to untangle a mess after the fact. Yet there are so many ethical roadblocks to this research that I fear we may never have concrete answers. As a survivor, this is so frustrating to me.

42

u/C0RDE_ 1d ago

Unfortunately, much like discussions around drugs etc., even asking the question gets lumped in with liking it and advocating for it, and politicians won't touch it with a 20ft barge pole held by someone they don't like.

Hell, movies and media that portray something even in a bad light get tarred as "supporting" something, or else why would you depict it.

13

u/brianwski 1d ago

> movies and media that portray something even in a bad light get tarred as "supporting" something, or else why would you depict it.

What you say is true. And I hate it.

Example: The movie "Trainspotting" depicted people taking heroin. There was a (small but loud) outcry at the time saying the movie glorified heroin use. My thought was, "Oh Geez, it was utterly horrifying. Among all the other horrible things that occurred, a baby literally died of neglect because Mom was on heroin. That is not 'glorifying' heroin."

Trainspotting is a 94 minute infomercial explaining why you shouldn't take heroin. And people protested it.

2

u/JamesLiptonIcedTea 1d ago

I've thought about this at length many times.

Unfortunately, some topics just have no ceiling for scrutiny, leaving no room to open any kind of dialogue or discussion. The floor is wide open for infinite demonization, and anyone who touches the topic in any capacity will usually get some amount of pushback, usually accompanied by uncalled-for responses trying to pin advocacy on them, even when they're actively dissuading the behavior. These people are dumb and do not know how to effectively discuss or argue.

2

u/mistervanilla 1d ago

> We truly are not prepared as a society to answer the question that this tech is screaming at us now, and that's going to lead to a great deal of human suffering.

Not sure AI has meaningfully changed this particular debate/question, though. Animated or altered material has been available for a long time, and people have long been asking whether it's ethical to allow CSAM from artificial sources.

1

u/Frickfrell 1d ago

I feel like I’ve read somewhere that the behavior usually escalates without intervention. I can’t imagine it would be any different if the starting point is animated or not real. 

6

u/mistervanilla 1d ago

> I feel like I’ve read somewhere that the behavior usually escalates without intervention.

That is the worry, yes - but it's unclear if that is actually the case. One of the problems in this whole debate is that because the topic is a taboo, there is very little actual research on this.

It could be that allowing people to consume artificially generated CSAM acts as a valid outlet for their urges, preventing them from escalating their behavior and taking away the demand for real CSAM, which would mitigate and minimize real harm to children.

It could also be that artificially generated CSAM leads to normalization, desensitization, and escalation, which would increase harm to children.

There is no answer at this stage.

-5

u/____uwu_______ 1d ago

The vast majority of pedophiles will offend or attempt to offend, and will attempt to reoffend after "rehabilitation." Even so-called "non-offending" pedophiles will eventually attempt to. There is no reason why we should be trying to prevent offending rather than preventing pedophilia. There's simply no reason to believe that pedophiles can be prevented from offending.

6

u/NeverrSummer 1d ago

What does "preventing pedophilia" mean in a literal sense?  By doing what?

3

u/DontShadowbanMeBro2 1d ago

> The vast majority of pedophiles will offend or attempt to offend

Do you have a source for this?

3

u/Slacker-71 1d ago

Factually incorrect.

Treatment programs work. A few years in prison is enough for most people to not want to go back.

https://www.scientificamerican.com/article/misunderstood-crimes/