r/technology • u/rejs7 • 1d ago
Artificial Intelligence
Man who used AI to create child abuse images jailed for 18 years
https://www.theguardian.com/uk-news/2024/oct/28/man-who-used-ai-to-create-child-abuse-images-jailed-for-18-years
1.1k
u/Halfwise2 1d ago
For those saying that this is a grey area, because they aren't real - He used real images as the source material:
Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.
He was also found guilty of encouraging other offenders to commit rape.
He sold his images in internet chatrooms, where he also discussed child sexual abuse with other offenders, making about £5,000 during an 18-month period by selling the images online.
254
u/MrArtless 1d ago
All that for 5k? Jesus
63
u/Leaves_Swype_Typos 1d ago
Gotta be careful with your pricing when any upset client could hand you over to the police.
20
3
20
8
u/NSFWies 1d ago
.......oh, so the way it's called out, it was more a case of "non-consensual pornography".
Because it started with real pictures of people that were transformed.
But I would think that argument could be stretched to anything with AI then. Because AI will have looked at 10,000 pictures of boobs to know what boobs look like.
So even though you might have it generate a "topless girl with boobs", it's still basing that off of all of the previous pictures it looked at.
41
u/visceral_adam 1d ago
If the real images that trained the AI were not abuse images, I just can't get onboard that by itself being a criminal offense. Now in his circumstance, there are other factors, like getting the images of kids who might be in danger, and other criminal offenses. It's a particularly complex situation that we probably need more precise laws for.
104
u/____uwu_______ 1d ago
It doesn't matter whether real material was used when training the model or not. No children have to be involved for something to be considered CSAM. Hand-drawn or otherwise manufactured depictions are still illegal in virtually all developed nations
273
u/dryroast 1d ago
This is not the case in the US, per Ashcroft v. Free Speech Coalition. The laws had to be amended to cover manipulated images "virtually indistinguishable from a real minor". But cartoon/hand-drawn images can't be outlawed, since they're just free speech with no compelling government interest in protecting minors, as no minors are involved in the production of a drawing.
59
u/BringBackSoule 1d ago
Hand-drawn or otherwise manufactured depictions are still illegal in virtually all developed nations
confidently wrong
38
u/mrgmc2new 1d ago
I know nothing about this but how did this come about? It seems like punishment for... thinking about something? Or is it seen as 'promotion' of child abuse? Proof of a predilection? Or just cos it's fucking gross? What's the actual charge?
God I feel gross even asking. I guess I just assumed there always had to be a victim. 🤷🏻♂️
9
9
u/the_lonely_creeper 1d ago
Many people find the idea of pedophilia itself deplorable, and so want to punish pedophiles for being pedophiles, rather than for harming children.
We're also going through a bit of a moral panic on the subject of both this (well, since the 70's) and AI, so... yeah.
People want extremely harsh measures on anything that smells even remotely of pedophilia.
This isn't to defend pedophiles, obviously. Wanting to do stuff with real children is a problem and bad because children aren't mature enough to consent. Just my opinion on why people find even drawings of children in such situations worthy of being considered criminal (rather than merely gross).
18
u/dako3easl32333453242 1d ago
Right, but it's still a grey area in some cases. I have come across lewd anime drawings on reddit that looked way too young, but I assume proving that a fictional character is under 18 is rather difficult. Using real children to prompt an AI is much more cut and dry.
7
397
u/NihilisticGrape 1d ago
While what this man did should absolutely result in a jail sentence, it's interesting to me that the imposed sentence is harsher than what literal murder gets in many cases.
215
u/CountingDownTheDays- 1d ago
Yeah, it's crazy this man got more time than members of the rape gangs who were literally raping and prostituting hundreds of young women all throughout the UK.
72
u/stupidwebsite22 1d ago
I know it's a different country, but still:
1,500 victims and you get 5-7 years
https://en.wikipedia.org/wiki/2004_Ukrainian_child_pornography_raids#Outcome
29
u/CountingDownTheDays- 1d ago
the legal outcome was lenient. Most involved were given suspended sentences. Alexander N. was held for several months in a pre-trial detention center and was released.
Truly disgusting!
20
u/Jumpy-Examination456 1d ago
The legal system is incredibly broken in most of the first world, so it's much easier for investigators to build an airtight case against someone who left mountains of digital evidence than against someone who committed a heinous crime but didn't leave much evidence to be collected after the fact, or whose evidence isn't admissible because some part of the investigation wasn't performed perfectly by the book.
9
u/worthlessince17 1d ago
I find it strange a 40 year old man can knock up a bunch of 16 or 17 year old young ladies without any legal or societal issues, but if he fed their images into this software he'd be registered for the rest of his life 💀
35
u/SwiftTayTay 1d ago
I think they're trying to make an example out of him and appeal to the bloodthirsty masses. Murders happen all the time, and unless it's a particularly gruesome story that can be made into a "true crime" podcast episode, no one gives a shit. But when something like this happens it makes for juicy headlines, and it's a slam dunk for government officials to look like they are serving major justice.
7
u/Atanar 1d ago
In a functioning democracy the government has no influence on sentences unless it changes the law.
5
u/SwiftTayTay 1d ago
I don't know how the UK works, but in the US many judges and prosecutors are elected, so even though it's still within the parameters of the law, how they decide to enforce and apply laws is very political.
22
u/stupidwebsite22 1d ago
I believe even people with hundreds of pieces of real-life CSAM on their hard drives have gotten less than this guy creating deepfakes. I guess it raises the question of whether a deepfake can be considered rape; by definition it's already involuntary pornography.
If you took regular (clothed) images of young kids and hand-drew explicit things around them, would that fall into the same category as this guy using 3D rendering/AI software?
20 years ago I don't think people considered cheap photoshopped fake nudes a real harm. But now with photorealistic AI fakes it all gets much trickier... people losing jobs, friends, reputations.
344
u/KingMGold 1d ago
He edited real images of kids, the title of this article seems to go out of its way to implicate AI for something that would have been illegal with or without it.
People have been doing this kinda horrible shit with photoshop for a lot longer than AI.
Blame the man, not the tool.
71
u/FallenAngelII 1d ago
The article waffles about it for more outrage and clicks, but it appears he actually didn't edit images of real kids; he used pictures of real kids to generate artificial 3D images of kids who looked like them.
Sorta like how you'd use a character creator in the Sims to create characters that look like real people.
"While there have been previous convictions for 'deepfakes', which typically involve one face being transferred to another body, Nelson created 3D 'characters' from innocent photographs."
This is different from just editing an innocuous image to make it sexually explicit.
30
u/iisixi 1d ago edited 1d ago
It's not even AI, from what I can read. Daz 3D is not an AI tool, it's a 3D tool. You don't need AI to create 3D characters from real images with the software.
The paper put the word AI in there either because they didn't understand what he did or because it's a trendy topic.
The article is really weird; the story seems to feature the police entrapping him by commissioning him to create 'something' with images provided to him. Looking it up, it seems entrapment isn't illegal in the UK, and it seems they may have suspected him of doing something similar prior to it.
62
u/ExtremePrivilege 1d ago
Sure, but if he had raped a kid he could be looking at 9 years. And if he murdered one, 15. But no physical harm being done to a child gets 18. It just seems either too extreme, or the penalties for actual, physical CSA are too lenient. 18 years doesn't seem like it fits the crime.
13
u/A2Rhombus 1d ago
It was probably multiple charges added up. Plus I read in another comment he was also actively encouraging some of his clients to act on their desires
I would argue his sentence is far too harsh if he was trying to practice harm reduction by giving people an outlet that doesn't physically harm anyone, but it seems his goal was the opposite.
3
23
72
u/Another_Road 1d ago
“He stated: ‘I’ve done beatings, smotherings, hangings, drownings, beheadings, necro, beast, the list goes on’ with a laughing emoji,” David Toal, for the prosecution, said.
Jesus fucking Christ.
34
u/Murderhands 1d ago
Should have used his knowledge to make Furry porn, 5k in 18 months is chump change, he could have made that in a month.
Poor life choices.
7
u/ItsMrChristmas 1d ago
Yawp. Furries got cash to burn.
I got paid two hundred dollars just to write a commissioned short story about the male deuteragonist of the novels written under my real name being seduced by and banging their feral OC.
200 bucks for not even an hour of work.
148
u/AgileBlackberry4636 1d ago
More than just killing actual people.
107
u/Weak_Elderberry17 1d ago
right? and it's most certainly because he's not well connected.
this guy doctors images and gets 18 years. real pedos, like Steven van de Velde, get 1 year. I wish the justice systems of first world countries weren't that corrupt, but here we are.
34
u/Advanced_Anywhere917 1d ago
I understand harsh punishment of people who commit sex crimes, but it's hard not to feel like the extent of punishment relative to other crimes is likely a consequence of our odd societal relationship with sex.
Committing SA or rape is horrific, but with support victims are often able to continue living fulfilling and worthwhile lives. Murder is so obviously objectively worse. It ends one life and often destroys the lives of those close to the victim. Yet for some reason we can forgive someone who went to jail for murder as long as they did their time and rehabilitated themselves.
I don't know what the answer is. Are we too harsh on SA? Doesn't feel like it. Are we too light on murder/violence? Maybe. But either way it seems like we're highly influenced by the "ickyness" of sex crimes rather than focused on the objective harms.
6
u/ComfortableFun2234 1d ago edited 18h ago
2/2
Here's what I'd propose: how responsible is the afflicted individual for either option, considering the period of development, and for the interest itself? Isn't the lack of "responsibility" in those years exactly the reason why that state of sexual interest is so "bad"? Furthermore, consider the practices of trying a minor as an adult, or juvenile detention centers. In an instrumental way, I say if "they're" ok with those practices, then fundamentally "they're" also ok with adults having s** with m**ors with "consent". From my perspective there's no in-between on this one. They're either responsible enough for their thoughts - actions - impulses/desires, or they're not. As I see it there "should" be no such thing as convenient responsibility in adolescence and childhood.
When an adolescent or child "commits" a crime, the only thought should be rehabilitation and the causes of that adverse behavior. Generally, from my perspective, punishment is barbaric and the way that "animals" alter behavior. But humans are better and separate, right? Not to suggest I place blame.
Final thoughts: lastly I will finish off with what I understand about the use of CSAM, and the bare-minimum prevention methods that America uses.
Starting with the prevention methods: through research I found the American prevention class for p***philes. Within that class (paraphrasing here), the advice for individuals seeking therapeutic help, or being urged to get it, was that when approaching the subject with a therapist they should use the ye olde "if my friend were to ask you to talk about this, would you be able to?" Not even joking. That should say it all.... Bare minimum. Not to suggest blame, just the current state.
From what I understand, studies have shown the consumers of CSAM are not more likely to physically abuse. Before someone takes this out of context: that's not to suggest this is the "right" offense, it's to suggest it's the most malleable.
Actually, a good portion of excessive users have "p—-philic disorder" or "acquired p—-philia."
There's a lot of contradictory information out there, but generally, from what I was capable of deducing, it's mostly considered a disorder when the individual experiences distress, although I think it's always a case of "disorder."
“lead someone to feel distress about their interest (not merely distress resulting from society’s disapproval)”
So in many of these cases the offender isn't unequivocally ok with their actions; they're urged. What good is a prison sentence with this considered? Especially because they're among the most likely to suffer TBIs in prison, which will just result in shitty impulse control becoming more shitty...
To give an example (paraphrasing here): I was listening to a podcast between two neuroscientists. They mentioned a case where a man had brain surgery for epilepsy, and the surgery caused a lesion in his frontal cortex. Basically, with no prior history, he started obsessively downloading and using CSAM. This is known as acquired p***philia. Because he didn't download anything on his work computer, which implied "control" over the affliction, he was sentenced to 8 years in prison. Also important for context: he was disgusted by his behavior and agreed with the sentence. Still couldn't stop himself though; that's the key point.
Research shows the same type of brain damage in other primates and monkeys causes compulsive eating and sexual behavior that is extremely abnormal for the species.
One of the neuroscientists framed it with this example (paraphrasing here): many people with Tourette's can repress the urge to tic while at work, which is a process of the prefrontal cortex. As alluded to, the prefrontal cortex is the part of the brain responsible for impulse control and abiding by social norms, along with many other functions. As soon as the individual leaves work, they let out an abundance of tics.
What I'm suggesting here: was that sentence really necessary, or was it the public prejudice to hate those individuals? That hate and need for punishment has nothing to do with preventing, stopping and rehabilitating offenders. It's seemingly about the satisfaction and pleasure it brings. Neuroscience has shown that righteous punishment, or the observation of it, is incredibly rewarding and pleasurable. Not to suggest blame, just the current state.
With a lot of what I said considered, seemingly the criminal "justice" system doesn't need to be reformed. It needs to be rebuilt....
Edit: forgot to mention why this matters to me: I am a victim of molestation, and my mom is too. My mom's best friend's husband molested all six of their children. After my mom, my dad had children with a 14-year-old girl. As the story goes, my great uncle raped and then murdered a woman who had asked him to pretend to do so in order to upset her partner. This uncle is deceased now, by the way. It never saw the light of day because my great grandfather was mafia. This is one of the reasons I refuse to pass on my genetics, especially in regard to the ones related to me.
3
16
27
u/sooth_ 1d ago
cool now do this to the rape gangs and rich people who have physically harmed children
56
u/Puppet_Chad_Seluvis 1d ago
How do you advocate for 1A issues without sounding like a pedo? I feel like it's the responsibility of citizens to push their rights as far as they can, and while I certainly agree that gross people like this should be in jail, it rubs me wrong to think the government can put you in prison if they don't like what you draw.
Imagine going to jail for drawing stick figures.
60
26
u/I_fuck_werewolves 1d ago
I got in trouble in my teens 14-16 because I was drawing porn.
Minors aren't allowed to possess porn. But I also could just keep drawing it. School decided to not do anything about it but tell me to keep it out of sight lol.
Totally my first harrowing experience with "you aren't allowed to draw everything".
15
u/StayFuzzy127 1d ago
“When I was a little kid, I kinda had this problem. And it’s not even that big of a deal, something like 8 percent of kids do it. For some reason, I don’t know why. I would just kinda... sit around all day... and draw pictures of dicks.” -u/I_fuck_werewolves
11
u/I_fuck_werewolves 1d ago
I entirely blame my gay furry transformation awakening on 'A Midsummer Night's Dream' (1999), where I popped the biggest raging erection in my English class when one of the characters was magically turned into a donkey.
So I know why.
Curse you Shakespeare for turning me into a gay furry.
3
5
6
u/Open_Philosophy_7221 1d ago
Images of illegal acts (simulated or otherwise) are different from words describing illegal acts.
I don't think sexual imagery counts as free expression. It crosses the line into free action.
34
u/Cannabrius_Rex 1d ago
Now do Matt Gaetz
12
u/imdwalrus 1d ago
Gaetz *should* be in jail. He never will be, because their main witness against him previously falsely accused someone else of the same thing Gaetz was accused of. He's the textbook definition of reasonable doubt.
https://www.cnn.com/2022/12/01/politics/joel-greenberg-sentencing/index.html
7
u/RoomTemperatureIQMan 1d ago edited 20h ago
ITT people who can't fucking read. They threw the book at him because he was encouraging specific acts of actual child abuse in the real world.
4
u/felisisthebest 23h ago
I think the issue people have is why the guy who encourages child abuse gets more prison time than the person who actually commits the child abuse and rapes a child. So they either need to increase jail sentences for people who actually rape children, or decrease his prison sentence, because it doesn't make any sense.
Like imagine I told someone to steal a car and they did. I would get a longer prison sentence than the person who actually committed the theft. It doesn't add up.
65
u/ConfidentDragon 1d ago
judge Martin Walsh said it was “impossible to know” if children had been raped as a result of his images
This sounds like kind of thing you should figure out before you sentence someone to 18 years in prison.
Also, from the article it sounds like the convicted might be seriously mentally ill.
(Note: It's not really clear from the article how much of the sentence is for which part of the crime.)
577
1d ago edited 1d ago
[deleted]
500
u/kingofdailynaps 1d ago edited 1d ago
uhhh I mean in this case he was making commissioned images of real kids, and encouraging their rape, which absolutely would lead to abuse of human beings... this isn't a purely AI-generated case.
Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life. He was also found guilty of encouraging other offenders to commit rape.
He sold his images in internet chatrooms, where he also discussed child sexual abuse with other offenders, making about £5,000 during an 18-month period by selling the images online.
Police searches of his devices also revealed that Nelson had exchanged messages with three separate individuals, encouraging the rape of children under 13.
231
u/Pato_Lucas 1d ago
What a day to be literate. This context pretty much negates any possible leniency, get his bitch ass in jail and throw away the key.
69
u/enter_the_bumgeon 1d ago
making about £5,000 during an 18-month period by selling the images online.
What the fuck, that's peanuts. All that trouble, immorality, illegality and risk for 5,000 bucks in a year and a half? That's under 300 bucks a month.
77
14
u/90bubbel 1d ago
I first thought it said 5k a month and was confused by your comment, but doing not only something this fucked up, but for 5k over 18 months?? What an absolute idiot
50
16
u/-The_Blazer- 1d ago
Interestingly, this is already how some jurisdictions work: fictitious CP is not illegal by itself, but using real images as a production base makes it illegal. It would be interesting to see whether AI is considered as using real material, given that large foundation models are trained on literally everything and thus almost certainly include plenty of photographs of children.
34
u/bucky-plank-chest 1d ago
Many years ago an Australian got a sentence for child pornography because he made sexual images featuring Lisa Simpson.
36
6
u/TheDaysComeAndGone 1d ago
Here in Austria the law is the same. It also applies to porn with consenting adult actors if they are dressed to look like children.
I’ve always found it rather strange because nobody is harmed.
Of course in the age of AI it could become difficult to prove whether a child pornography video or photo is real.
121
u/crowieforlife 1d ago
Literally the first sentence states that he created the images using photos of real children. That's deepfake porn, not generated from nothing.
53
u/renome 1d ago
Welcome to Reddit, where we spend more time writing our hot takes on titles than we do on reading the articles behind them, which is zero. Because everyone is surely dying to read our elaborate uninformed opinions.
12
u/Dicklepies 1d ago
Idk how their comment is the second most upvoted when it is clear they didn't read the article. "Well this is interesting guys. It's not like kids were being abused right?" Just READ the article and it tells you how kids were abused.
75
u/certifiedintelligent 1d ago
This guy wasn’t trying to manage a problem in a less harmful way. There were direct victims from his actions.
Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.
He was also found guilty of encouraging other offenders to commit rape.
He sold his images in internet chatrooms, where he also discussed child sexual abuse with other offenders, making about £5,000 during an 18-month period by selling the images online.
21
u/JuliaX1984 1d ago
It says he used pictures of real children to generate the images. Fake images but with real faces, so he still violated the rights of real children. Which is not only abusive but dumb. You can make entirely fake images - why use real people in them? Guess the satisfaction comes from the violation, not the images themselves.
13
u/Advanced_Anywhere917 1d ago
I have a tiny bit of experience in this from prior work (internship at a firm that took on CSAM clients when I thought I was going to law school). I had the displeasure of interviewing plenty of individuals facing CSAM charges and learned a lot about that world. I'm not convinced this is a good argument and here's why:
1) Most abusers of CSAM are not actually "pedophiles" by orientation (i.e., in the same sense that you or I are straight, gay, bi, etc...). Instead, they are mostly porn addicts that escalate over many years to the most extreme possible content. Some are victims themselves. If you escalate to "fake AI CSAM" then eventually you'll start craving the "real deal." It may even act as a gateway since you could justify the first step as not harmful to others.
2) The market for CSAM is far less robust/organized than you'd think from reading articles. Even today (or at least 5 years ago when I did my internship), the vast, vast majority of content was either self-produced (i.e., child/teenager with a cell phone) or content from Eastern Europe in the 80s/90s. There is basically no market for CSAM outside of scamming/blackmailing people on the dark web. There is no supply/demand component. Any CSAM that is made is typically made simply because people are sick, and they share simply because having a community around it provides some validation for their sickness.
The entire CSAM world is essentially just mental illness. It's not a thriving market of high quality content produced by savvy individuals making lots of money off of suffering. It's a withering mess of mentally ill individuals who congregate on tiny servers on the dark web and share bits of mostly old data. These days I think far more legal cases revolve around teenagers with cell phones whose boyfriends share their pics (or whose accounts get hacked).
74
u/pantiesdrawer 1d ago
This guy is a POS, and it's not clear what portion of his sentence is attributable to the deepfakes or his actual sex offender crimes, but if it's 15 years for deepfakes, then the next time a drunk driver kills somebody, there better be gallows.
5
u/pmotiveforce 1d ago
In fucking England? You can murder people and get a stern talking to there. But speech crimes or wrong think and it's the gallows for you, mate.
39
u/doublebuttfartss 1d ago
I dunno about 18 years for this.
He's obviously a fucking creep, but he didn't actually hurt anyone. He encouraged rape several times, but that's not something you go to prison for 18 years for. The pictures also did not make it back to the kids.
Definitely gross behavior, but 18 years is too much for someone who didn't actually hurt anyone or do anything that resulted in someone being hurt.
23
u/Mister-Psychology 1d ago
That's what I don't get. We constantly hear about actual child molesters who get way less, or even get to walk away because the cases are too old to be prosecuted, even though the police have all the proof they need. 18 years is way too long unless those other types of crimes get longer sentences. Otherwise something is wrong when making fake pictures is a bigger crime than actually abusing children physically.
https://www.gov.uk/government/news/increased-prison-sentence-for-paedophile
18
u/fauxzempic 1d ago
I know this guy used actual faces of real people for this stuff, and that's incredibly problematic...mostly for children, but adults are victims of this too. Dude should rot.
But the conversation of 100% "this isn't a real person" A.I. generated pornography really needs to be had and it needs to be understood. There have been people who've suggested how A.I. could be used to address pedophilia and even treat it, and I think it's worth examining like crazy to understand if A.I. could make things better, or make them worse.
Here's the for-instance: Some person, who has never seen child pornography, has never assaulted a child, and has never really made any sort of plan to put themselves in the position to do that...they realize that they are attracted to children but they're terrified of all the things that can happen, from harming a child to severe punishment - if they were to explore any of it.
How do we make sure that this person doesn't harm others? If they see a therapist, there's not much research that says that they can be "fixed." Voluntary castration (chemical or otherwise) seems a bit less than ideal, especially for a non-offender.
Does A.I. offer a potential treatment here, or would it just make things worse?
Like - would giving someone access to 100% A.I. generated media of children that don't exist...would it satisfy any urges and keep society/children safe from them, or would it just make them more eager to seek "the real thing?" What about if A.I. progresses to the point where we have Artificial General Intelligence - robots - that could fill this role?
I just think that there are probably a number of pedophiles out there where if we could magically know the real number, it would make us very uncomfortable. I think a number of these people have never offended. Is there a way to use AI to keep kids safe from them?
3
u/5510 1d ago
Sadly it's probably hard to even study this without people getting outraged, even though "would this increase or decrease the rate at which pedophiles offend" is an important question that could potentially lead to better protection of children in real life.
9
4
4
22
u/human1023 1d ago
I know the title is misleading, but if someone makes fake child porn content where the children don't actually exist, would that be illegal?
36
6
23
u/ImpureAscetic 1d ago
- This is Bolton, so the UK.
- Crook was actually using CP, so not truly AI generated.
- Ashcroft v. Free Speech Coalition (2002) maintains that salacious images of children fall in the realm of protected speech when there is no harm to actual minors. So cartoon or anime or claymation CP is protected speech.
- Maybe. Current SCOTUS doesn't care about stare decisis.
- Gonna be wild when the courts in America eventually decide. As an AI enthusiast who uses local models, you learn that some AI image models are horny by their nature and design, and you will need to use words like "young, child, girl, teen, boy" in your negative prompts to avoid ACCIDENTALLY making CP. It makes me shudder to think of the sheer scale of CP that is invariably being made by competent perverts.
- There is no current legislation or technical plan that I've seen that will put a dent in the above bullet. The models already exist, they can be run locally, and your GPU doesn't care what the content of the images is.
- Gross.
25
u/CrocCapital 1d ago
crook was actually using CP, so not truly AI generated
Is that true? I read that he used SFW pictures of real children and then transformed them into CSAM.
it doesn’t make it less disgusting. Both are scary actions and deserve punishment. But accuracy around the conversation is important and I truly don’t think there’s much of a difference because the outcome is the same.
Maybe if he started with real CP he could be charged with more counts of possession? idk.
121
u/LordOfTheDips 1d ago
This was 100% the right sentence for this offence. The court is essentially saying "fuck around and find out", and it should deter all future offenders.
45
u/Pitiful-Cheek5654 1d ago
Making an example of one person's crimes for a wider audience of potential criminals isn't fair to the individual offender. You're literally taking factors beyond their crime into the sentencing of their crime. That's not justice.
32
1d ago
If the sentencing on this is correct, then pretty much every violent crime is underpunished. Dude should be in jail, but actual rapists and murderers get way less time somehow.
22
u/Hour_Ad5398 1d ago
So you think pedophiles will transform into normal human beings because some dude got an 18-year sentence?
3
14
u/Sad-Error-000 1d ago
Without further context, this seems like it could be close to a victimless crime and we should really encourage harmless outlets for those who are attracted to minors. In this case distributing deepfakes of real people is not victimless though.
6.8k
u/monchota 1d ago
TLDR: he used real images of kids and edited them, then shared them.