Some of you might have seen me make a comment on here asking what would happen if someone used AI to generate photorealistic child pornography.
Well, unsurprisingly, that question has now found its way into the news:
https://arstechnica.com/tech-policy/2024/05/csam-generated-by-ai-is-still-csam-doj-says-after-rare-arrest/
>The US Department of Justice has started cracking down on the use of AI image generators to produce child sexual abuse materials (CSAM).
>On Monday, the DOJ arrested Steven Anderegg, a 42-year-old "extremely technologically savvy" Wisconsin man who allegedly used Stable Diffusion to create "thousands of realistic images of prepubescent minors," which were then distributed on Instagram and Telegram.
>The cops were tipped off to Anderegg's alleged activities after Instagram flagged direct messages that were sent on Anderegg's Instagram account to a 15-year-old boy. Instagram reported the messages to the National Center for Missing and Exploited Children (NCMEC), which subsequently alerted law enforcement.
>During the Instagram exchange, the DOJ found that Anderegg sent sexually explicit AI images of minors soon after the teen made his age known, alleging that "the only reasonable explanation for sending these images was to sexually entice the child."
>According to the DOJ's indictment, Anderegg is a software engineer with "professional experience working with AI." Because of his "special skill" in generative AI (GenAI), he was allegedly able to generate the CSAM using a version of Stable Diffusion, "along with a graphical user interface and special add-ons created by other Stable Diffusion users that specialized in producing genitalia."
>After Instagram reported Anderegg's messages to the minor, cops seized Anderegg's laptop and found "over 13,000 GenAI images, with hundreds—if not thousands—of these images depicting nude or semi-clothed prepubescent minors lasciviously displaying or touching their genitals" or "engaging in sexual intercourse with men."
>In his messages to the teen, Anderegg seemingly "boasted" about his skill in generating CSAM, the indictment said. The DOJ alleged that evidence from his laptop showed that Anderegg "used extremely specific and explicit prompts to create these images," including "specific 'negative' prompts—that is, prompts that direct the GenAI model on what not to include in generated content—to avoid creating images that depict adults." These go-to prompts were stored on his computer, the DOJ alleged.
>Anderegg is currently in federal custody and has been charged with production, distribution, and possession of AI-generated CSAM, as well as "transferring obscene material to a minor under the age of 16," the indictment said.
>Because the DOJ suspected that Anderegg intended to use the AI-generated CSAM to groom a minor, the DOJ is arguing that there are "no conditions of release" that could prevent him from posing a "significant danger" to his community while the court mulls his case. The DOJ warned the court that it's highly likely that any future contact with minors could go unnoticed, as Anderegg is seemingly tech-savvy enough to hide any future attempts to send minors AI-generated CSAM.
>"He studied computer science and has decades of experience in software engineering," the indictment said. "While computer monitoring may address the danger posed by less sophisticated offenders, the defendant's background provides ample reason to conclude that he could sidestep such restrictions if he decided to. And if he did, any reoffending conduct would likely go undetected."
>If convicted of all four counts, he could face "a total statutory maximum penalty of 70 years in prison and a mandatory minimum of five years in prison," the DOJ said. Partly because of "special skill in GenAI," the DOJ—which described its evidence against Anderegg as "strong"—suggested that they may recommend a sentencing range "as high as life imprisonment."
>Announcing Anderegg's arrest, Deputy Attorney General Lisa Monaco made it clear that creating AI-generated CSAM is illegal in the US.
>"Technology may change, but our commitment to protecting children will not," Monaco said. "The Justice Department will aggressively pursue those who produce and distribute child sexual abuse material—or CSAM—no matter how that material was created. Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive, and increasingly photorealistic images of children."
Did I turn this into another op-ed on Medium? Of course I did:
https://medium.com/@MoonMetropolis/how-should-law-enforcement-handle-fake-ai-generated-child-pornography-2ceb8f1ded20
You can expect much more drama to flow from this in the coming years - it will almost certainly find its way to the US Supreme Court, and there is really no telling, at this point, how they will rule on it.
the person is scum regardless, but it will be an interesting case on the technology, because ultimately it's only as abusable as the person using it. Would they also go to jail if they used a pencil?
Pleaaaase put shadman on death row
!coomers defend ur brown H addict king
nah, he can go on death row; he wasn't just drawing fictional CP, he was dealing in the real stuff too
When? The last I heard, he burned a few bridges during COVID and hasn't really done anything since then.
I fapped to his horrible art when I was a kid, not the loli stuff though
Remember the dragon one?
I thought you meant sneedman and was so confused... unless
why not both
IIRC the rule was always that photorealistic child porn was illegal regardless of origin, based on the argument that otherwise prosecutors would have to trace where every piece of child porn they charged someone with owning came from.
also keeps pedos from putting real CP through a bunch of filters to claim it's not real
You can't draw CP that realistic; it just isn't humanly possible. Stable Diffusion can make photorealistic images with no effort at all. You can even set your computer to generate 100 batches of 8 in about an hour.
wasn't there some woman in america who got jail time for writing kiddie smut fanfiction back in the 80s? like she had a website for it
Stephen King should get jail time then.
Nah, the feddies didn't even add that to the charges; they probably figure SCOTUS will rule like they did in Ashcroft v. American Civil Liberties Union.
...