DoJ Arrests Libertarian for AI-Generated CP: "CSAM generated by AI is still CSAM"

Some of you might have seen me make a comment on here asking what would happen if someone used AI to generate photorealistic child pornography.

Well, unsurprisingly, that question has now found its way into the news:

https://arstechnica.com/tech-policy/2024/05/csam-generated-by-ai-is-still-csam-doj-says-after-rare-arrest/

>The US Department of Justice has started cracking down on the use of AI image generators to produce child sexual abuse materials (CSAM).

>On Monday, the DOJ arrested Steven Anderegg, a 42-year-old "extremely technologically savvy" Wisconsin man who allegedly used Stable Diffusion to create "thousands of realistic images of prepubescent minors," which were then distributed on Instagram and Telegram.

>The cops were tipped off to Anderegg's alleged activities after Instagram flagged direct messages that were sent on Anderegg's Instagram account to a 15-year-old boy. Instagram reported the messages to the National Center for Missing and Exploited Children (NCMEC), which subsequently alerted law enforcement.

>During the Instagram exchange, the DOJ found that Anderegg sent sexually explicit AI images of minors soon after the teen made his age known, alleging that "the only reasonable explanation for sending these images was to sexually entice the child."

>According to the DOJ's indictment, Anderegg is a software engineer with "professional experience working with AI." Because of his "special skill" in generative AI (GenAI), he was allegedly able to generate the CSAM using a version of Stable Diffusion, "along with a graphical user interface and special add-ons created by other Stable Diffusion users that specialized in producing genitalia."

>After Instagram reported Anderegg's messages to the minor, cops seized Anderegg's laptop and found "over 13,000 GenAI images, with hundreds—if not thousands—of these images depicting nude or semi-clothed prepubescent minors lasciviously displaying or touching their genitals" or "engaging in sexual intercourse with men."

>In his messages to the teen, Anderegg seemingly "boasted" about his skill in generating CSAM, the indictment said. The DOJ alleged that evidence from his laptop showed that Anderegg "used extremely specific and explicit prompts to create these images," including "specific 'negative' prompts—that is, prompts that direct the GenAI model on what not to include in generated content—to avoid creating images that depict adults." These go-to prompts were stored on his computer, the DOJ alleged.

>Anderegg is currently in federal custody and has been charged with production, distribution, and possession of AI-generated CSAM, as well as "transferring obscene material to a minor under the age of 16," the indictment said.

>Because the DOJ suspected that Anderegg intended to use the AI-generated CSAM to groom a minor, the DOJ is arguing that there are "no conditions of release" that could prevent him from posing a "significant danger" to his community while the court mulls his case. The DOJ warned the court that it's highly likely that any future contact with minors could go unnoticed, as Anderegg is seemingly tech-savvy enough to hide any future attempts to send minors AI-generated CSAM.

>"He studied computer science and has decades of experience in software engineering," the indictment said. "While computer monitoring may address the danger posed by less sophisticated offenders, the defendant's background provides ample reason to conclude that he could sidestep such restrictions if he decided to. And if he did, any reoffending conduct would likely go undetected."

>If convicted of all four counts, he could face "a total statutory maximum penalty of 70 years in prison and a mandatory minimum of five years in prison," the DOJ said. Partly because of "special skill in GenAI," the DOJ—which described its evidence against Anderegg as "strong"—suggested that they may recommend a sentencing range "as high as life imprisonment."

>Announcing Anderegg's arrest, Deputy Attorney General Lisa Monaco made it clear that creating AI-generated CSAM is illegal in the US.

>"Technology may change, but our commitment to protecting children will not," Monaco said. "The Justice Department will aggressively pursue those who produce and distribute child sexual abuse material—or CSAM—no matter how that material was created. Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive, and increasingly photorealistic images of children."

Did I turn this into another op-ed on Medium? Of course I did:

https://medium.com/@MoonMetropolis/how-should-law-enforcement-handle-fake-ai-generated-child-pornography-2ceb8f1ded20

You can expect much more drama to flow from this in the coming years - it will almost certainly find its way to the US Supreme Court, and there is really no telling, at this point, how they will rule on it.


I hate using libertarian arguments to defend pedos but this guy should be getting charged for transferring obscene materials to a minor and nothing else. It's not real, no children were harmed in the making of it, etc. Where are the feds going to draw the line on this? Will AI-generated porn of people become legally considered involuntary pornography?


They are idiots for kicking this hornet's nest.

Courts decided years ago that photorealistic generated CP still qualifies as CP, but the DOJ has only been using it to blackmail MAPs in cases involving real CP because the statutes don't currently support it. They are risking giving grounds to appeal a whole bunch of negotiated plea agreements.

There really isn't a way for SCOTUS to carve out an exception here without criminalizing some episodes of South Park or turning the US into an Australian-style thought crime police state.

It would be good for dramacoin if cartoon pizza gets made illegal in the US but darn, how can you commit crimes against fake children? I'm real curious what justification they are going to come up with for this one, as "think of the fake children being trafficked" won't work here.


:#marseymugshot:

mfw owning evangelion on blu ray becomes a federal crime


it's a miscarriage of justice that it isn't already


:marseysad:


>tfw your crime gets dismissed after exclaiming that Misato is best girl

:#marseymisatolove::#marseymisatolove::#marseymisatolove:



case dismissed because she's a legal adult or guilty as charged because of how she treats shinji? :marseyjudge:


She's a legal adult from my perspective :marseyindignant:

Her thing with Shinji is her legal issue :marseyshrug:



She's a libertarian so she's first into the gulag


No more Rei bobs and vagene symbolism :marseyitsover:


It should be


>Australian-style thought crime police state.

Interesting choice here. Australia has one of the world's most expansive definitions of child pornography, to the point where a decent chunk of teenage girls' social media profiles technically contain child pornography. It's a weird situation where you're kind of hoping prosecutors won't abuse their power so as not to provoke the legislature to curtail that power, but there isn't really any guarantee.


>hoping prosecutors won't abuse their power

lol

Australia is a successful example of a conservative society.

:marseynotes:


If they go this route, it would open the door for people to use AI to easily frame others by planting CP on their devices, without ever having to find existing CP.


Wouldn't this make every bitcoin guy a criminal because there's CP on the blockchain? That'd be funny


Wait how is stable diffusion related to bitcoin?


Just 'cos if there's CP in the training material for SD but you can't access it, then that seems roughly equivalent to the infamous CP in the Bitcoin blockchain, like it's implanted into the code


>Courts decided years ago that photorealistic generated CP still qualifies as CP

Did you mean to say the opposite? That would be consistent with my understanding of SCOTUS jurisprudence on this topic:

https://www.nytimes.com/2002/04/16/national/supreme-court-strikes-down-ban-on-virtual-child-pornography.html


They deserve to be locked up for life for being attracted to children, frick everything else.


pass a constitutional amendment that creating/possessing synthetic CP is a capital crime.


SCOTUS?

more like

SCROTUM

Am I right hahahaha


:marseyclown2: "Yeah bro its just ANIME its not real kids dude! Bro shes 500yo od its just cartoons! omg! where will they draw the line!!"

Ban aicp jail aicp posters and while were at it ban loli and jail loli posters.


>Yeah bro its just ANIME its not real kids dude! Bro shes 500yo od its just cartoons! omg! where will they draw the line!!

unironically yes


This is one of those "technically not seeing fire but I'm certainly smelling smoke" moments.


it's not a real kid


!moidmoment saying the quiet :marseyfloch: part out loud: it is a kid, but it's okay when they jerk off to it because it isn't real


Like blood diamonds?


More like lithium


Cuz like with blood diamonds there's the argument that even synthesized diamonds prop up the culture that causes blood diamonds in the first place

Why lithium?


Some neighbors just be hatin lithium fr 😔


disgusted i share a website with them


:marseyclown2: it's just an anime it's not real.


Yeah man, death to pedos, but also it's literally not even real. It's like loli enthusiasts - yes they belong in a woodchipper for being gross pedos, but the idea that picking up a pencil and drawing some degen anime shit is child abuse is absurd.


>Will AI-generating porn of people become legally considered involuntary pornography?

it should :starecat:


I WILL continue to make Oscar the Grouch fricking Taytay and the feds won't stop me :marseyindignant:


>I WILL continue to make Oscar the Grouch fricking Taytay and the feds won't stop me :marseyindignant:

I don't know why you're even bothering, given she IS fricking Oscar the Grouch for realsies, and there's already actual photos of her out there showing the two of them together.


:marsey#hmmhips: sure they won't


They can try :#marseytoastygun:

https://i.rdrama.net/images/1706152871277142.webp

https://i.rdrama.net/images/17061541692894444.webp

https://i.rdrama.net/images/17061712244311087.webp

:#marseybiting: :#marseywould:


Love the muppets can you do it without the new one


Sure

https://media.giphy.com/media/khOqGPVTkbxzHNlvtT/giphy.webp


https://media.giphy.com/media/3orieRVW3KHu9jK3hS/giphy.webp


!spongebob deleted scene


he fricks her on top of the trash pile where she belongs

I seent it


Children were harmed in the making of it. The models these :marseypedo: use are trained on several thousand images of real CSAM; it takes IRL abuse to produce AI images. No different than someone like shadman taking real CSAM and redrawing it, which is also a crime.


>it takes IRL abuse to produce AI images

Nah. If an AI model has naked adults and clothed children in its dataset it can put 2 and 2 together without help. At least, this is supposedly the reason why Stability purged nudity from their dataset, which caused their newer SD models to be useless for porn without heavy fine-tuning. It's :marseypedo: fault


>The models these :marseypedo: use are trained on several thousand images of real CSAM, it takes IRL abuse to produce

Oh come off it. That dataset was over 5 BILLION images. A fraction of a percent were CP, and most of those links were dead before the images could even be downloaded.


hokay then just show us the training images and we'll string him up


>show us the training images

Nice try, libertarian!


:marseyaware: darn. Now I wonder if that makes the models themselves illegal, considering they have CP-data within them?


A machine learning model doesn't have the training data inside it; it just has parameters learned from the training data. Outside of rare memorized examples researchers have managed to extract, there's no known way to reconstruct the training data from the model parameters.
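As a toy illustration of the parameters-vs-training-data point (ordinary least squares here, nothing specific to Stable Diffusion): training compresses many examples into a few learned numbers, and the individual examples are not stored anywhere in the result.

```python
# Toy illustration of "parameters, not training data": fit a line
# y = a*x + b to 1,000 noisy points, then keep only the 2 learned
# numbers. The 1,000 original points cannot be recovered from (a, b).

import random

random.seed(0)
data = [(x, 3.0 * x + 1.0 + random.gauss(0, 0.1)) for x in range(1000)]

# Closed-form least squares for slope a and intercept b
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

model = (a, b)  # the entire "model": two floats
print(len(data), len(model))  # 1000 training points -> 2 parameters
```

A diffusion model is the same idea at scale: billions of training images distilled into a few billion weights, which is why the file on disk is gigabytes, not the petabytes the dataset would occupy.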


The model still committed a thought crime by learning from the cp images


This will make a great :marseynice: supreme :marseyelliotrodger3: court :marseytakit: case in 8 years

black lives matter less


They'll have to take my interactive VR replica of Scarlett Johansson from my cold dead peepee.


If My Cold Dead Peepee isn't already a band name, it needs to be.


It already is a band, Scarlett plays the flute and I play the bongos.


kek


I should hit up Jeb and ask him to appoint me to the Court so I can be there for it.


hypothetically if I drew two stick figures fricking and then labeled one of them as being a "child", would that constitute CSAM according to the DOJ now?


Just don't draw CP?


why is this so hard for them?


I think photo realistic ai generated cp kind of relies on having pictures of real kids to go off of


"At first they came for the pedos making libertarian spank material, so I did speak up for them, for some godawful stupid reason"


Yeah. It's called having principles. Some of us actually believe in :marseyfreezepeach:, even when it benefits our enemies.


"Look bro, if you don't defend coomers making images for spanking off to 900 year old dragons that look like 8 year old girls, you hate free speech"

:#coomertalking:


I look like this and say this.


Frick off glowBIPOC


Likes 900 year old dragons... hates feds

:#marseynotesglow:


I think they're gross. But I don't think what they're doing should be illegal.


:marseywood#chipper2:


:marseyclapping: wow, so brave! Nobody has ever said this before!


Yeah I actually agree. CSAM is Child Sexual Abuse Material. If no children were abused in its creation - because it's not a real photograph - then it shouldn't qualify. I know that might make it more difficult for the feds to detect the "real stuff" but imo that's not even a real issue, like the detection algorithms don't have to be 100% accurate (they already aren't).


That seems like a good argument for banning: if the generated images are indistinguishable from real images, then prosecuting child abusers could become more difficult


The issue is, even with the classic "it's not REAL CP" argument, photorealistic AI-generated CSAM images are pretty hard to distinguish from the real deal, to the point where, rather than set up ways to verify whether a CP pic is authentic, it's more appealing to just ban them in the first place.


Not sure why the DOJ is going that route, because US law doesn't consider animated/drawn depictions of child porn to actually be child porn. I'm guessing they want to kick it up to the courts and see if they can finally get a ruling in their favor.


I like how the DOJ is rizzing him up to be some literal prompt-engineering god, but all he did was prompt "naked child doing thing" using a LoRA.
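For anyone wondering what a LoRA is: LoRA (low-rank adaptation) fine-tunes a model by training a small low-rank update on top of frozen weights, which is why the add-on files pass around as a few megabytes. A back-of-the-envelope sketch (the dimensions are toy values for illustration, not taken from any real SD layer):

```python
# LoRA sketch: instead of fine-tuning a full m x n weight matrix W,
# train B (m x r) and A (r x n) with a small rank r, and apply
# W' = W + B @ A. Far fewer numbers need to be trained and shipped.

m, n, r = 768, 768, 8  # toy layer size and LoRA rank

full_params = m * n        # parameters if fine-tuning W directly
lora_params = r * (m + n)  # parameters in the low-rank update B, A

print(full_params // lora_params)  # the LoRA update is 48x smaller here
```

The same ratio holds layer by layer across the model, which is how a hobbyist can "specialize" a base model on consumer hardware instead of retraining it.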


They also say this in regards to keeping him in jail while the court decides wtf is going on...

>as Anderegg is seemingly tech-savvy enough to hide any future attempts to send minors AI-generated CSAM

But this is a guy who was sending CP and trying to sext with 12-year-olds over Instagram. He's a fricking r-slur, basically.


:marseyshesrig#ht:


He used a NEGATIVE PROMPT

And extensions to generate better genitalia (!!!)
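For reference, a "negative prompt" is a stock knob in every Stable Diffusion front-end, and Hugging Face's diffusers pipelines expose it as the `negative_prompt` argument. A hedged sketch below; the prompt strings and model name are placeholders, and the actual pipeline call is left commented out because it requires downloading the model weights:

```python
# How a negative prompt is passed to a Stable Diffusion pipeline.
# During classifier-free guidance, each denoising step is steered
# toward the prompt and away from the negative prompt.

def build_generation_kwargs(prompt: str, negative_prompt: str) -> dict:
    """Collect the arguments a diffusers text-to-image call receives."""
    return {
        "prompt": prompt,
        "negative_prompt": negative_prompt,  # concepts to steer away from
        "num_inference_steps": 30,
        "guidance_scale": 7.5,
    }

kwargs = build_generation_kwargs(
    "a watercolor painting of a lighthouse",  # placeholder prompt
    "blurry, low quality, extra limbs",       # placeholder negative prompt
)

# With diffusers installed, the call would look roughly like:
#   from diffusers import StableDiffusionPipeline
#   pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
#   image = pipe(**kwargs).images[0]
```

Which is to say: the "extremely specific and explicit prompts" in the indictment describe an everyday feature of the tooling, not some bespoke skill.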


>Partly because of "special skill in GenAI," the DOJ—which described its evidence against Anderegg as "strong"—suggested that they may recommend a sentencing range "as high as life imprisonment."

>Special skill in GenAI

Was the neighbor making his own kid fricking LORAs?


We're still in the "new technology" phase of AI generation where news articles can use absurdly verbose and complicated ways of describing simple concepts.

Back in the late 90's you'd read articles about dudes getting arrested for running email fraud schemes, or in the way they put it "disseminating fraudulent materials in an entirely digital way over the digital connecting lines of the grand interconnected network of connected digital devices."


Nope, they already exist


:marseydespair: what a horrible thought


Who is Lora


my cousin, and stop looking at her


If it looks like real kids you're getting charged


Simple :marseysmug4: as :#marseywoodchipper2:


never thought 911roofer would be the voice of reason


:marseywoodchipper2: The person is scum regardless, but it will be an interesting case on the technology, because ultimately it's only as abusable as the person using it. Would they also go to jail if they used a pencil?


Pleaaaase put shadman on death :marseycarpocide: row


https://i.rdrama.net/images/17163441111473017.webp


!coomers defend ur brown H addict king


I fapped to his horrible art when I was a kid :marseycoomer2: not Loli stuff though


Remember the dragon one?

nah he can go on death row, he wasn't just drawing fictional CP he was dealing in the real stuff too


When? The last I heard, he burned a few bridges during COVID and hasn't really done anything since then.


>loli fam is actual libertarian

:#suprisedasianwife:


I thought you meant sneedman and was so confused... unless


why not both


IIRC the rule was always that photorealistic child porn, regardless of origin, was illegal, based on the argument that otherwise they would have to trace where every piece of child porn they charged someone with owning came from.


:marseythinkorino: also keeps pedos from putting real CP through a bunch of filters to claim it's not real


You can't draw CP that realistic; it just isn't humanly possible. Stable Diffusion can make images that are photorealistic, and it takes no effort at all. You can even set your computer to generate 100 batches of 8 in like an hour.


wasn't there some woman in america who got jail time for writing kiddie smut fanfiction back in the 80s? like she had a website for it


Stephen King should get jail time then.


>but it will be an interesting case on the technology

Nah, the feddies didn't even add that to the charges; they probably figure SCOTUS will rule like they did in Ashcroft v. Free Speech Coalition.

