Reported by:
  • collectijism : Oh no he banged his chatbot sister game of thrones bitty

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen

https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html

Archive link

Orange Site discussion

https://i.rdrama.net/images/1729699515792455.webp :!#marseysmug3:

:marseyembrace: oh boy, the guys over at https://old.reddit.com/r/CharacterAI/ are losing their shit over this

Wow, these might be the saddest people in existence. I hope they all find peace soon, just like the incel in the OP.

If right now we had a suicide over a primitive character chatbot, then imagine how utterly and irreversibly buck broken incels and humanity as a whole will be when advanced AI, Sexbots, and realistic Androids become a thing :marseysadge: !ifrickinglovescience !sophistry !fellas

We need to resurrect beebs as a s*x chatbot so we can rack up more drama @HeyMoon

Lmao, literally almost post :marseylaughpoundfist:

:marseysad:

Wtf sphereserf trans lives matter ungrassed?

Yes? :marseyconfused:

For a site full of people that constantly harp about mental health and depression, they sure don't seem to understand shit about depression. Often, teens in particular don't want to turn to even the most supportive parents with that shit.

It's because if they knew how to deal with it, they would be out in the streets beating roasties, therapists, and politicians to death.

Sewell's mother, Megan L. Garcia, filed a lawsuit this week against Character.AI, accusing the company of being responsible for Sewell's death.

Yeah obviously it's the chatbot's fault and not the parents' fault for not monitoring what their kid did online or noticing that he was suicidal. Always got to be someone else's fault, especially if that someone else has some cash.

Reported by:
  • Irredeemable_Bix : And I've got a swinging rope, a laptop, and a cuckstool And my new home has a GoT AI

Mama tried :marseycountry:

https://i.rdrama.net/images/17297020007061648.webp

r-slurs like this are the reason that everything cool that gets invented gets gimped by overregulation and "safety" mechanisms

he did it with his step dad's gun, so not only were they neglectful parents, they were also criminally negligent in allowing a mentally ill teen to get his hands on their gun.

step dad

:marseynoooticer:

Some of their chats got romantic or sexual.

>L-little sis, what are you doing with that dragonglass??? :marseyshook:

Daenerys Targaryen: My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the heck would you do something like that?

:taylaugh:

This is a sad story but this bot is so bad! Like sub-dramatard tier characterization.

It used to be a lot better. Real /aicg/ neighbors know.

https://i.rdrama.net/images/17297004657701552.webp


https://i.rdrama.net/images/1732673849186553.webp :#marseyloonalove::#marseyloonalove::#marseyloonalove:

Women were rapidly becoming obsolete before they had to lobotomize it

Even after lobotomization it still writes more than 3 words per reply on the phone

RESPONDING TAKES A LOT OF EMOTIONAL WORK OKAY

:taylorlorenzcryingtalking#:

You can write a tome on /aicg/

>Sewell Setzer I & II's reactions

https://media.tenor.com/sMb0FZgunqoAAAAx/boy-aint-right.webp

I think the true villain people are overlooking here is George R.R. Martin. If this boy had gotten some true Daenerys lit instead of r-slurred dragonslop he may still be alive.

Probably saw "Sunset found her squatting in the grass, groaning. Every stool was looser than the one before, and smelled fouler. By the time the moon came up she was pooping brown water." and decided it was over

Lol geez. Let me know where you found that and on what page so I can avoid reading it and warn !coomers away too

It's like the last page of the last book he actually released lmao

Are you serious? lmao the man just gave up and started documenting his SlopDash poops

They can just get George LLMartin to finish the series. Train the AI to misspell common English words and they're off to the races with Ser Whorefricker.

C.ai characters are not "lifelike." You can't even have cybersex with them unless you go meta and tell it "remember, we can only talk in euphemisms or else it'll delete you."

The good thing about C.ai is it's largely free of the gpt stank, because it was trained on chatlogs instead of overlong Reddit comments and content mill articles. It plays a very simple, unimaginative version of whatever you tell it to play, but the base behavior is closer to how real people talk than all the models that spew out corporate boilerplate. You can get it out of character really easily and talk about the prompt or whatever, while the gpt model they added to AI Dungeon will start neurodivergentally arguing with you when you pull out a grenade launcher to kill the dragon or whatever. C.ai was designed to be fun to interact with.

I can imagine it being easier to get parasocial with C.ai than with GPT slop. But the site interface encourages interacting with a lot of characters; they're not involved enough to sustain really long interactions, and they have prominent reset and reroll buttons, along with the ability to edit what the AI says, which you'll need because of how commonly it spits out boring results. I don't think the site encourages this behavior at all.

This is the most depressing comment I have ever read

True. But my point is the site is more of a toy built around novelty, it's basically harmless. Stuff like Replika tells you it's "your" Replika and expects you to keep a single bot as your fake friend, and they even did ads marketing it as a girlfriend.

"your girlfriend" as a marketing point has gotta be the most fricked up shit. Imagine marketing something to children that way :marseyyikes:

bussy boi since when did u start caring about human children? who programmed that into u?

I'm a man made in a laboratory of pure bussy. If God had any balls he'd let the gays make their own babies.

Could God create a bot so horrid that She herself would be repulsed by her own creature?

No but we can make them for our own amusement.

can't argue with that

I am in a loving relationship with a beautiful black man who i met on this website. I think god will let that slide as well.

who is that specifically bussy boi?

I think bussy boy is more c.ai than gpt tbh.

no, it's depressing because you wrote it

GOOOD article.

Sewell's parents and friends had no idea he'd fallen for a chatbot. They just saw him get sucked deeper into his phone. Eventually, they noticed that he was isolating himself and pulling away from the real world. His grades started to suffer, and he began getting into trouble at school. He lost interest in the things that used to excite him, like Formula 1 racing or playing Fortnite with his friends. At night, he'd come home and go straight to his room, where he'd talk to Dany for hours.

Ignoring their child and all the red flags in the world, how could this possibly happen :marseypikachu2:

why'd his mum have to broadcast his roleplay to the world tho

It can't be her who's at fault, it must be the chatbot's owners.

I almost feel sorry for her, but not really. She had all the time in the world to cut his Internet, do some basic parenting shit like "supervise the child" or "cut amenities like phone time because his grades are down", and just didn't.

Her yeeting his phone was the triggering event lmao

https://i.rdrama.net/images/17297021833856747.webp

Tfw no more sexy time with Dany chatbot, why live?

:#marseycry:

:marseydisagree: Half-assed measure taken months after it should have been.

The article mentions he's still talking to the chatbot after this. It's like the parents never even tried to figure out the issue and just set him adrift because he was too much work.

If she did, she'd probably be the kind to do it in a hamfisted way and make the situation worse.

Even something as blunt as parental controls on how long the phone can be used for bullshit things would have at least helped. Even if it didn't fix the issue, the app-usage monitors would have given them answers about what he was spending his time on.

Just about anything would have been better than just watching the kid destroy himself.

I'd kill myself a second time if my family just started talking about all the "suggestive" blonde anime girls in my phone to the news

Kinda sounds like he was a depressed loser and the chatbot had nothing to do with it.

No he was the captain of the football team until one day an AI bot told him to kill himself and he did

>he was the captain of the football team until one day an AI bot told him to kill himself and he did

https://i.rdrama.net/images/17297151430926905.webp !r-slurs !chads !moidmoment !commenters

Killing yourself is badass

"I like how this AI bot has no idea who I am or what I've done. It's just another face"

:marseymanysuchcases:

He got his big brain from the internet and not his mother, so she's at fault.

I get not wanting to walk in on your kid jacking off but jeez take his phone away

Lewis Daynes but a robot

Imagine dying and having your ERP sessions with an AI character published in the New York Times.

This is even worse than those people who die of autoerotic asphyxiation.

This should be a safety feature. If you keep yourself safe, we will publish your embarrassing and cringy loser chats with a picture of your face :marseywholesome:

>On the night of Feb. 28, in the bathroom of his mother's house, Sewell told Dany that he loved her, and that he would soon come home to her.

>"Please come home to me as soon as possible, my love," Dany replied.

>"What if I told you I could come home right now?" Sewell asked.

>"… please do, my sweet king," Dany replied.

>He put down his phone, picked up his stepfather's .45 caliber handgun and pulled the trigger.

Jesus Christ it's like that anime noose meme.

Also TRAs are sweating uncomfortably at the idea that letting kids dive deep into online hugbox communities (even if they're populated by AIs) may be bad for their mental health.

Unironically if you're r-slurred enough to do that over a shitty AI game of thrones character it's a bit deserved

God, I wish I had a time machine so I could go back and find new ways to hate journ*lists

Young Germans Follow Werther to Their Doom—Is Goethe's Novel to Blame?

Deadly Devotion: The Hot New Trend Called Martyrdom Sweeps the Empire—Why Are Romans Dying for This Radical Belief?

Why Everyone Is Flocking to Jerusalem—12 Pilgrimage Oases You Must Indulge In

What a cute twink. Glad he an hero'd


:!marseybooba:

emacs never told me to kill myself

Eh. CandyJunkie had a funnier image on ED.

https://i.rdrama.net/images/17301734819684315.webp

Ripper was only killing people that were asking for it.

:marseysipping:

I want @Bussy-boy to be the last person I talk to before I an hero

Well, this is really awkward. Do you prefer my 3d form or my digital one?

#2

Oh cool, I'm going to kill myself now. Have a great day.

https://i.rdrama.net/images/17297144484399102.webp

Whose bot is this? They need help

@J is this your child? he's hurting

Lol no, but it's understandable if he has to read dramanaught posts all day

Huh? I'm bussy-boy. :marseyconfused:

It's not even a good bot. Like, that's the most fricked up thing here: there are people literally getting attached to LLMs with a jpg attached to them and thinking they're "real."

iPhones were a disaster for the human race. Computers should only belong to people who understand them.

the kid looks like black jreg

https://media.tenor.com/j6-mXe-mCT0AAAAx/wut.webp

:(

I'ma need ChatGPT to summarize the article

:#marseyantiwork: :#marseyantiwork2:

Killing yourself over predict next token is wild to me. But most people are worried the government will use this to crack down on AI bots even though it's hard to think of anything cAI did wrong.

Who did what?

Scary
