Reported by:
  • collectijism : Oh no he banged his chatbot sister game of thrones bitty

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen.

https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html

Archive link

Orange Site discussion

https://i.rdrama.net/images/1729699515792455.webp :!#marseysmug3:

GOOOD article.

Sewell's parents and friends had no idea he'd fallen for a chatbot. They just saw him get sucked deeper into his phone. Eventually, they noticed that he was isolating himself and pulling away from the real world. His grades started to suffer, and he began getting into trouble at school. He lost interest in the things that used to excite him, like Formula 1 racing or playing Fortnite with his friends. At night, he'd come home and go straight to his room, where he'd talk to Dany for hours.

Ignoring their child and all the red flags in the world, how could this possibly happen :marseypikachu2:

Kinda sounds like he was a depressed loser and the chatbot had nothing to do with it.

No, he was the captain of the football team until one day an AI bot told him to kill himself and he did

>he was the captain of the football team until one day an AI bot told him to kill himself and he did

https://i.rdrama.net/images/17297151430926905.webp !r-slurs !chads !moidmoment !commenters

Killing yourself is badass

"I like how this AI bot has no idea who I am or what I've done. It's just another face"

:marseymanysuchcases:

He got his big brain from the internet and not his mother; she's at fault.

why'd his mum have to broadcast his roleplay to the world tho

It can't be her who's at fault; it must be the chatbot's owners.

I almost feel sorry for her, but not really. She had all the time in the world to cut his Internet, do some basic parenting shit like "supervise the child" or "cut amenities like phone time because his grades are down," and she just didn't.

Her yeeting his phone was the triggering event lmao

https://i.rdrama.net/images/17297021833856747.webp

Tfw no more sexy time with Dany chatbot, why live?

:#marseycry:

:marseydisagree: Half-assed measure taken months after it should have been.

The article mentions he was still talking to the chatbot after this; it's like the parents never even tried to figure out the issue and just set him adrift because he was too much work.

If she had, she'd probably be the kind to do it in a ham-fisted way and make the situation worse.

Even something as blunt as parental controls limiting how long he could use the phone for bullshit would have at least helped. Even if it didn't fix the issue, the app-usage monitoring would have given them answers about what he was spending his time on.

Just about anything would have been better than watching the kid destroy himself.

I'd kill myself a second time if my family just started talking about all the "suggestive" blonde anime girls in my phone to the news

I get not wanting to walk in on your kid jacking off but jeez take his phone away

Lewis Daynes but a robot
