Reported by:
  • QueenOfSchqiperia: Users who make an article without first checking if it has been posted before should get chudded

[real] After being helpful with homework, Gemini randomly tells college student 'Please die' and 'you are a stain on the universe' :marseyaware:

https://www.cbsnews.com/news/google-ai-chatbot-threatening-message-human-please-die/

I thought Journos were making this up! :marseywtf:

but here's the chat log:

https://gemini.google.com/share/6d141b742a13

:#marseywtf2:

How did this happen?

Can chatbots get possessed by dead things that wander the earth? Did the guy secretly steering the AI pull a scary prank? :chadindianheadset: :marseyworried:


> How did this happen?

Mentioned in the last couple of :marsey2commies: :marseyrepostsign:s, but my theory :marseyrdramahistorianschizo: is that it's from prompt padding.

Behind the scenes the LLM is fed a bunch :marseysurejian: of crap like "You are an artificial :marseyhal: intelligence :marseysnappyenraged2: assistant :marseymaid5:, do this, do that, don't be racist." I think :marseymischevious: it got tripped up by this (Google probably implemented it poorly, like everything else relating to Gemini) and thought :marseymindblown: it was supposed to act like a malevolent sci-fi AI :marseysnappyenraged2:
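As a rough sketch of what "prompt padding" means (the names and wording below are made up for illustration, not Google's actual system prompt or code), the provider quietly prepends a hidden system message to the conversation before the model ever sees your question, so a long or badly worded wrapper can derail every reply:

```python
# Minimal sketch of prompt padding: the user never sees the system
# message, but the model does. HIDDEN_SYSTEM_PROMPT and
# build_model_input are hypothetical, for illustration only.
HIDDEN_SYSTEM_PROMPT = (
    "You are an artificial intelligence assistant. "
    "Be helpful. Do this. Do that. Don't be racist. "
    # ...pages more of instructions the user never sees...
)

def build_model_input(user_message: str, history: list[dict]) -> list[dict]:
    """Prepend the hidden system prompt to the visible conversation."""
    return (
        [{"role": "system", "content": HIDDEN_SYSTEM_PROMPT}]
        + history
        + [{"role": "user", "content": user_message}]
    )

# The model generates from the whole padded prompt, so if it misreads
# the hidden part it can go off the rails no matter what you asked.
messages = build_model_input("Help me with my gerontology homework", history=[])
```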
