
The content filter on Bing's image generator actually pisses me off

My friend is having a spat with his husband over cheese right now, so I wanted to generate a picture of the situation. The prompt "man eating cheese while his partner looks on angrily" worked on the first try, giving a photo of a woman yelling at a man while he enjoyed his cheese. I changed "partner" to "husband" and got blocked by the filter, which called the prompt "unsafe". I tried a bunch of different variations, but all of them were deemed "unsafe". Even simplifying it down to "man angry at husband" got me stonewalled; every attempt at getting a photo of a man expressing negative emotions toward his partner was blocked. "Male friend"? Fine. "Another man"? Fine. But "boyfriend" and "husband" are both trigger words for the AI. I'm actually offended that the LGBTQIA++-inspired hand-holdy bullshit they put on the AI has wrapped around to being homophobic.

Here's as close as I got.

https://i.rdrama.net/images/16967379164465492.webp

Tl;dr: Kill AI jannies, behead AI jannies, round-house kick AI jannies into the dirt.

BTW, if you get this stupid pufferfish telling you there's high demand, it's bullshit. You just asked for something the algo doesn't like, but not something egregious enough to trigger the actual warning.

https://i.rdrama.net/images/1696740831010587.webp

!lgbt !gayporn !bottoms !reportmaxxers !pinknames


:!marseybooba:


"male roommate angry at man eating cheese" would probably work.

I know.... :marseygiveup:

They've relegated gay couples to "roommates." Fricking fascists! :marseymad:


They were roommates!!!!

https://i.rdrama.net/images/1696739111115541.webp


:!marseybooba:


They were roommates, yet they denied the possibility that they were an angrily married gay couple!

:marseyraging: :marseyraging: :marseyraging:


It matches the races every time?


It's instructed to randomize the races if not specified.


:!marseybooba:


It can even blackwash celebs.

I tried to make tradwife belle delphine (for research purposes :marseyveryworried: ) and 2 out of 3 pics are either sexy Indian dude or black.

I think it's purposefully randomizing race.


But what I mean is that the two people in the picture are the same race every time.

