Background: DALL-E 2 "improved" the diversity of its outputs by silently appending diversity keywords to users' prompts
tl;dr: it's the diversity stuff. Switch "cowboy" to "cowgirl", which disables the diversity stuff because the prompt now explicitly asks for a 'girl', and OP's prompt works perfectly.
And it turns out that, like here, if we mess around with trying to trigger or disable the diversity stuff, we can get fine samples out; the trigger word appears to be... 'basketball'! If 'basketball' is in the prompt and no identity-related keywords like 'black' are, then the full diversity filter gets applied and destroys the results. I have no idea why 'basketball' would be a key term here, but perhaps basketball is just so strongly associated with African-Americans that it got included somehow, e.g. via CLIP embedding distance?
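To make the theory concrete, here's a minimal sketch of what such a keyword-based injection could look like. This is pure guesswork, not OpenAI's actual code: the word lists, the injected modifiers, and the `maybe_inject_diversity` name are all made up for illustration.

```python
# Hypothetical reconstruction of the suspected keyword-based diversity filter.
# Everything here (word lists, function name, injected modifiers) is an
# assumption for illustration; it is not OpenAI's actual implementation.
import random

TRIGGER_WORDS = {"basketball"}                                        # assumed trigger term
IDENTITY_WORDS = {"black", "white", "asian", "girl", "woman", "man"}  # assumed identity terms
DIVERSITY_MODIFIERS = ["black", "female", "asian"]                    # assumed injected keywords

def maybe_inject_diversity(prompt: str) -> str:
    """Append a diversity keyword when a trigger term is present and the
    prompt does not already mention an identity. Substring matching means
    'cowgirl' counts as containing 'girl' and so disables the injection."""
    text = prompt.lower()
    has_trigger = any(word in text for word in TRIGGER_WORDS)
    has_identity = any(word in text for word in IDENTITY_WORDS)
    if has_trigger and not has_identity:
        return f"{prompt}, {random.choice(DIVERSITY_MODIFIERS)}"
    return prompt

print(maybe_inject_diversity("a cowboy dunking a basketball"))   # keyword injected
print(maybe_inject_diversity("a cowgirl dunking a basketball"))  # left untouched
```

The real system is presumably fancier than a literal word list (e.g. something embedding-based like CLIP similarity), but the behavior people are observing is consistent with something roughly this crude.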
“COMPUTER! Give me an image of a book”
“You mean two black dudes buttfricking each other”
“No. Let me rephrase. How about a cover of Harry Potter”
“You mean a black man jacking off in a wizard hat?”
“How about a picture of an apple?”
“You mean a black man ejaculating into another black man?”