Background: DALL-E 2 "improved" the diversity of its outputs by silently appending diversity keywords to users' prompts.
tldr: it's the diversity stuff. Switch "cowboy" to "cowgirl", which disables the diversity injection because the prompt now explicitly asks for a 'girl', and OP's prompt works perfectly.
And it turns out that, as here, if we experiment with triggering or disabling the diversity stuff, we can get fine samples out; the trigger word appears to be... 'basketball'! If 'basketball' is in the prompt and no identity-related keyword like 'black' is, then the full diversity filter gets applied and destroys the results. I have no idea why 'basketball' would be a key term here, but perhaps basketball is so strongly associated with African-Americans that it got included somehow, e.g. via CLIP embedding distance. A rough sketch of the hypothesized behavior is below.
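To make the guess concrete, here is a minimal Python sketch of the *speculated* logic, reverse-engineered purely from the observed behavior: the real DALL-E 2 pipeline is not public, and every name, keyword list, and the substring-matching choice here is an illustrative assumption, not actual OpenAI code. Substring matching is assumed because 'cowgirl' (which contains 'girl') apparently disables the injection.

```python
# Hypothetical reconstruction of the speculated diversity-injection logic.
# All keyword lists and names are guesses for illustration only.

TRIGGER_TERMS = ["basketball"]                      # terms speculated to trip the filter
IDENTITY_TERMS = ["black", "white", "asian", "girl", "woman"]  # guessed identity keywords
DIVERSITY_SUFFIXES = ["black", "female"]            # keywords reportedly appended silently


def maybe_inject_diversity(prompt: str) -> str:
    """Append diversity keywords unless the prompt already specifies an identity."""
    p = prompt.lower()
    triggered = any(t in p for t in TRIGGER_TERMS)
    has_identity = any(t in p for t in IDENTITY_TERMS)
    if triggered and not has_identity:
        # Silent injection: the user never sees the modified prompt.
        return prompt + " " + " ".join(DIVERSITY_SUFFIXES)
    return prompt


print(maybe_inject_diversity("a cowboy playing basketball"))   # keywords get appended
print(maybe_inject_diversity("a cowgirl playing basketball"))  # 'girl' disables injection
```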
I hate "machine learning" twitter people, they're so full of shit.
Got to save the world from skynet.