Background: DALL-E 2 "improved" the diversity of its outputs by silently appending diversity keywords to users' prompts.
tl;dr: it's the diversity stuff. Switch "cowboy" to "cowgirl" (which disables the diversity stuff, because the prompt now explicitly asks for a 'girl') and OP's prompt works perfectly.
And it turns out that, as here, if we mess around with trying to trigger or disable the diversity stuff, we can get out fine samples; the trigger word appears to be... 'basketball'! If 'basketball' is in the prompt and no identity-related keywords like 'black' are, then the full diversity filter is applied and destroys the results. I have no idea why 'basketball' would be a key term here, but perhaps basketball is just so strongly associated with African-Americans that it got included somehow, such as via CLIP embedding distance?
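For anyone who wants a concrete picture of what that kind of server-side prompt rewriting might look like, here's a minimal Python sketch of the hypothesized trigger logic. To be clear: the keyword lists, the suffixes, and the function name are all made up for illustration, and this is not OpenAI's actual implementation, just the rule the observations above seem to imply.

```python
import random

# Purely hypothetical sketch of the trigger logic described above; the keyword
# lists and function name are assumptions for illustration, not OpenAI's code.
IDENTITY_KEYWORDS = ["black", "white", "asian", "woman", "girl", "man"]  # assumed
TRIGGER_KEYWORDS = ["basketball"]                                        # from the observation above
DIVERSITY_SUFFIXES = ["black", "female", "asian"]                        # assumed

def maybe_inject_diversity(prompt: str) -> str:
    """Append a diversity keyword unless the prompt already mentions an identity."""
    lowered = prompt.lower()
    has_identity = any(kw in lowered for kw in IDENTITY_KEYWORDS)
    triggered = any(kw in lowered for kw in TRIGGER_KEYWORDS)
    if triggered and not has_identity:
        return prompt + ", " + random.choice(DIVERSITY_SUFFIXES)
    return prompt

print(maybe_inject_diversity("a cowboy dunking a basketball"))   # keyword silently appended
print(maybe_inject_diversity("a cowgirl dunking a basketball"))  # contains 'girl', left untouched
```

A soft-match version (e.g. flagging prompts whose CLIP embedding sits close to certain concepts rather than doing substring matching) would behave similarly but could pick up associations like 'basketball' without anyone ever writing it into a list.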
Glad to see that they didn't resolve this (at least not totally) after all. What a clusterfrick.
They didn't resolve it; they went even harder with the latest release and doubled down on the filtering, which is why it's so aggressive now.
I really thought the sign controversy (prompts asking for a sign with text rendered the injected keywords right on the sign) would mean they would revert it immediately.