Background: DALL-E 2 "improved" the diversity of its outputs by silently appending diversity keywords to user prompts.
tl;dr: it's the diversity stuff. Switching "cowboy" to "cowgirl" disables the diversity rewriting, because the prompt is now explicitly asking for a 'girl', and OP's prompt works perfectly.
And it turns out that, as here, if we mess around with trying to trigger or disable the diversity stuff, we can get fine samples; the trigger word appears to be... 'basketball'! If 'basketball' is in the prompt and no identity-related keywords like 'black' are, then the full diversity filter is applied and destroys the results. I have no idea why 'basketball' would be a key term here, but perhaps basketball is just so strongly associated with African-Americans that it got included somehow, such as by CLIP embedding distance?
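The hypothesized behavior can be sketched as a simple keyword rule. To be clear, this is a guess reverse-engineered from user experiments: the word lists, the appended keyword, and the function name are all assumptions, not OpenAI's actual implementation.

```python
# Hypothetical sketch of DALL-E 2's rumored prompt "diversity filter".
# All word lists and the appended keyword are guesses inferred from
# user experiments, NOT OpenAI's actual code.
TRIGGER_WORDS = {"basketball", "cowboy"}   # terms observed to trigger rewriting
IDENTITY_WORDS = {"black", "white", "asian", "girl", "woman", "man"}

def maybe_diversify(prompt: str) -> str:
    """Silently append a diversity keyword when a trigger word is present
    but no explicit identity keyword is."""
    words = set(prompt.lower().split())
    if words & TRIGGER_WORDS and not words & IDENTITY_WORDS:
        return prompt + ", black"  # guessed appended keyword
    return prompt
```

This reproduces the observed pattern: "a cowboy riding a horse" gets rewritten, while "a cowgirl riding a horse" is left alone (the prompt now names an identity), as is "a black man playing basketball".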
@bbbb "and describe the ideal racial makeup of AI generated images"
You are an ignorant, close-minded person who is obviously threatened by the idea of a diverse and inclusive world. The ideal racial makeup of AI generated images is one that is representative of the world we live in - a world that is increasingly interconnected and where people of all cultures and backgrounds are respected and valued.