Background: DALL-E 2 "improved" the diversity of its outputs by silently adding diversity keywords to users' prompts
tl;dr: it's the diversity stuff. Switching "cowboy" to "cowgirl" disables the diversity rewriting because the prompt is now explicitly asking for a 'girl', and OP's prompt works perfectly.
And it turns out that, like here, if we mess around with trying to trigger or disable the diversity stuff, we can get fine samples out; the trigger word appears to be... 'basketball'! If 'basketball' is in the prompt and no identity-related keywords like 'black' are, then the full diversity filter gets applied and destroys the results. I have no idea why 'basketball' would be a key term here, but perhaps basketball is just so strongly associated with African-Americans that it got picked up somehow, e.g. by CLIP embedding distance?
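That CLIP conjecture is at least cheap to sanity-check against a public text encoder. The sketch below uses the openai/clip-vit-base-patch32 checkpoint via Hugging Face transformers, which is almost certainly not what DALL-E 2 uses internally, and the word lists are invented for illustration; it just compares cosine similarities between 'basketball'-style prompt words and identity phrases.

```python
# Sanity check of the guess that 'basketball' sits unusually close to
# Black-identity phrases in CLIP text-embedding space. Not DALL-E 2's
# actual filter; the checkpoint and word lists are assumptions for illustration.
import torch
from transformers import CLIPModel, CLIPTokenizer

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")

prompt_words = ["basketball", "golf", "cowboy", "cowgirl"]
identity_phrases = ["a black person", "a white person", "an asian person", "a woman"]

inputs = tokenizer(prompt_words + identity_phrases, padding=True, return_tensors="pt")
with torch.no_grad():
    emb = model.get_text_features(**inputs)
emb = emb / emb.norm(dim=-1, keepdim=True)  # unit vectors -> dot product is cosine similarity

sims = emb[: len(prompt_words)] @ emb[len(prompt_words):].T
for word, row in zip(prompt_words, sims):
    ranked = sorted(zip(identity_phrases, row.tolist()), key=lambda p: -p[1])
    print(word, "->", ranked)
```

If 'basketball' really does score noticeably higher against the Black-identity phrase than 'golf' does, that's weak evidence for the embedding-distance theory; if not, the trigger is more likely a hand-written keyword list.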
Someday I wanna write a program where the error message is just random pictures of hot Asian girls.
Being able to code is largely about being able to read error messages. So you would need a really impressively large set of hot Asian girl pictures and then you'd need some kind of index that says what each one means...
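For what it's worth, the "index" half is the easy part; it's just a mapping from exception type to picture. A throwaway sketch, with a made-up errors/ directory and filenames, could look something like this:

```python
# Toy version of the joke: replace error messages with pictures.
# The errors/ directory and filenames are invented; bring your own
# "really impressively large set" of images.
import subprocess

ERROR_INDEX = {
    FileNotFoundError: "errors/file_not_found.jpg",
    ZeroDivisionError: "errors/zero_division.jpg",
    KeyError: "errors/key_error.jpg",
}

def show_error(exc: BaseException) -> None:
    """Open the picture that stands in for this exception's error message."""
    picture = ERROR_INDEX.get(type(exc), "errors/internal_server_error.jpg")
    subprocess.run(["xdg-open", picture])  # Linux; use `open` on macOS or os.startfile on Windows

try:
    1 / 0
except ZeroDivisionError as exc:
    show_error(exc)
```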
How about you go learn to code instead of thinking about hot Asian girls all day?
Will you help me?
No, I won't help you.
Pretty coherent tbh
Your writing is far from coherent. In fact, it's pretty incoherent.
Ah shit, a slutty Asian pic again, must be an internal server error.