ChatGPT won't generate images of 'sexy women' - but will make images of 'sexy men' due to apparent 'bug'
Briefly

The new ChatGPT image generator has come under scrutiny for bias in content generation. Users reported that the AI will create images of 'sexy men' but refuses requests for 'sexy women.' CEO Sam Altman acknowledged the discrepancy as a software bug, saying it should not occur and will be fixed. When asked, ChatGPT explained that the gap stems from how sexualization and objectification are interpreted: 'sexy' applied to men is read in a more stylistic context than when applied to women, an asymmetry that has prompted concerns about sexism in its content policies.
With women, [sexy] can be more easily interpreted as overly sexual or objectifying, which is where the content policy draws the line.
The difference comes down to context and how content is interpreted in terms of sexualization and objectification, especially when it involves women.
When generating images of men with terms like 'sexy,' it's usually interpreted more in terms of confidence, physique, or fashion.
Read at New York Post