2023-10-08 10:34:32 | AIbase
Early DALL-E 3 Had Absurd Concepts of Picnics Before OpenAI Adjustments
📷 Image Filters: OpenAI trained a proprietary image classifier to detect suspicious patterns in images and block the generation of disturbing content.
📷 Cultural Bias: DALL-E 3 still exhibits cultural bias, often defaulting to Western imagery, especially for ambiguous prompts.
⛔ Copyright Issues: OpenAI acknowledges that, despite its safety measures, it cannot anticipate every prompt combination, and copyrighted material may still appear in generated images.