Wow your imagination really took off there, because that… is not what I said.
The story mentions AI to make this sound like something new and specific to AI, whereas both heavy editing of portraits/videos and Eurocentric beauty standards are basically as old as photography itself.
I’m fine with criticizing the industry for racist stereotypes; can’t we also criticize it for perpetuating the myths of unrealistic perfect bodies and complexions?
Perhaps the designer had no clue the face had been changed and just doesn’t want to admit it. They hold tons of these events, with so many models wearing so many different outfits; maybe he just recognized the dress and shared.
This is a great example of how people are blind to their own bigotry (I guarantee they saw nothing wrong with this, which is why they proudly posted it), and why labels like "racist" aren’t self-imposed but given by others who can clearly see, and/or are directly impacted by, the bigot’s actions.
Airbrushing, Photoshop, and AI are specific methodologies; you can’t just use one as a generic term, because that will only create confusion and obscure the meaning, especially when the difference matters for legislation.
I never put out an article saying it was one of them definitively. I just said the style looked similar to another method.
“AI” is not a term for photo editing at all, much less a generic term for it. Using totally wrong words to refer to things doesn’t make you innovative, it makes you a crackpot or a liar.
This is seriously the dumbest take I’ve seen all week and I can’t believe how many people upvoted it.
Because the headline is about what the model said, and the model said that it was AI. Neither the headline nor the article says that she’s right about it, just that her face was altered dramatically, which is absolutely relevant when you consider some of the most widespread uses for visual AI right now. This might have just been Photoshop, but it looks a lot like some of the AI-powered TikTok filters, so it’s worth a conversation as to why we feel the need to do this to people’s faces at all.
Because the article specifically said “AI”. It could have just said “edited” and left it at that if the methodology was unknown. But when the methodology is both unknown and doesn’t affect the story at all, it’s bullshit to put a potential lie in the headline just to seem relevant to a different hot-topic issue.