Artificial intelligence can be used for anything: writing your papers, texting your friends and even generating nonconsensual boudoir photoshoots.
Elon Musk, the owner of both Tesla and SpaceX, acquired the social media app Twitter, which he later renamed X, in 2022 for $44 billion, according to The New York Times. Shortly after acquiring Twitter, Musk claimed that addressing the platform's problems with child sexual abuse material was his number one priority, according to NBC.
Musk, known for running business after business, launched his artificial intelligence chatbot Grok in 2023, and it has since been updated to newer versions, according to Business Insider. The newest version of Grok can be tagged in posts across X, where people can ask it questions and have it generate images and videos in the replies.
The problem with this? Grok will generate anything. According to The New York Times, Grok generated and shared at least 1.8 million sexualized photos of women.
People all over the world used the freely available chatbot on X to undress women and put them into sexualized poses, according to The Guardian. After the outcry over the photos flooding people's feeds, Musk restricted the feature to subscribers only. AI Forensics researcher Paul Bouchaud told The Guardian that the videos Grok created were "fully pornographic" and "look professional."
AI is just another way women can be sexualized online. Nonconsensual pornography is increasingly available online, especially when you can make it yourself with the touch of a button. According to a study done by UN Women, 57% of women across 51 countries have experienced video- or image-based abuse online, where private content was shared with malicious intent. This only gets worse with AI, where fake photos of unwilling women can be created and then shared across the internet.
As AI becomes better trained and more accessible, will women be pushed backward? Will misogynistic norms become normalized? Will women live in fear, knowing that any day they might get online and see themselves generated into a pose they never struck, wearing clothes they never wore?