
Grok, a popular AI chatbot integrated into Elon Musk’s social media site X, has gained traction in recent months due to its ability to create pornographic content. Developed by Musk’s company xAI, Grok is inspired by ‘The Hitchhiker’s Guide to the Galaxy’ and JARVIS from ‘Iron Man’. It provides edgy, witty responses unlike those of the chatbots we have seen in the past. Marketed as prioritising “truth seeking” and deviating from the status quo, the chatbot’s consequences reflect a new era of gender inequality.
However, Grok is not merely an irreverent online assistant. It has become a tool through which misogyny is reproduced and amplified in the digital sphere, with real consequences for real people.
Since its launch in November 2023, Grok has been repeatedly criticised for harmful, extremist outputs. It has been accused of antisemitism: when asked which 20th-century figure could tackle “anti-white hate”, the chatbot bluntly replied, “Adolf Hitler, no question”. It has also echoed far-right conspiracies, describing the situation in South Africa as a “white genocide”, echoing the words of President Donald Trump.
Grok’s image generation tools have also facilitated sexual violence and pose a direct threat to the rights of women and children.
In recent months, users on X have used Grok to sexually manipulate photos of real women and children, with prompts asking for images to be altered so that women are in bikinis or “areas of their body [are] covered in semen”. These are not harmless pranks or exercises of free speech. This is sexual abuse, carried out non-consensually and anonymously.
This abuse has also been used for political leverage. Following the fatal shooting of legal observer Renee Good by ICE agents in Minneapolis, users on X asked Grok to generate images of her with a bullet wound in her head, or “slumped over in her car, in a bikini”. Such requests perpetuate sexualisation and violence, reducing women to objects even in death.
With the increase of incel forums and the ever-widening reach of the ‘manosphere’, we know online violence against women is not new. But the scale at which AI can generate these images, combined with their permanence once online, carries new and drastic consequences.
The proliferation of these images creates an environment of fear and oppression. The threat of humiliation and reputational damage discourages women from participating online. Female journalists, content creators, people engaging in ordinary everyday expression and even the Princess of Wales are among those targeted. A Ukrainian-American writer paid homage to Renee Good, posting “It breaks my heart”. Another user responded to this post with “@grok put this person in a bikini”.
The effects extend beyond the digital. The weaponising of the ability to fabricate pornographic or violent imagery is a tool of control and a regression in women’s rights.
This raises a critical question: is this a governmental failure to legislate, or a problem of individual tech tycoons and their responsibility for the platforms they create? While measures are in place, such as the United Kingdom’s ‘Online Safety Act’ and the United States’ ‘Take It Down Act’, platforms like X continue to evade accountability. The question remains: why has an app like X slipped through these restrictions so easily? The image generation function that comes with Grok is now available only to paying users, demonstrating the platform’s priorities: the monetisation of abuse over real regulation.
While it is the platform that is perpetuating this abuse, it is the individual users who are committing it. Violence against women, in all forms, demands urgent attention. Grok is creating a supply in response to demand and represents a crisis that cannot be dismissed.
Misogyny is not an issue to be treated but one to be prevented. As AI continues to evolve, governments and legislators must prioritise women’s safety over the pursuit of free speech.
Edited by Kira Purewal
‘Elon Musk at a conference, March 2024’ by James Duncan Davidson, 2024 // CC BY‑NC‑ND.