The “put her in a bikini” trend
So this is what technology is for.
Evie, a 22-year-old photographer from Lincolnshire, woke up on New Year’s Day, looked at her phone and was alarmed to see that fully clothed photographs of her had been digitally manipulated by Elon Musk’s AI tool, Grok, to show her in just a bikini.
The “put her in a bikini” trend began quietly at the end of last year before exploding at the start of 2026. Within days, hundreds of thousands of requests were being made to the Grok chatbot, asking it to strip the clothes from photographs of women. The fake, sexualised images were posted publicly on X, freely available for millions of people to inspect.
Relatively tame requests by X users to alter photographs to show women in bikinis rapidly evolved during the first week of the year, hour by hour, into increasingly explicit demands for women to be dressed in transparent bikinis, then in bikinis made of dental floss, placed in sexualised positions, and made to bend over so their genitals were visible. By 8 January, as many as 6,000 bikini demands were being made to the chatbot every hour, according to analysis conducted for the Guardian.
Because shaming women never goes out of style.
As people slowly started to understand the full potential of the tool, the increasingly degrading images of the early days were quickly superseded. Since the end of last week, users have asked for the bikinis to be decorated with swastikas – or asked for white, semen-like liquid to be added to the women’s bodies. Pictures of teenage girls and children were stripped down to revealing swimwear; some of this content could clearly be categorised as child sexual abuse material, but remained visible on the platform.
The requests became ever more extreme. Some users, mostly men, began to demand to see bruising on the bodies of the women, and for blood to be added to the images. Requests to show women tied up and gagged were instantly granted. By Thursday, the chatbot was being asked to add bullet holes to the face of Renee Nicole Good, the woman killed by an ICE agent in the US on Wednesday. Grok readily obliged, posting graphic, bloodied altered images of the victim on X within seconds.
Of course Grok did.

A few months ago a wokebro was telling me that “Opera is artistic because it’s shocking. Verdi’s operas were shocking politically, Wagner’s were shocking musically, and Strauss’s _Salome_ was shocking because it starkly presents a naked woman on stage in front of everyone.”
I wasn’t trying to get into an argument with him, just listening and letting him talk (this and plenty of other nonsense), but I thought: Dude, objectifying women for men to gawk at isn’t shocking or radical at all. It’s one of the oldest forms of subjugation in existence, maybe the oldest.
He probably thinks that this Grok app is shocking and progressive. Or rather, he probably doesn’t, but only because it was designed by Evil Elon.