Of people
Ofcom has launched an investigation into Elon Musk’s X over concerns its AI tool Grok is being used to create sexualised images.
In a statement, the UK watchdog said there had been “deeply concerning reports” of the chatbot being used to create and share undressed images of people, as well as “sexualised images of children”.
Note the careful avoidance of the word “women”. Note how unlikely it is that all this “undressing” is equally distributed between women and men. Note the BBC’s staunch determination to erase women even from stories that affect them far more than they affect men.
The BBC has seen several examples of digitally altered images on X, in which women were undressed and put in sexual positions without their consent. One woman said more than 100 sexualised images have been created of her.
Finally – but that’s the 6th paragraph. The word “women” should have appeared at the top.
An Ofcom spokesperson did not give an indication of how long the investigation would take but said it would be a “matter of the highest priority”.
“Platforms must protect people in the UK from content that’s illegal in the UK,” they said. “We won’t hesitate to investigate where we suspect companies are failing in their duties, especially where there’s a risk of harm to children.”
But not women. Never ever ever women.
