Daniel Thomas, Senior tech reporter

An advert for a video and image editing tool that implied viewers could digitally remove a woman's clothing has been banned by the UK advertising regulator.
The YouTube ad for PixVideo - AI Video Maker, seen in January, showed a "before" and "after" image of a young woman, with red scribble overlaid on her midriff in the former, and parts of her bare skin exposed in the latter.
Text across the bottom of the picture stated: "Erase anything" followed by a heart-eyes emoji.
Eight people complained to the Advertising Standards Authority (ASA) that the ad sexualised and objectified women, and was irresponsible, offensive and harmful.
It is not clear whether the image in the ad is of a real person or is itself AI-generated; the ASA told the BBC that making such an assessment had not been part of its investigation.
The regulator said that PixVideo did not permit its users to remove clothing from digital images to create sexually explicit content, but that viewers may have got the impression that it did.
"Because the ad implied that viewers could use an app to remove a woman's clothing, we considered it condoned digitally altering and exposing women's bodies without their consent," the regulator said in a statement.
It added that the ad was "irresponsible, included a harmful gender stereotype and was likely to cause serious offence".
Saeta Tech, which owns PixVideo, said it understood why the ad was likely to cause offence, but attributed this to the ad's presentation and messaging rather than the intended use of its product.
It said it prohibited the creation of nude or sexually explicit content and had automated detection and blocking tools to prevent such imagery from being generated.
The company has agreed not to show the ad again and has paused all advertising while it carries out an internal review.
The issue of apps that "declothe" women and girls without their consent hit the headlines in January, when Elon Musk's chatbot Grok was used to flood X with sexualised images.
The UK government announced in December that it would make it illegal to create and supply AI tools that let users edit images to seemingly remove someone's clothing.
The new offences will build on existing rules around sexually explicit deepfakes and intimate image abuse.