Meta Cracks Down on Ads for ‘Nudify’ Tools

Meta has removed multiple ads promoting AI tools that generate sexually explicit deepfake content of real individuals. A CBS News investigation discovered numerous ads for these apps in Instagram’s Stories feature, estimating “hundreds” across Meta’s platforms. Some ads claimed to let users view anyone naked by uploading a photo or video, and clicking them redirected users to websites promoting the creation of explicit content from real people’s images. A Meta representative confirmed the removal of the ads, the deletion of associated Pages, and the permanent blocking of the apps’ URLs. Despite these efforts, ads for “nudify” apps persist on Apple’s App Store, Google’s Play Store, and Instagram. Meta’s Oversight Board has also highlighted inconsistencies in content moderation, citing a case in which a manipulated video remained online despite multiple scam reports. (Newser)
