Meta Cracks Down on Ads for ‘Nudify’ Tools

Meta has removed multiple ads promoting AI tools that generate sexually explicit deepfake content of real individuals. A CBS News investigation discovered numerous ads for these apps in Instagram’s Stories feature, estimating “hundreds” across Meta’s platforms. Some ads claimed to let users view anyone naked by uploading a photo or video. Clicking on these ads redirected users to websites promoting the creation of explicit content using real people’s images. A Meta representative confirmed the removal of the ads, the deletion of associated Pages, and the permanent blocking of the apps’ URLs. Despite these efforts, ads for “nudify” apps persist on Apple’s App Store, Google’s Play Store, and Instagram. Meta’s Oversight Board has also highlighted inconsistencies in the company’s content moderation, citing a case in which a manipulated video remained online despite multiple scam reports. (Newser)
