Meta Cracks Down on Ads for ‘Nudify’ Tools

Meta has removed multiple ads promoting AI tools that generate sexually explicit deepfake content of real individuals. A CBS News investigation found numerous ads for these apps in Instagram’s Stories feature and estimated that “hundreds” of such ads had run across Meta’s platforms. Some of the ads claimed users could see anyone naked simply by uploading a photo or video, and clicking on them redirected users to websites promoting the creation of explicit content from real people’s images. A Meta representative confirmed that the ads were removed, the associated Pages deleted, and the apps’ URLs permanently blocked. Despite these efforts, ads for “nudify” apps persist on Apple’s App Store, Google’s Play Store, and Instagram. Meta’s Oversight Board has also highlighted inconsistencies in content moderation, citing a case in which a manipulated video remained online despite multiple scam reports. (Newser)
