WWW.404MEDIA.CO
Meta Sues Nudify App That Keeps Advertising on Instagram
Meta said it is suing a nudify app that 404 Media reported bought thousands of ads on Instagram and Facebook, repeatedly violating its policies.

Meta is suing Joy Timeline HK Limited, the entity behind the CrushAI nudify app, which allows users to take an image of anyone and AI-generate a nude image of them without their consent. Meta said it filed the lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent the company from advertising CrushAI apps on Meta platforms.

In January, 404 Media reported that CrushAI, also known as Crushmate and other names, had run more than 5,000 ads on Meta's platforms, and that 90 percent of Crush's traffic came from Meta's platforms, a clear sign that the ads were effective in leading people to tools that create nonconsensual media. Alexios Mantzarlis, now of Indicator, was the first to report on Crush's traffic coming from Meta. At the time, Meta told us: "This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content."

"This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a post on its site announcing the lawsuit. "We'll continue to take the necessary steps, which could include legal action, against those who abuse our platforms like this."

However, CrushAI is far from the only nudify app to buy ads on Meta's platforms.
Last year I reported that these ads were common, and despite our reporting leading to the ads being removed and to Apple and Google removing the apps from their app stores, new apps and ads continue to crop up.

To that end, Meta said that when it removes ads for nudify apps, it will now share URLs for those apps and sites with other tech companies through the Tech Coalition's Lantern program, so those companies can investigate and take action against them as well. Members of that group include Google, Discord, Roblox, Snap, and Twitch. Additionally, Meta said that it's strengthening its enforcement against these adversarial advertisers.

"Like other types of online harm, this is an adversarial space in which the people behind it, who are primarily financially motivated, continue to evolve their tactics to avoid detection. For example, some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block," Meta said. "That's why we're also evolving our enforcement methods. For example, we've developed new technology specifically designed to identify these types of ads, even when the ads themselves don't include nudity, and use matching technology to help us find and remove copycat ads more quickly. We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads."

From what we've reported, and according to testing by AI Forensics, a European non-profit that investigates influential and opaque algorithms, content in Meta's ads generally does not appear to be moderated as effectively as regular content users post to Meta's platforms.
Specifically, AI Forensics found that the exact same image containing nudity was removed as a normal post on Facebook but allowed when it was part of a paid ad.

404 Media's reporting has led to some pressure from Congress, and Meta's press release did mention the passage of the federal Take It Down Act last month, which holds platforms liable for hosting this type of content, but said the law was not the reason for taking these actions now.