
Meta has sued Joy Timeline HK Limited after the company repeatedly evaded ad filters to promote CrushAI “nudify” apps that use AI to remove clothing from images without consent.
At a Glance
- Meta filed a lawsuit in Hong Kong against Joy Timeline HK, developers of CrushAI nudify apps
- CrushAI ads appeared on Facebook, Instagram, Messenger, and Threads despite policy bans
- Tens of thousands of ads ran across more than 135 Facebook pages and 170 business accounts
- Meta claims CrushAI repeatedly circumvented its ad review systems since 2023
- The company is also deploying new detection tools and sharing data via the Tech Coalition program
Explosive Ad Campaign Exposed
Meta alleges that CrushAI ads running across Facebook, Instagram, Messenger, and Threads promised users they could “erase any clothes” and “see anyone naked.” Despite repeated removals since 2023, tens of thousands of ads persisted across more than 135 Facebook pages and 170 business accounts, according to Meta’s filing. Investigators at CBS News found hundreds of such ads still active, prompting Meta’s legal escalation.
Meta’s Legal and Tech Response
The lawsuit, filed in Hong Kong, seeks to stop Joy Timeline HK Limited from placing ads and to bar CrushAI from further advertising on Meta platforms. According to Investopedia, Meta also announced the rollout of AI-driven detection tools and a plan to share more than 3,800 URLs linked to nudify tools through the Tech Coalition’s Lantern threat-sharing program.
Ethical Concerns and Wider Impact
Critics warn these nudify apps enable non-consensual intimate imagery and facilitate blackmail, sextortion, and abuse—especially targeting women and minors. The New York Post recently reported on teens using such tools to generate fake nudes of classmates. Separately, the San Francisco City Attorney’s office has sued 16 deepfake sites for similar offenses involving AI exploitation and harassment.
Beyond these immediate abuses, experts caution that nudify apps normalize consent violations and are fast becoming vehicles for cyberbullying and fraud. Meta’s combined legal and technical response signals a broader industry reckoning with the proliferation of AI-enabled image manipulation and its legal and ethical boundaries.