26 Nov, 2025 15:02

Meta turned blind eye to sex trafficking – court filings

The company had a policy allowing 16 violations, such as adults soliciting minors, before it suspended accounts

Facebook’s parent company, Meta, failed to act promptly against accounts engaged in sex trafficking, allowing illicit content to remain on its platforms despite repeated violations, recently unsealed court filings show.

The accusation is part of a lawsuit filed in California by more than 1,800 plaintiffs – including school districts, children and parents, and state attorneys general – alleging that social media giants “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health.” Alongside Meta – which owns Facebook, Instagram, WhatsApp, and Threads – the suit targets Google’s YouTube, ByteDance’s TikTok, and Snap’s Snapchat.

Former Instagram safety chief Vaishnavi Jayakumar testified she was shocked to learn that Meta maintained a “17-strike” policy for accounts allegedly involved in human sex trafficking.

“You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” she said, calling the threshold “very, very high” by industry standards.

The brief alleges Meta was aware of serious harms on its platforms, including millions of adult strangers contacting minors, products that worsened teen mental-health issues, and frequent detection – but rare removal – of content related to suicide, eating disorders and child sexual abuse.

Responding to the allegations, Meta told USA Today that it has replaced the former 17-strike system with a “one strike” policy and now immediately removes accounts involved in human exploitation.

The company has come under mounting scrutiny in the US. Earlier this year, reports that Meta’s AI chatbots could engage minors in sensual exchanges led to new safeguards for teen accounts, giving parents the option to block interactions with the bots.

Meta is also confronting expanding legal and regulatory challenges globally. Russia designated the firm an “extremist organization” in 2022 for refusing to remove prohibited content. The tech giant is facing multiple actions in the EU, including a €797 million antitrust fine tied to Facebook Marketplace, as well as separate copyright, data-protection and targeted advertising cases in Spain, France, Germany, and Norway.
