
Meta allowed sex-trafficking posts on Instagram as it put profit over kids’ safety

A new court filing alleges that Meta, the owner of Facebook and Instagram, had a policy allowing sex traffickers to post content related to sexual solicitation or prostitution 16 times, with accounts suspended only on the 17th strike. The filing accuses Meta, along with Google’s YouTube, Snapchat owner Snap, and TikTok, of prioritizing profit and user engagement over the safety of children. The plaintiffs seek damages and a court order requiring the companies to stop harmful conduct and to warn users and parents about the addictive and dangerous nature of their products.

Meta denied the allegations, stating that it has made efforts to protect teens, such as introducing Teen Accounts with built-in protections. Snap also defended its platform, emphasizing its safety features and privacy measures. Google and TikTok did not immediately respond to requests for comment.

The filing cited internal company communications, research reports, and sworn depositions by current and former employees, all of which remain under court seal. It claimed that Instagram recommended nearly 2 million minors to adults seeking to groom children, and that Facebook’s recommendation feature was responsible for 80% of violating adult/minor connections.

The filing also alleged that Meta failed to take adequate steps against child exploitation imagery even when its systems identified such content with 100% confidence. Instead of automatically deleting it, the company hesitated to tighten enforcement for fear of false positives. The filing claimed that Meta prioritized user engagement over safety, delaying changes that could have protected children from adult predators.

The lawsuit also criticized Meta’s impact on children’s mental health and the educational environment, accusing the company of lying to Congress about its efforts to increase user engagement. Internal messages revealed that Meta researchers considered themselves “pushers” and acknowledged that teens were hooked on their products despite negative effects on their well-being.

Overall, the filing painted a damning picture of Meta and other tech companies, alleging that they knowingly harmed children for financial gain.

Senate Hearing Reveals Alarming Statements from Former Meta Executives

In a recent Senate hearing, a former company representative (whose identity remains undisclosed) was asked whether Facebook could detect a correlation between increased usage of its platform by teenage girls and heightened signs of anxiety. The representative responded with a firm “No,” according to the filing.

Contradicting Meta’s public statement that only half a percent of users were exposed to suicide and self-harm content, the filing revealed that the actual number was closer to 7%.

Brian Boland, a former vice-president at Meta who served for over 11 years before leaving in 2020, expressed his skepticism in a deposition, stating, “My belief then and now is that they do not genuinely prioritize user safety.”

Additionally, when a Meta employee raised concerns about the company’s decision to hide “likes” on posts to safeguard children’s mental well-being, a member of Meta’s growth team callously responded, “It’s a social comparison app. [Expletive] get used to it,” as detailed in the filing.
