Ahead of a Big Tech congressional hearing on Wednesday, at which X, Meta, TikTok, and other platforms will discuss how they safeguard minors online, X, formerly Twitter, is seeking to reassure lawmakers. Over the weekend, Bloomberg reported that X would hire 100 full-time content moderators for a new “Trust and Safety” center in Austin, Texas. Elon Musk purchased the company more than a year ago and substantially cut staffing, including trust and safety teams, moderators, and engineers.
Axios also reported that X CEO Linda Yaccarino met with a bipartisan group of senators, including Sen. Marsha Blackburn, last week ahead of the hearing. The executive told legislators how X was fighting child sexual exploitation (CSE) on its platform.
Twitter long struggled to moderate CSE material, a failure that led to a 2021 child safety lawsuit. Musk inherited the CSE problem from Twitter’s prior management, along with many other issues, but the trust and safety layoffs have raised concerns that the problem has worsened.
Musk pledged that tackling CSE material would be his top priority at Twitter, but a 2022 Business Insider investigation showed that users were still soliciting it. The company added a dedicated CSE reporting option that year. In 2023, Musk reinstated an account that had posted CSE imagery, raising questions about X’s policy enforcement. The New York Times revealed last year that CSE imagery, including widely shared and easily identifiable content, remained live on X after the company was warned. That finding contradicted X’s claims that it was actively addressing the issue with account bans and search restrictions.
Bloomberg’s report on X’s plan to add moderators did not say when the new center would open, but it noted that the moderators would be full-time corporate employees.
“X does not have a line of business focused on children, but it’s important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE content,” X executive Joe Benarroch told the publication.
On Friday, X published a post on its CSE efforts, saying it had suspended 12.4 million accounts in 2023, up from 2.3 million in 2022. The company also said it sent 850,000 reports to the National Center for Missing and Exploited Children (NCMEC) last year, eight times more than in 2022. These figures are meant to reflect a stronger response to the problem, but they may also suggest that more people are using X to spread CSE material.