X, formerly Twitter, is attempting to placate lawmakers over the app's safety measures ahead of a Big Tech Congressional hearing on Wednesday, which will focus on how companies like X, Meta, TikTok, and others are protecting children online. Over the weekend, the social media company announced via Bloomberg that it would staff a new "Trust and Safety" center in Austin, Texas, which will include 100 full-time content moderators. The move comes over a year after Elon Musk acquired the company, a takeover that saw him drastically reduce headcount, including trust and safety teams, moderators, engineers, and other staff.
In addition, Axios earlier reported that X CEO Linda Yaccarino had met last week with bipartisan members of the Senate, including Sen. Marsha Blackburn, in advance of the upcoming hearing. The executive was said to have discussed with lawmakers how X was combating child sexual exploitation (CSE) on its platform.
As Twitter, the company had a rough history with properly moderating for CSE, an issue that was the subject of a child safety lawsuit in 2021. Although Musk inherited the problem from Twitter's former management, along with many other struggles, there has been concern that the CSE problem has worsened under his leadership, particularly given the layoffs of trust and safety team members.
After taking the reins at Twitter, Musk promised that addressing the problem of CSE content was his No. 1 priority, but a 2022 report by Business Insider indicated that there were still posts where people were requesting the content. That year, the company also added a new feature for reporting CSE material. However, in 2023, Musk welcomed back an account that had previously been banned for posting CSE imagery, leading to questions around X's enforcement of its policies. Last year, an investigation by The New York Times found that CSE imagery continued to spread on X's platform even after the company was notified, and that widely circulated material that is easier for companies to identify also remained online. The report stood in stark contrast to X's own statements claiming the company had aggressively approached the problem with increased account suspensions and changes to search.
Bloomberg's report on X's plan to add moderators was light on key details, such as when the new center would open. It did note, however, that the moderators would be employed full-time by the company.
"X does not have a line of business focused on children, but it's important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE content," an executive at X, Joe Benarroch, told the outlet.
X also published a blog post on Friday detailing its progress in combating CSE, noting that it suspended 12.4 million accounts in 2023 for CSE violations, up from 2.3 million in 2022. It also sent 850,000 reports to the National Center for Missing and Exploited Children (NCMEC) last year, more than eight times the number sent in 2022. While these metrics are meant to demonstrate an increased response to the problem, they could also indicate that those seeking to share CSE content are increasingly using X to do so.