(San Francisco)
The initial goal of this “Center of Excellence for Security” will be to recruit around “a hundred moderators”, focused primarily on content involving the sexual exploitation of minors but also on other violations of the social network's rules, Joe Benarroch, director of operations at X, told AFP on Saturday.
“X does not have a child-focused business,” Benarroch emphasized, “but it is important that we make these investments to prevent offenders from using our platform for ANY type of distribution or engagement with content relating to sexual abuse of minors”.
The company bought by Elon Musk at the end of 2022 published a press release on Friday on its efforts in this area, saying it was “determined to make X inhospitable for actors who seek to exploit minors”.
Mr. Benarroch also noted that children under 13 cannot open an account. Minors who register are subject to stricter data-privacy rules and are not targeted with advertising.
The announcements come ahead of a major hearing in the U.S. Senate on Wednesday, titled “Tech Giants and the Online Child Sexual Abuse Crisis.”
The Judiciary Committee has summoned the heads of Discord, Meta, Snap, TikTok and X. Linda Yaccarino, X's chief executive, will attend.
She was already in Washington last week for meetings with lawmakers from both parties on child protection, platform moderation, disinformation and artificial intelligence.
“We wanted to help senators and their aides understand how new a company X is, and explain what has changed over the last 14 months […] particularly regarding the sexual exploitation of children,” Joe Benarroch explained.
He added that more than 2,000 people, employed by X or its subcontractors, moderate content on the platform.
Elon Musk bought Twitter promising to restore “freedom of expression”. Many rules were removed or relaxed, and many previously banned figures were allowed to return.
In December, Brussels opened a “formal investigation” into X over alleged breaches of new European rules on content moderation and transparency, including an insufficient number of moderators and ineffective reporting mechanisms for illegal content.