X to ramp up content moderation with 100 new hires

  • False information continues to spread on X after many of the employees responsible for monitoring content were let go to cut costs.
  • The spread of AI-generated explicit images of Taylor Swift is expected to accelerate Musk’s efforts to plug the holes.

Is Elon Musk rethinking his content moderation strategy?

On Saturday, Joe Benarroch, head of business operations at Musk’s X social media platform, said the company would open a new content moderation center in Austin, Texas, and hire 100 full-time employees. The hub, which X hopes to have up and running by the end of the year, will focus on cracking down on child abuse-related content and enforcing the platform’s hate speech policies.

In a blog post on Friday, BTW Media noted that X had suspended 12.4 million accounts last year for violating its rules against child sexual exploitation, up sharply from 2.3 million the year before. X said the new Austin office will also help the company fight other types of harmful content.

In October 2022, Musk slashed hundreds of content moderator roles, raising concerns about the rise of hate speech on the platform. Musk, a self-described “free speech absolutist,” had previously excoriated the platform for “fundamentally undermining democracy by failing to uphold the principles of free speech.”

Also read: Elon Musk’s X faces legal hurdles: Content moderation law sparks social media dilemma

It is worth noting that X users must be at least 13 years old, and 13-to-17-year-olds account for less than 1% of its daily users. Even so, the platform is awash with questionable content, which has left ad revenue stagnant. Analysts believe the establishment of the content moderation center may signal that Musk is rethinking his strategy.

Digital Services Act requires compliance

As early as November 2023, a senior European Commission official said that Elon Musk’s X platform had only 2,294 content moderators ensuring that users comply with the EU’s online content rules, far fewer than Google or TikTok. The European Union recently passed the Digital Services Act (DSA), which requires 19 major web platforms and two major web search engines (among them Google, X, TikTok, Apple, Meta, and Microsoft) to do more to tackle illegal and harmful content on their platforms.


Fei Wang

Fei Wang is a reporter at BTW Media covering Internet governance and IT infrastructure. She is studying bilingual broadcasting and hosting at Communication University of Zhejiang. Send tips to f.wang@btw.media
