Social media platform X, owned by Elon Musk, has agreed to review reports of suspected illegal hate and terrorist content within an average of 24 hours under commitments accepted by UK regulator Ofcom.
Ofcom's online safety director Oliver Griffiths called the commitments a "step forward," particularly in light of recent religiously motivated crimes targeting Jewish communities in the UK.
The announcement follows Ofcom's compliance program launched in December, which assesses whether major social media platforms have adequate systems for handling reports of illegal hate and terror material. Griffiths noted evidence that such content "persists on some of the largest social media sites."
Under the agreement, X will submit performance data to Ofcom every three months for a year. While the target is an average review time of under 24 hours, X has also pledged to assess at least 85% of reports within 48 hours.
X made two additional commitments. First, it will consult experts on its reporting systems for illegal content, addressing concerns from organizations that had flagged multiple pieces of suspected content but received no confirmation of any action. Second, it will withhold UK access to accounts determined to be operated by, or on behalf of, terrorist organizations proscribed in the UK.
Danny Stone, chief executive of the Antisemitism Policy Trust, called the action a "good start" but said X was "failing in so many regards to tackle open racism on its platform." Iman Atta, director of Tell Mama, welcomed the updated targets, saying they signaled "a more accountable approach," but added that the test is "not what is promised, but what is delivered."
Ofcom's separate investigation into X's AI tool Grok, over concerns it was used to create sexualized images, remains ongoing.