The UK's media regulator, Ofcom, has initiated a formal investigation into the messaging platform Telegram over concerns that it may not be adequately preventing the sharing of child sexual abuse material (CSAM).
Ofcom announced on Tuesday that it was launching the probe after gathering evidence suggesting CSAM was present and being distributed on the popular service. Under current UK law, user-to-user platforms must implement systems to prevent users from encountering illegal content like CSAM and have mechanisms to address it—failure to comply can result in substantial fines.
Telegram has strongly denied the allegations. In a statement, the company said: "Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with non-governmental organizations." It added, "We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy."
This investigation is part of Ofcom's wider enforcement of the UK's Online Safety Act, which imposes stricter requirements on tech companies to tackle illegal content, including CSAM, terrorism, grooming, and extreme pornography. The Act's illegal content duties came into effect in March 2025.
"Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities," said Suzanne Cater, director of enforcement at Ofcom.
Cater noted that while progress has been made in addressing CSAM on smaller services like file-sharing platforms, the issue "extends to big platforms too."
The probe has been welcomed by child protection organizations. The NSPCC highlighted that recent research shows around 100 child sexual abuse image offences are recorded by police daily in the UK. Rani Govender, associate head of policy at the NSPCC, stated: "The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it."
The Internet Watch Foundation (IWF), which works to identify and remove CSAM online, also expressed support. Emma Hardy, IWF communications director, said the organization shares concerns about "bad actor networks" on Telegram and that "not enough is being done to prevent known, detected, child sexual abuse imagery from being distributed." Hardy urged Telegram to expand safeguards across its platform, including to chats protected by end-to-end encryption.
Ofcom said it launched the investigation after being contacted by the Canadian Centre for Child Protection regarding alleged CSAM activity on Telegram. The regulator has also begun probes into the services Teen Chat and Chat Avenue over potential grooming risks identified through work with child protection agencies.
"Teen-focused chat services are too easily being used by predators to groom children," Cater warned. "These firms must do more to protect children, or face serious consequences under the Online Safety Act."
Ofcom has the authority to impose fines of up to £18 million or 10% of a company's global revenue, whichever is higher, for non-compliance. While some firms have resisted enforcement, Ofcom noted that one file-sharing service it contacted has already made "material improvements" to meet its duties.