The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent child sexual abuse material (CSAM) from being shared.
Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.
Under the current law, user-to-user services operating in the UK must have systems in place to prevent people from encountering CSAM and other illegal content, as well as mechanisms to tackle it - or risk huge fines for breaches.
Telegram said in a statement that it 'categorically denies Ofcom's accusations'.
'Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations],' it told the BBC.
The company added: 'We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy.'
The investigation is part of a wider crackdown by Ofcom on services it suspects of flouting the UK's sweeping online safety requirements - including toughened-up rules for tech firms to tackle CSAM, which is illegal to possess or share in the UK.
'Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities,' said Suzanne Cater, director of enforcement at Ofcom.
She added that while there had been progress with tackling CSAM on smaller services, including file-hosting and sharing platforms, the issue 'extends to big platforms too'.
Children's charity the NSPCC welcomed Ofcom's Telegram probe.
'Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day,' said Rani Govender, its associate head of policy.
'The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram.'
Ofcom launched its probe into Telegram after being contacted by the Canadian Centre for Child Protection, which reported the alleged presence of CSAM on the app.
It has also opened investigations into the services Teen Chat and Chat Avenue over potential grooming risks identified through its work with child protection agencies.
'Teen-focused chat services are too easily being used by predators to groom children,' Cater said.
'These firms must do more to protect children, or face serious consequences under the Online Safety Act.'
The Act's illegal content duties, which took effect in March 2025, require so-called user-to-user services like messaging apps and social networks to prove they are tackling 'priority illegal content'.
This includes CSAM, terrorism, grooming and extreme pornography.
Ofcom has issued several fines to providers accused of failing to comply with duties around illegal content or age checks.
For non-compliance, it can impose fines of up to £18m or 10% of a firm's global revenue - whichever is greater.
However, its rules have met resistance from some firms, with US message-board 4chan recently mocking the regulator's threats with memes.
Despite this, Ofcom said one file-sharing service it contacted over its systems for handling illegal content had made 'material improvements' to comply with its requirements.