Introduction: The Online Safety Act gets its teeth
The United Kingdom's communications regulator, Ofcom, has initiated a formal investigation into the messaging platform Telegram, citing evidence that the service is being used to share child sexual abuse material (CSAM). This move marks one of the first major enforcement actions under the country's new Online Safety Act (OSA), signaling a new era of regulatory scrutiny for tech companies operating in the UK.
The investigation into Telegram, announced on May 29, 2024, follows earlier information-gathering notices sent in February to three other platforms popular with young people: Yubo, Skout, and MeetMe. Ofcom is leveraging its newfound powers to compel these companies to disclose how they are protecting children from illegal and harmful content. The outcome of this investigation will set a critical precedent for how online platforms are held accountable for user safety on their services.
Technical analysis: Platform features as vectors for abuse
This case does not revolve around a traditional software vulnerability or a system hack. Instead, the investigation focuses on how Telegram's core design and features, built to promote privacy and easy communication, can be systematically misused for criminal activity. Ofcom's technical scrutiny will likely center on several key areas.
Channels and Private Groups: Telegram allows the creation of massive public and private channels, some with hundreds of thousands of members. While public channels are more easily monitored, private, invite-only groups can become insulated ecosystems for sharing illegal material. The ease of creating and sharing links to these groups allows illicit communities to form and persist, often just below the surface of automated detection systems.
The Encryption Debate: Telegram's brand is heavily associated with strong security. It's important to distinguish between its different chat types. Its "Secret Chats" feature uses end-to-end encryption (E2EE), meaning only the sender and recipient can read the messages. However, its standard cloud chats, groups, and channels are encrypted between the client and the server, which means Telegram has technical access to the content and can moderate it. Despite this, the platform's reputation for privacy attracts users who wish to evade detection. Ofcom's investigation will undoubtedly probe the effectiveness of Telegram's moderation of these non-E2EE spaces, where the bulk of large-scale sharing is believed to occur.
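The distinction above comes down to who holds the decryption key. The toy model below (not real cryptography; the class names and XOR "cipher" are illustrative stand-ins, not Telegram's actual protocol) shows why a server that holds the key can moderate content, while an end-to-end encrypted relay cannot:

```python
from dataclasses import dataclass, field

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'cipher': XOR with a repeating key. NOT secure."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

@dataclass
class CloudChatServer:
    """Models a standard cloud chat: the server holds the key, so it
    can decrypt stored messages and therefore scan them."""
    server_key: bytes = b"server-held-key"
    stored: list = field(default_factory=list)

    def receive(self, plaintext: str) -> None:
        # Client-server encryption: the server encrypts for storage
        # with a key it controls.
        self.stored.append(xor_cipher(plaintext.encode(), self.server_key))

    def moderate(self) -> list:
        # Server-side decryption is possible, so content scanning is too.
        return [xor_cipher(c, self.server_key).decode() for c in self.stored]

@dataclass
class SecretChatServer:
    """Models an E2EE 'Secret Chat': the server relays ciphertext it
    has no key for, so it cannot inspect the content."""
    stored: list = field(default_factory=list)

    def relay(self, ciphertext: bytes) -> None:
        self.stored.append(ciphertext)  # opaque bytes; no key, no scanning
```

The design choice is the whole debate in miniature: moderation power and server-side readability are the same property, which is why regulators focus on the non-E2EE spaces where the platform retains technical access.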
Content Moderation at Scale: Like all platforms reliant on user-generated content, Telegram faces the immense challenge of policing billions of messages daily. The investigation will assess the adequacy of its content moderation systems, which on large platforms typically combine user reports, hash-matching technology such as PhotoDNA to detect known CSAM, machine-learning classifiers, and human review teams. Ofcom will want to know whether Telegram's investment in these safety measures is proportional to the risks present on its platform.
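The hash-matching step works by comparing each upload against a database of hashes of known illegal images. A minimal sketch, assuming a vetted hash database exists: real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas SHA-256 here is an exact-match stand-in, and the sample bytes are purely illustrative.

```python
import hashlib

# Hypothetical blocklist. In production this would be a vetted database
# of hashes supplied by bodies such as NCMEC or the IWF, not built inline.
KNOWN_HASHES = {
    hashlib.sha256(b"known-illegal-image-bytes").hexdigest(),
}

def scan_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known hash and should be
    blocked and reported."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

Exact hashing only catches previously identified material; novel content slips past this check, which is why platforms layer classifiers and user reports on top of it.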
Impact assessment: High stakes for platforms and users
The implications of Ofcom's investigation are substantial and extend beyond the companies directly involved.
For the Platforms: Telegram, Yubo, Skout, and MeetMe face severe consequences if found non-compliant with the Online Safety Act. Ofcom has the power to levy fines of up to £18 million or 10% of a company's global annual revenue, whichever is higher. For a platform with Telegram's reach, this could amount to a staggering financial penalty. Beyond fines, the regulator can force platforms to implement specific safety technologies or, in extreme cases, compel internet service providers to block access to the service in the UK. The reputational damage from being formally identified as a hub for CSAM can also be devastating to user trust and growth.
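The fine cap described above is simply the greater of the two figures. A short worked example, with hypothetical revenue numbers, shows where each limb of the cap binds:

```python
# Online Safety Act maximum fine: the greater of GBP 18 million or
# 10% of qualifying worldwide revenue. Revenue figures below are
# hypothetical, for illustration only.
FLAT_CAP_GBP = 18_000_000
REVENUE_SHARE = 0.10

def max_fine(global_annual_revenue_gbp: float) -> float:
    """Upper bound on an OSA fine for a given worldwide revenue."""
    return max(FLAT_CAP_GBP, REVENUE_SHARE * global_annual_revenue_gbp)

# For a smaller platform the flat cap binds:
#   max_fine(50_000_000)     -> 18,000,000
# For a large platform the revenue share dominates:
#   max_fine(1_000_000_000)  -> 100,000,000
```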
For Children and Society: The primary victims are the children exploited in the creation of CSAM and those exposed to harmful content on these platforms. A successful enforcement action by Ofcom could compel these services to become safer environments, forcing them to invest more heavily in proactive detection and faster removal of illegal material. This serves the broader public interest in making the internet a safer place for its most vulnerable users.
For the Tech Industry: This investigation sends a clear message to all online platforms operating in the UK: the era of self-regulation is over. Companies will be watching closely to see how Ofcom interprets its powers and what standards of care it expects. This will likely trigger proactive reviews of safety policies and moderation systems across the industry to avoid similar regulatory action.
How to protect yourself and your family
While regulators and platforms have a duty to ensure safety, users can also take steps to mitigate risks. The responsibility for online safety is shared, and proactive measures at home are essential.
For Parents and Guardians:
- Start a Conversation: Talk openly with your children about the apps they use and the people they interact with online. Create an environment where they feel comfortable reporting anything that makes them feel unsafe or uncomfortable.
- Review Privacy Settings: Sit down with your child and review the privacy and security settings on their accounts. On platforms like Telegram, this includes controlling who can add them to groups, who can see their phone number, and who can message them directly.
- Use Platform Reporting Tools: Teach your children how to use the built-in report and block features on all social media and messaging apps. Reporting illegal content is a critical step in getting it removed and protecting others.
- Understand the Apps: Take time to understand how the apps your children use actually work. What are the group features? Is there live streaming? Knowing the platform's capabilities helps you understand the risks.
For All Users:
- Be Skeptical of Invitations: Do not join random groups or channels, especially if you are invited by someone you do not know. These can be vectors for scams, misinformation, and exposure to harmful content.
- Guard Your Personal Information: Avoid sharing sensitive personal details like your phone number, home address, or school information in public or semi-public groups.
- Maintain Digital Privacy: For general online activities, privacy-enhancing tools such as tracker and ad blockers can help shield your personal data from advertisers and data brokers. This is good practice for overall digital hygiene.
Ofcom's move against Telegram is a defining moment for online regulation in the UK. It places the difficult and contentious balance between user privacy and the absolute necessity of protecting children squarely in the spotlight. The industry, regulators, and the public will be watching as this test case for the Online Safety Act unfolds.