On February 24, 2025, the eSafety Commission announced a fine of approximately A$1 million ($640,000 USD) against Telegram for its delayed response to queries about measures to prevent the spread of child abuse and violent extremist material. The Commission had issued transparency reporting notices in March 2024 to several platforms, including YouTube, X, Facebook, WhatsApp, Google, Reddit, and Telegram, as part of its mandate to ensure compliance with the Online Safety Act 2021. These notices specifically required information on measures to tackle terrorist and extremist material, with additional queries for Telegram and Reddit regarding child sexual abuse material.

The deadline for responses was May 2024; while most platforms complied on time, Telegram did not submit its response until October 2024, roughly 160 days late. The delay prompted the eSafety Commission to issue an infringement notice, citing obstruction of its functions under the Online Safety Act.


eSafety Commission’s role and powers

The eSafety Commission, established in 2015 as the Office of the Children’s eSafety Commissioner and given an expanded, all-ages remit in 2017, is Australia’s national independent regulator for online safety. Its purpose is to safeguard Australians from online harms, including cyberbullying, non-consensual sharing of intimate images, and child sexual abuse material. The Commission has regulatory powers, such as issuing notices and requiring the removal of harmful content, which were broadened under the Online Safety Act 2021.

According to Reuters, Commissioner Julie Inman Grant emphasized the importance of timely transparency, stating, “Timely transparency is not a voluntary requirement in Australia, and this action reinforces the importance of all companies complying with Australian law.” She noted that the delay hindered the Commission’s ability to implement safety measures effectively, particularly given the serious nature of the content in question.

Telegram’s response and appeal

Telegram, in response, asserted that it had fully addressed all of eSafety’s questions by October 2024, with no outstanding issues. The company described the fine as “unfair and disproportionate,” arguing that it concerns only the response timeframe and not the substance of its answers. Telegram intends to appeal the decision, and has a 28-day window to pay the notice, seek an extension, or request its withdrawal.

“The unfair and disproportionate penalty concerns only the response time frame, and we intend to appeal,” the company said in a statement.

The fine comes against a backdrop of international scrutiny of Telegram, particularly following the arrest of its founder, Pavel Durov, in France in August 2024. Durov, who also co-founded the social network VK, was indicted on twelve charges, including complicity in the distribution of child exploitation material and drug trafficking, and was placed under judicial supervision with bail of about €5 million ($5.56 million USD). The French investigation focused on Telegram’s alleged lack of moderation and cooperation with law enforcement, echoing concerns raised by the eSafety Commission.


Durov, who holds dual French and UAE citizenship, denied the allegations, calling the arrest “misguided” and suggesting that French authorities should have approached the company directly.

The eSafety Commission’s actions against Telegram are part of a broader effort to ensure compliance across social media platforms. The Commission has previously launched legal action against X for failing to provide information on how it tackles online child sexual exploitation material, with proceedings ongoing in the Federal Court. Similarly, in July 2024, the Commission issued periodic notices to eight providers, including Meta and TikTok, focusing on child sexual exploitation and abuse material and requiring reports every six months for two years.

Prevalence of child abuse material in Australia

The urgency of these regulatory actions is underscored by statistics on the prevalence of child sexual abuse material online in Australia. According to the Australian Institute of Criminology, a 2023 study found that 0.8% of surveyed adults (over 13,000 participants) admitted to intentionally viewing child sexual abuse material in the past year, with higher probabilities among certain demographics, such as those aged 18–34 (1.2%) and those living with disabilities (1.5%). Additionally, the Australian Centre to Counter Child Exploitation (ACCCE) reported receiving over 33,000 reports of online child sexual exploitation in 2021.

These figures highlight the scale of the issue, with Australia’s spy agency, ASIO, noting that one in five priority counter-terrorism cases involves youths, often linked to online radicalization. This context justifies the eSafety Commission’s stringent measures and the fine imposed on Telegram.

The fine on Telegram serves as a warning to tech companies operating in Australia about the consequences of non-compliance with regulatory requests. If Telegram chooses to ignore the penalty, the eSafety Commission can seek a civil penalty in court, potentially escalating the legal battle. The action also aligns with global trends such as the European Union’s Digital Services Act, which likewise imposes obligations on platforms to tackle illegal content and recently prompted the removal of over 130,000 apps from Apple’s App Store.

Hillary Keverenge