The rise of social media has enabled the free flow of information, but it has also summoned a spectre of extremism, hate speech and cybercrime that cannot easily be exorcised. Governments worldwide, along with social media platforms, are seeking ways to regulate this digital space. Justice M Nagaprasanna of the Karnataka High Court, while dismissing a writ petition challenging the use of the Sahyog portal for content notification to platforms, opined that the portal serves as a “beacon of cooperation between the citizen and the intermediary”. The Union government chose to utilise the Sahyog portal for content moderation across various social media platforms. The portal enrols social media companies and provides them with takedown notifications from government departments through automated means.
In March, while challenging the constitutionality of Section 79(3)(b) of the Information Technology Act, 2000, in the Karnataka High Court, X Corp (formerly Twitter) referred to the portal as a “censorship portal”. On September 24, the Karnataka High Court rejected X Corp’s claims and hailed the portal as a “public good”.
While it is understandable that the Union government has concerns regarding content regulation, it’s undeniable that the functioning of the portal and the impugned section of the IT Act raise significant questions about free speech, non-arbitrariness and due process. In the Shreya Singhal case, the Supreme Court laid down the law on the removal of content from the Internet, mandating that a written order with reasons be issued and communicated to the party concerned, along with an opportunity to be heard. Section 79(3)(b) is an exception to Section 79 of the IT Act, which is a safe harbour provision designed to protect social media intermediaries from liability for third-party content.
Furthermore, the procedural safeguards that constrain the takedown power under Section 69A — reasoned written orders, an opportunity to be heard and judicial review — do not apply to Section 79(3)(b), which makes the entire exercise of using the portal and serving takedown notices susceptible to arbitrariness. This concern becomes even more pertinent since social media platforms have alleged that government entities, including state governments and even those unauthorised to do so, often issue takedown notices to silence dissent and curb the views of political opponents. This violates the apex court’s Shreya Singhal ruling, which restricts the ‘actual knowledge’ that triggers loss of safe harbour protection strictly to formal court orders or notifications from authorised agencies. This shift places an undue burden on intermediaries and endangers freedom of expression.
Additionally, the expansive and vague scope of ‘unlawful information’, particularly under Rule 3(1)(d) of The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, often read in conjunction with Section 79(3)(b) to issue takedown notices, risks arbitrary removals and promotes over-censorship.
The use of the Sahyog portal for content regulation has one overlooked aspect: the reliance on automated means to serve takedown notices. Automation in content regulation has been critiqued on several grounds — the prevalence of bias in AI deployment, a lack of contextual understanding of a given post that makes the process prone to flagging non-harmful content, and the opacity of the automated system itself.
The Karnataka High Court compared the portal with the US’ Take It Down Act (TIDA), stating that X Corp cannot adopt double standards by choosing to abide by TIDA in the US while refusing to accept the Sahyog portal in India. On closer examination, however, TIDA is a very different regulation from the Sahyog portal.
Even so, TIDA’s explicit immunity for platforms that act in good faith to remove non-consensual intimate imagery points to a path forward for India. Its victim-driven removals and clear timelines stand in contrast to the Sahyog portal’s broad, opaque takedown orders issued by government agencies without the procedural safeguards mandated under Section 69A.
Likewise, the EU’s Digital Services Act (Articles 9, 15, 17, and 20) sets benchmarks for transparency, user notification, appeal mechanisms and out-of-court settlement for dispute resolution, offering users and intermediaries impartial avenues beyond platforms’ internal grievance processes, thereby reinforcing checks and balances. Furthermore, the DSA also focuses on transparent use of automated means for content moderation.
While acknowledging the government’s concerns, the Sahyog portal experience demonstrates the urgent need for a balanced co-regulatory framework that safeguards fundamental rights while addressing harmful content. India should thus focus Sahyog on specific egregious harms, require government orders to be reasoned and subject to appeal, and ensure human oversight for subjective content like hate speech.
(Utkarsh is a law student at RMLNLU; Harshita is a student at National Law University, Jodhpur)
Disclaimer: The views expressed above are the authors' own. They do not necessarily reflect the views of DH.