India's Evolving Rules on Internet Intermediary Liability and Content Regulation: The 2025 Debate
In 2025, the Indian government’s move to tighten control over online content and redefine intermediary liability has ignited legal and policy debates across the tech and civil liberties sectors. At the heart of the issue is the government’s evolving interpretation of Section 79 of the Information Technology Act, 2000, and the growing role of the Sahyog Portal in issuing takedown requests.
The New Regulatory Push
The central government has defended its decision to reduce safe harbour protections for internet intermediaries—entities like social media platforms, messaging apps, and content-sharing websites. According to the new framework, intermediaries must remove flagged content upon notice from the government, primarily through the Sahyog Portal managed by the Indian Cyber Crime Coordination Centre (I4C). Failure to comply could strip these platforms of their legal immunity under Section 79.
This shift in policy is currently under judicial review. The Karnataka High Court is hearing a challenge from the social media platform X (formerly Twitter), which contends that the government is overstepping its authority under Section 79 and bypassing the due process prescribed under Section 69A, which explicitly governs content blocking in the interest of India's sovereignty, security, and public order.
Algorithmic Curation vs Traditional Editorial Judgment
A central argument from the government focuses on the difference between algorithmic content curation and editorial control in legacy media. Unlike traditional newspapers or broadcasters where editors make conscious, accountable decisions about what content is published or aired, online platforms rely on algorithms to prioritize, recommend, or suppress content at massive scale.
The government contends that this algorithmic amplification—especially of divisive or harmful content—lacks transparency, oversight, and editorial accountability. Therefore, the logic goes, digital platforms cannot be treated as neutral intermediaries and must be held to a higher standard of regulation than traditional media.
Legal Framework: Section 79 vs Section 69A
Under Section 79 of the IT Act, intermediaries are offered conditional immunity from liability for third-party content. However, this immunity is voided if they fail to “act expeditiously” upon receiving lawful orders for content removal.
X argues that such orders must fall strictly under Section 69A, which lays out a formal, narrowly defined process for blocking content that threatens India’s sovereignty, public order, or national security. Section 69A also includes due process provisions such as recorded reasons and review mechanisms, which X believes are being circumvented.
The government, however, insists that Section 79 serves a broader compliance purpose—covering content that may not fit Section 69A’s narrow categories but still requires regulatory action, such as misinformation, hate speech, or deepfakes.
The Anonymity Challenge
One of the government’s recurring concerns is the anonymity afforded by online platforms, which contrasts sharply with the accountability norms of traditional media. On social platforms, users can operate under pseudonyms or fake identities, making it difficult to trace harmful actors.
This, combined with algorithmic curation, leads to what the government calls “uncontrolled virality”—where extreme or false narratives can rapidly gain traction without clear accountability. As a result, the government argues for stricter regulatory oversight, not only of content but also of the platforms’ systemic mechanisms that facilitate such spread.
Platform Compliance and the Sahyog Portal
As of March 2025, 38 major internet intermediaries—including Google, Amazon, Microsoft, Apple, Telegram, and LinkedIn—have fully integrated with the Sahyog Portal. Meta has enabled API-based connectivity for Facebook, Instagram, and WhatsApp.
However, platform X has refused to comply, asserting that the government lacks the authority under Section 79 to demand content removal without a proper legal process. This dispute has added urgency to the legal scrutiny surrounding intermediary obligations and the boundaries of state power in the digital domain.
Balancing Free Speech with Public Order
The government maintains that its regulatory actions are not aimed at curbing free speech, but at balancing freedom of expression with the public interest. Content that promotes violence, communal tension, misinformation, or threats to national security, the government argues, must be restrained in a democratic society.
In its view, protecting the right of the public to receive truthful and safe information is as critical as protecting the right to speak. Thus, content regulation is framed as a collective safety issue, not just an individual rights matter.
The Road Ahead
India’s approach to intermediary liability in 2025 signals a paradigm shift in digital governance. The outcome of ongoing legal proceedings—especially X’s challenge in the Karnataka High Court—will likely set precedents for how far governments can go in directing online content takedowns without formal censorship mechanisms.
At stake is the future of platform accountability, user rights, algorithmic transparency, and democratic regulation in the digital age. As India attempts to strike a balance between innovation, freedom, and societal stability, the choices made today could define the structure of internet regulation for years to come.