Context
- Safe Harbour Clause (Section 79, IT Act 2000): Grants intermediary platforms immunity from liability for user-generated content if they comply with government-prescribed due diligence.
- Current Proposal: I&B Ministry plans to revisit this immunity to hold platforms accountable for not removing “fake news.”
- Trigger: Recent incidents such as the Pahalgam terror attack led to the blocking of YouTube channels spreading fake news, including Indian ones.
Relevance: GS 2 (Digital Governance, Social Issues)
Government’s Concerns and Plans
- Curbing Fake News: The government wants intermediaries to proactively remove false content, not just respond to takedown notices.
- Due Diligence Reforms: Proposal to revise guidelines to ensure platforms self-monitor and take preventive action against misinformation.
- Statutory Fact Check Unit: A push to give legal authority to the PIB Fact Check Unit to classify and act on fake content.
Legal and Constitutional Dimensions
- Freedom of Speech: Any law or rule affecting content moderation must be balanced against Article 19(1)(a) (freedom of expression) and comply with Article 19(2) (reasonable restrictions).
- Bombay High Court Judgment: Struck down the powers of the PIB Fact Check Unit, holding that they went beyond constitutional and legal limits.
- Government Response: MeitY plans to file an SLP (Special Leave Petition) in the Supreme Court challenging the Bombay HC ruling.
Challenges in Implementation
- No New Law Yet: The I&B Ministry prefers self-regulation over a statutory framework, citing the need for broader consultation.
- Ambiguity in Enforcement: The lack of a clear legal mandate may lead to selective enforcement or perceived censorship.
- Conflict Between Ministries: Overlapping jurisdictions of MeitY and I&B may lead to regulatory confusion.
Broader Implications
- Accountability vs. Censorship Debate: Moves to restrict safe harbour could raise concerns over government overreach and the curbing of dissent.
- Global Precedents: Similar debates ongoing in the EU (Digital Services Act) and US (Section 230 of the Communications Decency Act).
- Tech Platform Liability: Increasing trend toward making platforms responsible for the content they host, especially concerning misinformation and hate speech.