The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022


The Government of India notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022[1] [hereinafter “Amendment Rules”] on 28th October, 2022, introducing changes to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 [hereinafter “Intermediary Guidelines, 2021”].
On 6th June 2022, the draft rules were put out by the Ministry of Electronics and Information Technology [hereinafter “MeitY”] for public comments from stakeholders. Several recommendations were made to MeitY, including the importance of having independent members on the Grievance Appellate Committee (GAC) to ensure impartiality in adjudicating complaints impacting free speech, and the need for additional GACs.
The Amendment Rules bring about several changes to key aspects of the Intermediary Guidelines, 2021, including new time-limits for content takedown and the much-debated Grievance Appellate Committee. An overview of the amendments, along with their significance, is set out below:


Major Developments:

  1. Increasing linguistic comprehensibility: The mandated requirement that intermediaries prominently inform users of their privacy policy, user agreement, and rules and regulations in English or any language listed in the Eighth Schedule is an encouraging step. This would help ensure wider understandability of the rules by people from different linguistic and socio-cultural backgrounds. An intermediary which caters to a specific section or group of people, for instance, will be able to formulate its policies in the language widely used by its users, allowing for better implementation and compliance.

  2. Changes to the ‘Due Diligence’ clause to be observed by Intermediaries: The Amendment Rules alter the grounds of due diligence that intermediaries must observe. Sub-clauses (i) to (ix) of Rule 3(1)(b) lay down the kinds of content which intermediaries are not to host. Sub-clause (x) of Rule 3(1)(b), which pertained to information that is “patently false and untrue written to harass a person for financial gain or cause injury”, has been removed completely. Another key deletion is carried out under Rule 3(1)(b)(ii), through the removal of the words “defamatory” and “libelous”. The Press Release states that decisions passed by the courts shall determine whether a statement was, in fact, defamatory or libelous. In the same clause, an added proscription relating to ‘promoting enmity’ has been inserted.[2]

    In terms of additions, Rule 3(1)(a) has been amended to the effect that intermediaries shall now also be expected to ‘ensure compliance’ with their policies by the user.[3] This indicates that the onus lies on the online platform to ensure that user-uploaded content does not run counter to its policies. Such additional compliances upon intermediaries can be found in the earlier draft amendments to the Intermediary Guidelines, 2021, floated in June 2022.
    [’s analysis on the 2021 Draft Amendment Rules can be found here].


  3. Time-limits leading to over-reactive content moderation: The Amendment Rules further amend Rule 3(2)(a)(i), which pertains to the adjudication of requests for removal of information on the grounds mentioned in Rule 3(1)(b). Complaints made on these grounds must now be resolved within 72 hours of reporting. The only exemptions provided are for:

  • sub-clause (i) – “Belonging to another person and to which the user does not have a right”;

  • sub-clause (iv) – “Patent, Trademark or Copyright infringements”; and,

  • sub-clause (ix) – “Violating any law for the time being in force”.

All other requests for removal of information under Rule 3(1)(b) shall be resolved by the Grievance Officer within 72 hours – a tight time-frame for adjudicating the factual and contextual elements of a removal request. Given the sheer volume of requests made, robustly adjudicating them in a fair and transparent manner can become extremely cumbersome.[4] This essentially leads to a framework where intermediaries will actively take down content impacting free speech and expression in an attempt to remain compliant with strict time-limits. Such short time-frames burden intermediaries and pave the way for excessive censorship of legitimate speech.


  4. Government given the final say on ‘legitimate’ speech? The seismic shift in issues surrounding content moderation and free speech, it would seem, lies in the introduction of the Grievance Appellate Committee (GAC) under the newly added Rule 3A.

Grave concerns arise regarding the composition of the GAC. The Amendment Rules stipulate that the Committee shall comprise solely of Central Government functionaries. A three-member Committee will adjudicate content moderation complaints against intermediaries, constituted by one Chairperson and two members appointed by the Central Government, who are stated to be ‘independent’ members.

No clarification has been provided as to the qualifications, eligibility, or minimum experience required of such ‘independent’ members. A sine qua non of independence is the existence of appropriate safeguards governing an officer’s appointment to and removal from their post. Going by the Rules, an ‘independent’ member can exist only in imagination when appointments are made exclusively at the discretion of the Central Government. To guarantee the independence of members, subject-matter experts must be selected from various stakeholder groups, such as civil society organisations (CSOs), industry, and government. Proper rules must be formulated to ensure that appointments to and removals from such posts are not carried out arbitrarily. Such measures would also allow for more effective discourse on various aspects of platform governance.
There must also be numerous GACs constituted to handle the (prospectively) enormous volume of complaints that would be filed by users against intermediaries. The changes made to the due diligence clause for intermediaries only pave the way for a greater number of requests for removal of online content.
This is a case of substantive statutory provisions being formulated through subordinate legislation, without any parliamentary debate. Provisions governing the appointment of committees, the qualifications of members, and their selection process should be debated in Parliament and form part of the parent Act.


The Idea of a ‘Transparent and Open’ Internet:

The stated objective behind introducing the Amendment Rules is to ensure an Open, Safe, Trusted and Accountable Internet – a welcome step towards regulating online content. While some provisions can be lauded (for instance, respect for fundamental freedoms and increased linguistic comprehensibility), there exist particular clauses where more clarity and nuance is the need of the hour. A GAC appointed entirely by the Executive will lead to the curbing of legitimate free speech and expression. A longer time-frame for adjudicating content moderation requests would give intermediaries the opportunity to analyse the issues relevant to each removal request, rather than blindly removing content and thus impeding free speech.
Much is left to be desired from the latest Amendment Rules. What the general public can take away, however, is that the Ministry of Electronics and Information Technology has, in certain respects, shown regard for the suggestions made by various stakeholders at the MeitY consultation. It must now fine-tune the provisions of the Intermediary Guidelines, 2021 to allow legitimate free speech to flourish (an “Open Internet”) without sacrificing users’ rights under the guise of state action.



[1] The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2022, available at:
[2] Rule 3(1)(b)(ii) of the Amendment Rules, 2022.
[3] Rule 3(1)(a) of the Amendment Rules, 2022.
[4] A deep dive into content takedown timeframes, The Center for Internet and Society, 30th November, 2019, available at: