ANALYSIS OF THE INFORMATION TECHNOLOGY (INTERMEDIARY GUIDELINES AND DIGITAL MEDIA ETHICS CODE) RULES, 2021

1. Introduction

On 25.02.2021, the Ministry of Electronics and Information Technology (hereinafter “MEITY”) and the Ministry of Information and Broadcasting (hereinafter “MIB”) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

With respect to over-the-top (hereinafter “OTT”) platforms and curated content, the Government, in its press release, expressed the view that there have been widespread concerns about issues relating to digital content, both on digital media and on OTT platforms. A number of Public Interest Litigations as well as other cases against OTT platforms were also filed in various High Courts and before the Supreme Court of India, calling for regulation of content that was being aired through OTT platforms.

In this blog post, we trace the history of these rules, their consultation process, the ongoing litigation, and the possible challenges posed by the notification of these rules, including challenges to free speech and privacy.

 

2. Consultation 

The draft Information Technology (Intermediary Guidelines) Rules, 2018 were published in December 2018 for public consultation by MEITY, to review the existing Information Technology (Intermediaries Guidelines) Rules, 2011. (You can read our comments and counter-comments here.) In response to the 2018 consultation, MEITY received 171 comments and 80 counter-comments.

However, since then, MEITY and the MIB decided to notify the OTT regulation and the Code of Ethics for Digital Media along with the Intermediary Guidelines, 2021 without any consultation with the relevant stakeholders. This is in contravention of the Government of India’s Pre-Legislative Consultation Policy, which prescribes that the department/ministry concerned must place the draft legislation in the public domain, with its key provisions explained in simple language, for a minimum period of 30 days.

 

3. History of OTT Regulation and the Intermediary Guidelines Rules

In April 2018, the Ministry of Information and Broadcasting passed an Order forming a 10-member committee to “frame and suggest a regulatory framework for online media/news portals including digital broadcasting and entertainment/ infotainment sites & news/ media aggregators”. The committee was subsequently disbanded in July 2018 and the task was handed over to a panel overseen by MEITY. Drawing a distinction between two separate categories of content, in December 2019, Atul Kumar Tiwari, the Additional Secretary, Ministry of Information and Broadcasting, reportedly said: “OTT (over-the-top video streaming industry) is a strange animal but our self-regulation model will only apply to curated content on streaming platforms (like Netflix, Amazon, Hotstar and others) while user-generated content and social media would be the reserve of the IT Ministry”. Subsequently, on 09.11.2020, the President of India issued a notification under Article 77(3) of the Constitution, amending the Government of India (Allocation of Business) Rules, 1961, which granted the MIB the power to regulate online news platforms and OTT platforms.

 

4. The Supreme Court Litigation 

In 2018, a petition was filed in the Madras High Court by Janani Krishnamurthy and Antony Clement Rubin, seeking that Aadhaar be linked with the social media accounts of users, the rationale being that users could then be identified in cases of illegal activity, trolling, etc. The Madras High Court, following the Puttaswamy-II judgment of 2018, decided against linking Aadhaar with social media platforms. However, the issue of tracing the originator of a message on WhatsApp, Signal or any other application that is end-to-end encrypted by default remained.

During the course of the proceedings, an affidavit was filed by Professor V. Kamakoti of IIT Madras, a member of the Prime Minister’s scientific advisory committee, on whether tracing is possible on WhatsApp, which implements end-to-end encryption based on the Signal Protocol. The affidavit filed by Prof. Kamakoti in the Madras High Court stated that traceability of the originator is possible by attaching originator information to each message, which would be revealed upon decryption.
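To make the idea concrete, here is a minimal, purely illustrative Python sketch of how originator information could travel inside the encrypted payload and surface only on decryption. The names and structure here are our own hypothetical ones; this is not the Signal Protocol or WhatsApp's actual message format.

```python
# Purely illustrative sketch: originator information is bundled with the
# message body *inside* the ciphertext, so it becomes visible only after
# decryption, as the affidavit suggested. Hypothetical names throughout;
# not the Signal Protocol or any platform's real design.
import json
from nacl.public import PrivateKey, SealedBox  # pip install pynacl

def seal(body: str, originator_id: str, recipient_public_key) -> bytes:
    # Bundle the originator ID with the message body before encryption.
    payload = json.dumps({"originator": originator_id, "body": body})
    return SealedBox(recipient_public_key).encrypt(payload.encode())

def open_message(ciphertext: bytes, recipient_key: PrivateKey) -> dict:
    # Decryption reveals the embedded originator field.
    return json.loads(SealedBox(recipient_key).decrypt(ciphertext))

recipient = PrivateKey.generate()
ct = seal("hello", originator_id="user-123",
          recipient_public_key=recipient.public_key)
print(open_message(ct, recipient))
# -> {'originator': 'user-123', 'body': 'hello'}
```

Even in this toy form, the difficulties with the proposal are visible: on every forward, the client would have to copy the original originator field into the new envelope, and nothing stops a modified client from spoofing or stripping that field.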

Subsequently, other technology giants like Facebook and YouTube were impleaded as parties in the case. Facebook later approached the Supreme Court for a transfer of the matters, as similar cases were sub judice before other High Courts, such as the Bombay High Court and the Madhya Pradesh High Court. The main argument made by Facebook/WhatsApp was that WhatsApp uses end-to-end encryption, which makes it impossible to track the originator of a message, and that even WhatsApp does not hold the decryption keys and therefore does not have access to the messages themselves.

The Supreme Court is currently adjudicating on the issue of how intermediaries can trace the originator of a message shared on their respective platforms.

 

5. The Information Technology (Intermediary Guidelines) Rules, 2021 and the Code of Ethics and Procedure and Safeguards in Relation to Digital/Online Media

The Information Technology (Intermediary Guidelines) Rules, 2021 and the Code of Ethics and Procedure and Safeguards in Relation to Digital/Online Media (hereinafter “Rules”) have been notified by the Central Government under Section 87 of the Information Technology Act, 2000. The Rules are in supersession of the Information Technology (Intermediaries Guidelines) Rules, 2011.

 

The Rules classify intermediaries and other entities into separate categories and regulate each separately. The categorization of the various entities, along with their corresponding definitions, is given below:

 

1. News Aggregator: “an entity who, performing a significant role in determining the news and current affairs content being made available, makes available to users a computer resource that enable such users to access the news and current affairs content which is aggregated, curated and presented by such entity.” [Rule 2(o)]

2. Publisher of news and current affairs content: “an online paper, news portal, news aggregator, news agency and such other entity called by whatever name, which is functionally similar to publishers of news and current affairs content but shall not include newspapers, replica e-papers of the newspaper and any individual or user who is not transmitting content in the course of systematic business, professional or commercial activity;” [Rule 2(t)]

3. Publisher of online curated content: “a publisher who, performing a significant role in determining the online curated content being made available, makes available to users a computer resource that enables such users to access online curated content over the internet or computer networks, and such other entity called by whatever name, which is functionally similar to publishers of online curated content but does not include any individual or user who is not transmitting online curated content in the course of systematic business, professional or commercial activity;” [Rule 2(u)]

4. Significant social media intermediary: “a social media intermediary having number of registered users in India above such threshold as notified by the Central Government.” [Rule 2(v)]

5. Social media intermediary: “an intermediary which primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services.” [Rule 2(w)]

 

II. DUE DILIGENCE REQUIREMENTS BY INTERMEDIARIES

Following is a list of major due diligence requirements that have to be observed by an intermediary, a social media intermediary and a significant social media intermediary:

a. Content take down: Rule 3(1)(d)

An intermediary, upon receiving ‘actual knowledge’ through a court order or upon being notified by a government agency, is bound to remove information prohibited by law in relation to the interest of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, public order, decency or morality, contempt of court, defamation, incitement to an offence, or information which violates any law in force. Such information has to be removed within thirty-six hours of the intermediary receiving actual knowledge.

While the Rules have expanded this window from twenty-four hours in the draft rules to thirty-six hours, it should be noted that a strict thirty-six hour deadline for content removal pursuant to government orders will not only take away an intermediary’s right to fair and reasonable recourse when it disagrees with an order issued by a Government authority, but will also impact free speech online.

 

b. Grievance redressal mechanism: Rule 3(2)(a)

While the draft rules also required publication of the details of the Grievance Officer, his/her contact details and the grievance redressal mechanism, the new rules expand the scope of the process by requiring the grievance officer to acknowledge a complaint within twenty-four hours, and set a limit of fifteen days for disposing of the complaint.

 

c. Prevention of proliferation of Non-Consensual Intimate Images (NCII): Rule 3(2)(b)

Within the grievance redressal mechanism, there is a specific and separate requirement in respect of the non-consensual transmission of content which exposes the private area of any person, shows a person in full or partial nudity, depicts such person in any sexual act or conduct, or contains an impersonation, including artificially morphed images. There is an additional qualification that such content must have been transmitted with the intent to harass, intimidate, threaten or abuse an individual. The intermediary has to remove such material within twenty-four hours of receiving a complaint, and must also put in place a mechanism for receiving complaints in this regard, through which the individual can provide the relevant details or communication link.

This much-needed emphasis on a contentious and problematic category of online content is a welcome move, as it addresses the problem of publication of nude or sexual content without consent. Although the requirement is of twenty-four hours, in all such cases intermediaries should operate on a best-efforts basis.

 

III. DUE DILIGENCE REQUIREMENTS FOR SIGNIFICANT SOCIAL MEDIA INTERMEDIARIES

d. Appointment of personnel:

  • Chief Compliance Officer, Rule 4(a): A significant social media intermediary has to appoint a Chief Compliance Officer, who shall be responsible for ensuring compliance with the IT Act and the rules made thereunder. The Chief Compliance Officer will be liable in any proceedings relating to third-party information where he/she fails to ensure that due diligence was followed by the intermediary; however, no liability will be imposed without the significant social media intermediary being granted an opportunity to be heard. The Chief Compliance Officer has to be a ‘key managerial personnel’ or a senior employee of the company, and has to be resident in India.
  • Appointment of a Nodal Contact Person, Rule 4(b): A significant social media intermediary is also required to appoint a nodal contact person, who shall be responsible for round-the-clock coordination with law enforcement authorities and shall ensure that the orders and requisitions sent to the company are complied with. The nodal contact person has to be an employee other than the Chief Compliance Officer, and has to be a resident of India.
  • Appointment of Resident Grievance Officer, Rule 4(c): The said officer shall be responsible for ensuring that the intermediary follows the due diligence requirements relating to the grievance redressal mechanism. The Resident Grievance Officer must be a resident of India.

This mandatory requirement can create hurdles for online messaging platforms like Signal or Telegram that do not have offices in India. Signal is run by a non-profit entity, and it may not be possible for such an entity to establish offices or have dedicated personnel in the country. Similarly, for the numerous other intermediaries that operate at a limited scale, this requirement will create an additional financial and operational burden.

 

e. Compliance Reports: Rule 4(1)(d)

Significant social media intermediaries must publish compliance reports every month, containing details of the complaints received and the action taken on them, as well as the number of links or pieces of information removed through proactive monitoring using automated tools.

Mandating the publication of compliance reports is a positive step. Requiring significant social media intermediaries to publish the details of the process through which content was removed facilitates more transparency in the content moderation practices of intermediaries. The availability of more data in the public domain can also lead to better coordination among intermediaries, and can further lead to big technology companies making joint efforts to build better industry practices and tools. The requirement of monthly publication, however, should be changed to once every six months, to ease the burden of compliance on intermediaries.

 

f. Traceability Requirement or the Originator Requirement: Rule 4(2)

Rule 4(2) of the Rules is an amended version of a provision in the previous draft Intermediary Guidelines published in December 2018. It requires a significant social media intermediary providing services primarily in the nature of messaging to identify the “first originator” of information on its computer resource. To compel disclosure of such information, the Rules require an order issued under Section 69 of the IT Act or a judicial order. The provision is problematic on several counts.

Firstly, the Rule treats a judicial order and an order under S. 69 of the IT Act as alternatives. Section 69 does not offer adequate procedural safeguards, and orders under S. 69 are not available in the public domain: RTI requests filed by SFLC.IN seeking the number of decryption orders passed by the government each year have been denied in the past, citing S. 8 of the RTI Act, 2005. Law enforcement agencies can thus easily bypass the judicial process by relying on decryption requests made under S. 69 of the IT Act, 2000, undermining accountability and transparency.

Secondly, the provision states that where the first originator of any information on the computer resource of an intermediary is located outside the territory of India, the first originator of that information within the territory of India shall be deemed to be the first originator. For this to work, the intermediary would need access to the metadata of the entire chain of the conversation, so messaging applications would have to be re-engineered to capture such metadata. This will undermine end-to-end encryption and severely impinge on the security and privacy of communications: to comply with the traceability provision, there is a high likelihood that significant social media intermediaries will have to break end-to-end encryption and access the contents of messages, compromising the privacy of communication and considerably weakening the security of end-to-end encrypted platforms. Such stored originator metadata would also dent the privacy-by-design principle, since the repository itself would become a valuable target for malicious third parties.
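As an illustration of the re-engineering involved, the following hypothetical sketch shows a server-side registry that records the first sender of each piece of content, keyed on a content hash. None of this reflects any platform's actual design; it is only meant to show the kind of metadata a platform would have to start collecting.

```python
# Hypothetical sketch of server-side "first originator" tracking, keyed
# on a hash of the message content. This is NOT how WhatsApp or Signal
# work today; it illustrates the metadata a platform would have to begin
# collecting to answer a traceability demand under Rule 4(2).
import hashlib
from datetime import datetime, timezone

class OriginatorRegistry:
    def __init__(self) -> None:
        # content hash -> (user_id, first-seen timestamp)
        self._first_seen: dict[str, tuple[str, datetime]] = {}

    def record(self, user_id: str, message: bytes) -> None:
        digest = hashlib.sha256(message).hexdigest()
        # Only the first sender of a given content hash is retained.
        self._first_seen.setdefault(
            digest, (user_id, datetime.now(timezone.utc)))

    def first_originator(self, message: bytes):
        return self._first_seen.get(hashlib.sha256(message).hexdigest())

registry = OriginatorRegistry()
registry.record("user-a", b"forwarded rumour")
registry.record("user-b", b"forwarded rumour")  # a forward, not the origin
print(registry.first_originator(b"forwarded rumour"))  # ('user-a', ...)
```

The catch is visible in the sketch itself: record() needs the message, or at least a stable hash of its content, in a form the server can compute, which is precisely the information an end-to-end encrypted service is designed never to see; and applying the deeming provision for foreign originators would additionally require retaining the forwarding chain.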

In addition, proving who the originator of a piece of information is would still pose a major challenge in court. For instance, someone who takes a screenshot of a tweet and shares it with a friend may not be the actual originator of that information, and attributing mens rea to such ‘originators’ would be difficult.

In a nutshell, this provision will undermine privacy and the right to free speech, and it will severely impact the sanctity of end-to-end encrypted communications.

g. Automated Filtering: Rule 4(4)

Rule 4(4) requires that a significant social media intermediary “shall endeavour to deploy” technology-based measures, such as automated tools or other mechanisms, for proactively identifying information that falls into the following categories:

  • information that depicts any act or simulation in any form depicting rape, child sexual abuse or conduct, whether explicit or implicit;
  • information which is exactly identical in content to information that has previously been removed or to which access has been disabled.

The intermediary also has to display a notice to any user attempting to access such information, stating that it has been identified by the intermediary as falling under one of these categories.

While the wording of the Rule does not explicitly mandate the use of automated filters, it does direct companies to make an effort to put such tools in place. The intent seems to be to direct those companies which have the capacity or capability to deploy such tools and technologies.
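For the second category, "exactly identical" content, the most literal reading is cryptographic hash matching against previously removed material. The sketch below, with hypothetical names and no relation to any platform's real system, shows both the mechanism and its brittleness:

```python
# Minimal sketch of "exactly identical" content filtering: a blocklist
# of SHA-256 hashes of previously removed material is consulted at
# upload time. Hypothetical illustration only, not any platform's
# actual moderation pipeline.
import hashlib

removed_hashes: set[str] = set()

def mark_removed(content: bytes) -> None:
    removed_hashes.add(hashlib.sha256(content).hexdigest())

def check_upload(content: bytes) -> str:
    if hashlib.sha256(content).hexdigest() in removed_hashes:
        # Rule 4(4) also requires displaying a notice to the user that
        # the information was identified as falling in a barred category.
        return "blocked: identical to previously removed information"
    return "allowed"

mark_removed(b"offending image bytes")
print(check_upload(b"offending image bytes"))   # blocked
print(check_upload(b"offending image bytes!"))  # allowed: one extra byte
# defeats exact matching, which is why real systems lean on fuzzier
# perceptual techniques -- bringing the context problems discussed below.
```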

It is interesting to note that the provision is limited to rape and child sexual abuse material and does not cover sexual assault. It is also unclear how AI tools would respond to films which depict rape or sexual assault.

It is to be noted that automated tools are not always efficient in identifying images or videos with graphic content. For instance, Facebook once removed the iconic ‘Napalm Girl’ photograph from its platform after its automated tools flagged the photograph for containing nudity. Automated takedown tools often struggle to understand the context in which certain content is shared, and the technologies used for automated filtering are still at a nascent stage of development; AI tools used for filtering content operate with limited efficiency in regional languages and languages which are not widely spoken. There is also a possibility that proactive takedown of content may lead to large-scale surveillance by private companies and increased automated censorship, as platforms would err on the side of caution. This will also have an impact on free speech, which forms the spine of a democratic nation.

 

h. Grievance Redressal Mechanism: Rule 4(6) and 4(8)

Every significant social media intermediary must implement a mechanism to receive complaints and grievances relating to its failure to comply with the due diligence requirements given under the Rules. Each complaint is to be allotted a unique ticket number, which the complainant can use to track the status of the complaint. Intermediaries are also bound to give an explanation for the action taken, or not taken, in respect of a complaint. The aggrieved user must also be granted an opportunity to dispute the removal of content and request reinstatement of access to such information, and the decision on such a dispute must be taken within a “reasonable” time.

This is a welcome provision and adds greater transparency to the takedown process. There have been reports of social media companies incorrectly or inadvertently removing content and thereby impinging upon the right to freedom of speech and expression. There have also been reports of social media intermediaries bending to the will of the political establishments of countries and complying with requests for the removal of content critical of the regime. This rule will at the very least provide an opportunity to the user and compel the social media intermediary to give an explanation for every decision it takes, although this is a mammoth task considering the sheer volume of decisions taken by social media intermediaries on a daily basis.

i. Verification of Users: Rule 4(7)

A significant social media intermediary must enable users who have either registered for its services from India or are using its services in India to “voluntarily” verify their accounts, through “any appropriate mechanism”, including the user’s active Indian mobile number. All verified accounts must display a mark of verification visible to every user.

A number of social media intermediaries already have such processes in place. Since this is a voluntary requirement and does not compel a user to verify his or her credentials, it does not impact the digital rights of users.

j. Obligation in relation to news and current affairs content

An intermediary has to display a clear notice on its website that all publishers of news and current affairs content must furnish the details of their user accounts to the Ministry.

k. Non-applicability of Safe Harbour: Rule 7

If an intermediary fails to observe the rules, it shall not be entitled to the safe harbour protection under Section 79 of the Information Technology Act, 2000. 

 

IV. DIGITAL MEDIA

Part III of the Rules classifies its subjects into two categories, namely (i) publishers of news and current affairs content; and (ii) publishers of online curated content. This part will be administered by the MIB.

l. Grievance Redressal Mechanism: Rule 10

A publisher has to acknowledge a grievance to the complainant within twenty-four hours, and must address the grievance and convey its decision to the complainant within fifteen days of the registration of the grievance. If the grievance remains unaddressed after fifteen days, it is escalated to the self-regulating body at Level 2 of the three-tier grievance redressal mechanism.

The regulation of online curated content or news and current affairs content is structured into three different levels.

  • Level 1: Rule 11 

This level consists of the grievance redressal mechanism constituted by the publisher. Every publisher must appoint a Grievance Redressal Officer based in India, who will be responsible for addressing grievances.

The said officer has to decide every grievance received by him within fifteen days, and the decision has to be communicated to the complainant within the same period. The officer also serves as the point of contact for receiving grievances relating to the Code of Ethics and as the nodal point for interaction with the complainant, the self-regulating body and the MIB.

  • Level 2 - Self-Regulatory Mechanism: Rule 12

Constitution: Level 2 consists of a self-regulatory body, which is expected to be an independent body constituted by an association of publishers. There can be more than one self-regulatory body, and each has to be headed by a retired judge of the Supreme Court or a High Court, or an ‘independent eminent person’ from the field of media, broadcasting, entertainment, child rights, human rights or any other relevant field. The self-regulatory body shall have up to six other members, who shall be experts from the aforementioned fields.

The self-regulatory body must be registered with the MIB. A body constituted before the notification of the Rules must be registered within thirty days of the notification; a body constituted afterwards must be registered within thirty days of the date of its constitution.

The self-regulatory body is supposed to ensure that publishers adhere to the Code of Ethics, provide them guidance on the same, address grievances which remain unresolved at the publisher level, and hear appeals filed by complainants against the decisions of publishers. The body also has the power to issue guidance or advisories to publishers, ‘warning, censuring, admonishing or reprimanding’ them, or requiring an apology, warning card or disclaimer from the publisher. The self-regulatory body can further direct a publisher of online curated content to reclassify the ratings of content, make modifications to age classification and access control measures, and refer content to the Ministry for consideration.

 

  • Level 3 - Oversight Mechanism: Rule 13

The final level comprises an oversight mechanism constituted by the Ministry to ensure adherence to the Code of Ethics by publishers. The Ministry is empowered to designate an officer of the Ministry, not below the rank of Joint Secretary to the Central Government, as the “Authorized Officer”, who shall have the power to initiate the procedure for deletion, blocking or modification of information by the publisher, and for the blocking of information in case of an ‘emergency’.

The Ministry shall also constitute an Inter-Departmental Committee, with representatives from various ministries and domain experts, headed by the Authorized Officer. The committee shall hear and examine complaints and grievances and make recommendations to the MIB accordingly. The available recommendations include ‘warning, censuring, admonishing or reprimanding’ entities, requiring an apology, or requiring them to issue a warning or a disclaimer. If the committee is satisfied that there is a need to block content under Section 69A of the IT Act (power to issue directions for blocking public access to any information through any computer resource), it may recommend that the MIB do so.

Under the oversight mechanism, the Ministry shall refer grievances to the Inter-Departmental Committee and shall issue guidance, advisories and directions to publishers. The committee has the task of hearing complaints relating to grievances escalated from Level 1 or Level 2, as well as those referred to it directly by the MIB.

While attempts at having a single level of self-regulation for OTT content did not materialize, the mechanism that the Rules lay down in its place is marred by serious deficiencies. Giving the MIB the power to decide what content is aired and what content gets blocked, and extending that control even over news content, will have a chilling effect on the right to freedom of speech and expression. An argument often cited to justify the regulation of online content is that the restrictions which apply to televised content and films should also apply to online content. The MIB not only has the power to refer grievances to the Inter-Departmental Committee constituted by it, but also the authority to issue guidance and advisories to publishers, while the committee can recommend that the MIB warn, censure, admonish or reprimand an entity. This sets a dangerous precedent where freedom of speech and expression, the freedom of the press and the right to dissent are all at the mercy of the government or its agencies. In November 2016, the Government of India banned the Indian news channel NDTV India for a day; while the government cited a breach of national security laws, the Editors Guild of India condemned the ban as a violation of the freedom of the media. The present mechanism for the redressal of grievances can throw up more such instances of news content being blocked, thereby threatening free speech.

 

m. Disclosure of information by publishers: Rule 18(1) and (2)

A publisher of news and current affairs content, as well as a publisher of online curated content, operating in India must inform the Ministry of the details of its entity and furnish the information and documents specified, to facilitate communication and coordination.

n. Compliance Reports by publishers: Rule 18(3) 

Every publisher of news and current affairs content, as well as every publisher of online curated content, must publish a compliance report every month, containing details of the grievances received by the publisher and the action taken on them.

 

6. CONCLUSION

Positive

  • Rule 3(2)(b): Mandates an intermediary to remove any non-consensual intimate imagery within a period of twenty-four hours.
  • Rule 4(1)(d): The publication of compliance reports is a positive step, as it ensures transparency in the content moderation practices of social media companies. It also makes data publicly available, which can enhance the efficiency of content moderation tools and technologies. The only demerit in this rule is that the reports have to be published monthly; this requirement should be changed to once in six months.
  • Rule 4(8): Dispute resolution mechanism for content removal. Significant social media intermediaries have to provide an explanation to users in respect of the removal of content. The rule also mandates that the entity give an aggrieved user an opportunity to dispute the action taken by the intermediary, with the decision to be taken within a reasonable period of time. Although this requirement is burdensome for social media companies, it is a much-needed one. There have been many instances where social media companies, inadvertently or otherwise, ended up blocking harmless content. This requirement will bring more transparency and will ensure that private players give due consideration to the rights of users while filtering content.
  • Rule 4(3): Requirement of labelling information to users where content is advertised, marketed, sponsored, owned or exclusively controlled.

Negative

  • Rule 3(1)(d): Content takedown period of thirty-six hours.
  • Rule 4(a): Appointment of personnel. The rule mandates all significant social media intermediaries to appoint a Chief Compliance Officer, a nodal contact person and a Resident Grievance Officer. All three appointees have to be resident in India, which creates additional hardships for multinational corporations and non-profit entities like Signal.
  • Rule 4(2): Traceability requirement. It undermines privacy and the right to free speech, and it will severely impact the sanctity of end-to-end encrypted communications.
  • Rule 4(5): The local office requirement creates operational difficulties for multinational companies and adds a financial burden.
  • Rules 11-14: Grievance redressal mechanism for publishers. The three-tier grievance redressal mechanism gives the MIB the power to issue orders, directions, advisories etc. to publishers. It also places the Inter-Departmental Committee (constituted by the MIB) as the final adjudicatory authority in respect of complaints received at Level 1 and Level 2.

Neutral

  • Rule 4(7): Voluntary verification of user accounts. A significant social media intermediary has to enable users to voluntarily verify their accounts. Since the requirement is voluntary, it does not impact the digital rights of users.
  • Rule 3(2)(a): The grievance redressal mechanism mandates the intermediary to acknowledge a complaint within twenty-four hours and dispose of it within fifteen days. The strict timelines could be a burden for smaller players.
  • Rule 4(4): Automated tools for filtering information. The use of automated tools has not been mandated; only an endeavour in this regard has been required, and only of “significant social media intermediaries”, leaving aside smaller companies that may not be able to afford or implement automated tools.