NEW SOCIAL MEDIA CODE AND THE BIRTH OF A NEW REGIME OF INTERMEDIARY LIABILITY

On 26 February 2021, the Ministry of Electronics and Information Technology (MeitY) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 for social media platforms, OTT services and digital media. The Code builds on the draft Information Technology (Intermediaries Guidelines) Rules of 2018, strengthening the regulation of social media companies and making them more accountable for content on their platforms. The most debated provisions of the new Code have been the categorisation of media intermediaries and the birth of a 'responsibility-liability' regime for intermediaries.

The proposition of intermediary liability is not new. In the past, a scandal involving the sale of a pornographic clip on Baazee.com (now eBay.in), and the subsequent arrest of the company's CEO, led to the creation of a committee to re-evaluate the Information Technology Act. The committee recommended that intermediaries must exercise 'due diligence' in order to receive immunity, a recommendation the government accepted, and the Information Technology (Intermediary Guidelines) Rules, 2011 were accordingly introduced to create greater accountability on the part of intermediaries. The Supreme Court in Shreya Singhal v. Union of India, and the Delhi High Court in MySpace Inc. v. Super Cassettes Industries Ltd., have since acknowledged the concept of actual and specific knowledge, observing that intermediaries can be held liable if they have actual or specific knowledge, received from content owners, of infringing content on their websites and, despite such notice, fail to take the content down.

In 2018, the intermediary liability rules were reconsidered, a move triggered by the misuse of social media by criminals and anti-national elements. Digital platforms have failed to deal proactively with cases involving the misuse of data and of free speech, largely owing to their enormous size. Harmful content, ranging from fake news to child pornography, persists across various platforms. More recently, Twitter Inc. permanently banned former U.S. President Donald Trump's account, alleging that he had broken its rules against glorifying violence, manipulating media and sharing unsubstantiated information regarding the U.S. elections.

The need for stringent laws regarding the liability of social media platforms in India escalated after the farmers' protest took a violent turn on 26 January 2021. MeitY directed Twitter to take down accounts that used incendiary hashtags during the violence; however, the company did not comply. The Ministry responded that the platform had to adhere to the authority's directions and that non-compliance would lead to criminal charges against the platform.

The Information Technology Act, 2000, along with the Intermediaries Guidelines, 2011, provided a safe harbour for intermediaries in India. However, a need has since been felt for content curation and for holding intermediaries liable for content published on social media platforms. The term "intermediary" is defined under Section 2(w) of the IT Act as "any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, web-hosting service providers, search engines, online payment sites, online auction sites, online market places and cyber cafes."

The role of an intermediary is not to create information but to receive, store and transmit it. The safe harbour protection under the IT Act applies only to "passive intermediaries". The Delhi High Court in Christian Louboutin SAS v. Nakul Bajaj and Ors. held that so long as intermediaries are mere conduits or passive transmitters of records or information, they continue to be intermediaries; they must not become "active participants".

The Information Technology Act, 2000 is the primary legislation dealing with the liability of intermediaries for content generated by third parties. The Act was amended in 2008 to introduce safe harbour under Section 79 and to amend the definition of intermediaries. Thereafter, the Intermediary Guidelines, 2011 were introduced, requiring due diligence in pursuance of the rules stated therein as a condition for claiming safe harbour protection under the IT Act. The IT Act and the Intermediary Guidelines, 2011 were to be read in consonance with each other.

Intermediaries do not create the content available on their platforms; they merely act as a bridge between content creators and consumers. The traditional position of law rested on the notion that intermediaries cannot be held liable and accountable for everything posted by third parties, considering the vast amount of data produced every day. Another point of technical concern was the impossibility of tracking every act that qualifies as harmful or controversial.

The Supreme Court in the Shreya Singhal case also observed that "it would be very difficult for intermediaries like Google, Facebook etc. to act when millions of requests are made and the intermediary is then to judge as to which of such requests are legitimate and which are not." Similarly, in Kent RO Systems Ltd. & Anr. v. Amit Kotak & Ors., the Court observed, "to require an intermediary to do screening would be an unreasonable interference with the rights of the intermediary to carry on its business."

The new Code provides that intermediaries will have to proactively monitor and filter their users' content through appropriate mechanisms, and be able to trace the originator of questionable content, in order to avoid liability for users' actions. It also distinguishes between a significant and a regular social media intermediary according to their user traffic. Social media companies will have to establish a proper grievance redressal mechanism to deal with user complaints. Further, the platforms will have to disclose the first originator of a mischievous tweet or message, as the case may be. Additionally, intermediaries are required to provide assistance, within 72 hours, to government agencies that state in writing or by electronic means the purpose for which the information is sought. Upon notification, an intermediary will have 36 hours to remove or disable access to unlawful content.

Section 79(1) of the IT Act grants safe harbour protection to intermediaries for any kind of third-party content. This section grants immunity to the intermediary irrespective of the content, under the due diligence doctrine. Section 79(2) provides that the immunity is afforded to an intermediary that has neither knowledge of nor control over the information transmitted or stored. Intermediaries are under a mandate to remove content under a 'notice and takedown' regime per Section 79(3), which requires an intermediary, upon receiving "actual knowledge", to remove information that does not adequately satisfy the test of lawfulness. However, under Rule 3(4) of the Intermediary Guidelines, 2011, the intermediary can be made directly liable for its failure to remove unlawful content that was being stored, and perhaps transferred, through its platform.

In Europe, legal discourse addressing misinformation and disinformation began with the 2017 EU resolution on "Online Platforms and the Digital Single Market". This led to the formation of a High-Level Expert Group to "advise on policy initiatives to counter fake news and the spread of disinformation online", which in turn facilitated the "Action Plan Against Disinformation". Eventually, in 2018, the Code of Practice on Disinformation, a voluntary self-regulatory commitment whose "signatories" include multiple high-profile technology companies, was developed. While these codes provide helpful principles and guidelines, they are self-regulatory and voluntary rather than legally binding measures.

The rules of the new Code make it mandatory for a 'significant social media intermediary' that provides services primarily in the nature of messaging (such as WhatsApp) to enable the identification of the "first originator" of the information. This is a move aimed at tracking down people who circulate fake news or carry out illegal activities; however, it will require companies to break the end-to-end encryption provided to users. Such a requirement can affect the user experience in India by exposing users to cybersecurity threats and cybercrimes. Identification of the "first originator" also raises serious questions about the right to privacy.

The right to privacy is founded on the autonomy of the individual. The Apex Court in the K.S. Puttaswamy case held that the right to privacy is a fundamental right. Moreover, a major concern in India is the lack of a data protection law; with such legislation still in the pipeline, there is no statutory mechanism to protect personal data. In such a scenario, with the protection of personal data being intrinsic to the right under Article 21, safeguarding the right in question becomes a vital duty in the absence of proper legislation to that effect.

It is also pertinent to note that Section 69(1) of the IT Act already empowers the Central and State governments to intercept, monitor or decrypt any information through a computer resource, on grounds analogous to those provided under Article 19(2) of the Indian Constitution. Further, Section 69A allows the Centre to issue directions for blocking public access to any information through any computer resource.

Union Minister Ravi Shankar Prasad, at the press conference introducing the new Code, reiterated that the right to the internet is not a fundamental right in itself, but a fundamental mechanism for realising other fundamental rights enshrined under Part III of the Constitution. This framing enables the Government to take substantial measures to regulate that mechanism proportionately and legitimately. While concerns have been raised over issues such as breaking end-to-end encryption, the privacy of users and the lack of stakeholder consultation, these issues are expected to be resolved as the policy moves into the implementation stage. Considering the socio-economic impact of digital technologies, with a specific legislative mechanism in place, the new legal framework should ensure the enforcement of the rule of law and lead to a balanced digital ecosystem.
