Role and liability of internet intermediaries: An analysis

by Dishi Mishra and Devansh Mishra - November 19, 2021, 6:12 am

Social media platforms are like a friendly postman in today’s world. The internet has revolutionised how we interact, but it has also introduced a slew of new problems, including hate speech, terrorist recruitment, fake news, illegal campaigning, and identity theft. Many of these problems concern the internet’s new gatekeepers. One of the important variables affecting innovation and free expression is whether, and when, communications platforms like Google, Twitter, and Facebook are accountable for their users’ online behaviour. This article examines the concept of intermediary liability for breaches of rights, whether private or public, through different cases and the regulations applicable to such infringements.

WHAT INTERMEDIARY MEANS

To understand the statements above, we first need to answer this question, and then ask what role the intermediary plays in the stated context. An intermediary is a person who acts as a link between two parties to help them communicate. It serves as a go-between, conveying a message or proposition between two organisations or individuals and helping them reach an agreement by acting as a conduit for communication. For example, if A writes a letter to B and sends it through a postman, the postman works as an intermediary between A and B. Take another example, of a shop owner: when A purchases an item from the shop, and the shop owner has bought the same good from a manufacturer, the shop owner works as an intermediary, or link, between the manufacturer and A. We can find many more such examples of intermediaries in our daily lives.

In this age of the internet, social media platforms are not only used for entertainment; they also act as intermediaries between their users. The word “intermediary” is often mentioned in discussions about social media platforms nowadays. Section 2(w) of the Information Technology Act, 2000 (IT Act, 2000) defines intermediaries as: “intermediary, with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes Telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes.”

WHAT DO WE MEAN BY LIABILITY OF INTERNET INTERMEDIARIES?

With great power comes great responsibility, and social media platforms have a lot of power, so it stands to reason that they have a responsibility to monitor the content that falls under their purview. Debate about the role of internet intermediaries began when internet firms and private parties first started offering internet services. Initially, the provider was viewed as the digital equivalent of a postal service: it had no knowledge of, or control over, the mail it delivered and thus could not be held liable for any illegal content. In the present-day scenario, however, the debate is whether social media intermediaries should continue to escape such liability.

Liability is the state of being legally responsible for causing damage or injury. “Internet intermediary liability” refers to the legal duty of intermediaries for unlawful or harmful activity carried out by users through their services, that is, their obligation to prevent such behaviour. Failure to do so may result in legal orders directing the intermediary to act, as well as civil or criminal proceedings against the intermediary, depending on the applicable law. In many countries, social media platforms are granted a safe harbour because they cannot reasonably be expected to monitor all the content posted on their platforms.

Under the concept of intermediary liability, governments and individuals can hold technology intermediaries accountable for unlawful or harmful content created by users of those services. Copyright infringements, trademark disputes, cybercrime, defamation, hate speech, child pornography, “illegal content,” offensive but legal content, and privacy protection are just a few of the situations in which intermediary liability may arise.

The liability of internet intermediaries may affect internet users in two ways. On the one hand, the quality and range of products or services available to them may shrink if intermediaries, unwilling to risk liability, hold back from innovation, which may also chill providers’ and users’ freedom of expression. On the other hand, the absence of liability may harm the public: if, for example, a social media platform hosts child pornography or content defaming a person, it may violate that child’s basic human rights and right to life.

EVOLUTION OF RULES

Safe harbour provisions for intermediaries are included in India’s Information Technology Act of 2000 (“IT Act”) and its related rules. Since their introduction, these provisions have evolved dramatically, partly through amendments to the underlying statute and rules, and partly through the courts’ interpretation of them. In this section, we trace that evolution, emphasising the benefits and risks we see in current and previous laws.

The Information Technology Act of 2000 kicked off the journey of the intermediary guidelines. In its original form, the IT Act provided little to no safe harbour protection to internet intermediaries. To begin with, the term “intermediary” was defined as any entity that receives, stores, or transmits any electronic message on behalf of another person, or provides any service linked to such a message.

Even those intermediaries that fit this narrow definition were shielded only from the IT Act’s list of offences, not from any other laws. Under these limited provisions, an internet intermediary had little to no protection: it could be held accountable for material it did not originate but merely provided a venue for publishing or transmitting. The possibility of such liability was a threat to the growth of the e-commerce industry.

As a result, the Information Technology (Amendment) Act of 2008 was enacted, which, among other things, expanded the scope of safe harbour protection for intermediaries. The definition of an “intermediary” was broadened to “any person who receives, stores, or transmits any particular electronic record or offers any service with respect to that record on behalf of another person.” Under this definition, telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online auction sites, online marketplaces, cyber cafes, and interactive websites like Facebook, Twitter, Blogspot, and WordPress are all considered intermediaries.

With the 2008 amendment, India also established a “notice and takedown” mechanism to limit intermediary liability. On April 11, 2011, the Indian government notified the Information Technology (Intermediaries Guidelines) Rules, 2011 (hereafter the “rules”), which establish, among other things, guidelines for how intermediaries must manage takedowns.

The takedown notice concept is rooted in Section 79 of the IT Act, and Section 79(2) lays down the conditions for safe harbour protection. One of these conditions is observing the guidelines prescribed by the central government from time to time. The Information Technology (Intermediary Guidelines) Rules, 2011 (the “Intermediary Guidelines”) are the basic rules that intermediaries must follow in order to enjoy the safe harbour: they set out the due diligence standards an intermediary must meet to be eligible for protected status. They restrict, for example, the kinds of content an intermediary may host on its platform (e.g. obscene content). The later draft rules added a new category of prohibited information: content that endangers “public health or safety”. Intermediaries must also provide assistance to any government agency within 72 hours.
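The compliance logic behind such a deadline-based obligation can be sketched in a few lines. This is a hypothetical illustration, not the rules themselves: the function name, the workflow, and the reuse of the 72-hour assistance window as a takedown window are all assumptions made for the example.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch of a deadline check behind a notice-and-takedown
# regime. The 72-hour window mirrors the assistance timeline mentioned
# above; names and workflow are illustrative, not drawn from the rules.

WINDOW = timedelta(hours=72)

def is_compliant(notice_received: datetime,
                 content_disabled: Optional[datetime],
                 now: Optional[datetime] = None) -> bool:
    """True if the content was disabled (or may still be) within the window."""
    if content_disabled is None:
        # Content still up: compliant only while the window remains open.
        return ((now or datetime.now()) - notice_received) <= WINDOW
    return (content_disabled - notice_received) <= WINDOW

notice = datetime(2021, 11, 1, 9, 0)
print(is_compliant(notice, datetime(2021, 11, 2, 9, 0)))  # disabled after 24h -> True
print(is_compliant(notice, datetime(2021, 11, 5, 9, 0)))  # disabled after 96h -> False
```

The point of the sketch is that once a valid order is received, compliance becomes a simple clock-driven obligation on the intermediary, independent of whether it authored the content.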

INFORMATION TECHNOLOGY (INTERMEDIARY GUIDELINES AND DIGITAL MEDIA ETHICS CODE) RULES, 2021

These rules provide two different sets of obligations: one applicable to all social media intermediaries, and another for those who have more than 50 lakh registered users on their platform.

A handful of the most important regulations from the common set of rules for all intermediaries are listed here. All intermediaries must conduct due diligence, and all must have a grievance redressal procedure in place to address the first-hand complaints of their users. Intermediaries have been tasked with guaranteeing the online safety and dignity of users, particularly women, which has necessitated a voluntary user verification mechanism. Most importantly, an intermediary will be held liable if it fails to remove or disable access to content that is unlawful in the eyes of the law. For example, if a book has been prohibited by a government or court order and an intermediary that hosts it fails to remove it from its platform, the platform will be held liable for its inaction.

According to Rule 4(1), a “significant social media intermediary”, i.e., a social media intermediary with more than 5 million registered users in India, is subject to additional obligations. Significant social media intermediaries must appoint a chief compliance officer, a nodal contact person, and a resident grievance officer, all of whom must be resident in India. The chief compliance officer is responsible for ensuring compliance with the Act and the rules; the nodal contact person is responsible for 24/7 coordination with law enforcement agencies; and the resident grievance officer performs the functions set out in the grievance redressal mechanism. Significant social media intermediaries must also publish a monthly compliance report detailing the complaints received, the action taken on them, and the content removed proactively. Finally, significant intermediaries providing primarily messaging services must enable identification of the first originator of the information, though the intermediary is not required to disclose the contents of any message or any other information of the first originator.
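The Rule 4(1) threshold described above is essentially a classification test on registered user counts. The sketch below is illustrative only; the constant and function names are invented for the example, and the user figure is taken from the statistics quoted later in this article.

```python
# Hypothetical sketch of the Rule 4(1) threshold described above: an
# intermediary with more than 50 lakh (5 million) registered users in
# India is "significant" and must appoint three India-resident officers.
# Names here are illustrative, not taken from the rules.

SIGNIFICANT_THRESHOLD = 5_000_000  # 50 lakh registered users in India

REQUIRED_OFFICERS = (
    "chief compliance officer",
    "nodal contact person",
    "resident grievance officer",
)

def required_officers(registered_users_in_india: int) -> tuple:
    """Officers an intermediary must appoint under this sketch of the rules."""
    if registered_users_in_india > SIGNIFICANT_THRESHOLD:
        return REQUIRED_OFFICERS
    return ()

print(required_officers(530_000_000))  # ~53 crore users -> all three officers
print(required_officers(1_000_000))    # below threshold -> ()
```

On these numbers, every large platform named in this article would clear the threshold comfortably, which is why the significant-intermediary obligations are the ones that matter in practice.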

NEED FOR NEW GUIDELINES

The Indian government issued a notification welcoming social media sites to do business in India, provided they adhere to the country’s constitution and laws. According to the announcement, social media platforms have empowered ordinary users, who occasionally use them to question the government, but the platforms require accountability in the event of misuse. The new IT rules were designed to empower ordinary users by providing a proper redress mechanism and prompt resolution of their complaints.

India is the second largest online market in the world. WhatsApp has over 53 crore users, YouTube over 45 crore, Facebook around 41 crore, Instagram around 21 crore, and Twitter over 1.75 crore. The research suggests that the use of social media has become a movement in India. In a suo motu writ petition (Prajwala v. Union of India), in a judgement dated December 11, 2018, the Supreme Court of India stated that the Government of India may frame the necessary guidelines to remove child pornography, rape, and gang rape images and videos from social media platforms. The IT guidelines were intended to cover these elements of the offences.

The IT rules are intended to curb the spread of fake news, morphed images of women, revenge porn, defamatory content, content hurting the religious sentiments of the general public, the misuse of social media by criminals and anti-national elements, its use in recruiting terrorists, the incitement of violence, the disturbance of public order by content available on the platform, and other cybercrimes.

SHORTCOMINGS OF THE NEW GUIDELINES

Many social media platforms offer end-to-end encryption to their users. This feature ensures that only the sender and the receiver have access to the information sent between them. Under the new IT guidelines, the government has demanded that social media firms provide information about the first originator, which would force many networks to break their end-to-end encryption. As a result, this would violate users’ right to privacy.
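The technical tension here can be shown with a toy example: under end-to-end encryption, the platform only ever relays ciphertext it cannot decrypt, so it has nothing to hand over about message content or its author. This one-time-pad sketch is purely illustrative; real messengers use far more sophisticated protocols (such as the Signal protocol), not this code.

```python
import secrets

# Toy illustration of why end-to-end encryption conflicts with originator
# tracing: the platform relays only ciphertext it cannot decrypt. A
# one-time pad is used for simplicity; real apps use proper protocols.

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XOR-ing each byte with the shared key."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # known only to sender and receiver

ciphertext = xor_bytes(message, key)     # this is all the platform ever sees
plaintext = xor_bytes(ciphertext, key)   # only a key-holder can recover this

print(plaintext == message)  # True: the receiver, holding the key, can read it
```

Since the server never holds the key, identifying the “first originator” of a given piece of content requires either weakening this scheme or attaching extra metadata to every message, which is the crux of the privacy objection.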

Although the government produced a draft on which the general public was encouraged to provide feedback or suggestions before it was ratified, only 171 responses from individuals and organisations were received, owing to a lack of communication or awareness among the public. There were also 80 counter-comments, which were taken into account when the rules were finalised. In a country with a population of over 133 crore people, the government received only 171 responses, which implies two things: on the one hand, the government did not communicate adequately with the public about the draft; on the other, the public showed little interest in making suggestions. The government has also framed a rule for maintaining public order, under which anything it deems to be against the public order will be regulated in accordance with the established rules. Because “public order” is such a broad term, there is a great risk of it being misused.

CONCLUSION

The Indian government has taken a significant step forward by framing the 2021 rules under the Information Technology Act, 2000. The rules have resulted in numerous positive changes and protections for both the general public and intermediaries. For example, the government has attempted to provide a balanced framework that protects the interests of various intermediaries on the internet as well as the rights of the users of the services they provide.

The rules, on the other hand, are still in their infancy. Holding intermediaries liable for user-generated content could lead to a situation in which intermediaries limit the availability of content in order to avoid liability. For example, if a social media platform, to avoid liability, deletes a person’s post merely because it is derogatory, that person’s right to freedom of speech and expression under Article 19(1)(a) may be violated. The new IT rules thus cut both ways, with positive effects on one hand and negative effects on the other. In a country like ours, where people take little interest in knowing their rights, either the general public must come to understand the rules or the government should begin a campaign to educate people about them. Furthermore, vague phrases such as ‘user’, ‘unlawful content’, ‘government order’, and ‘public order’ must be avoided in the regulations; alternatively, the rules must define such terms themselves. The idea of applying different treatment to different types of intermediaries is also worth investigating.
