Modern Platforms, Outdated Laws

Digital intermediary platforms remain largely unregulated in India. With sexually explicit content becoming an increasingly profitable business on such platforms, is the law doing enough to meet these modern challenges?

Introduction

A Public Interest Litigation (PIL) recently filed in the Supreme Court of India has drawn attention to an alarming trend: the rise of obscene content on platforms that operate under the guise of social media intermediaries. According to a 2023 press release by the Ministry of Electronics and Information Technology (MeitY), several social media platforms, such as Telegram and X, have been exploiting their classification under the Information Technology (IT) Act to function as quasi-pornographic websites. Though marketed as social media apps, they primarily distribute sexually explicit content, dominate app store charts, and generate significant revenue.

Legally, these platforms fall under the category of a ‘social media intermediary’ as defined under Section 2(w) of the IT Act, which covers any person who, on behalf of another person, receives, stores, or transmits electronic records or provides any service with respect to such records. This classification places them under the safe harbour provision of Section 79: they are not held liable for content uploaded by users so long as they act as neutral conduits, neither initiating the transmission nor selecting or modifying the information, and observe due diligence. However, many of these platforms are replete with content that violates Indian obscenity laws, and this classification has allowed them to evade responsibility through gaps in the legal framework.

This article examines the shortcomings of the existing domestic legal framework in handling modern challenges posed by these platforms. It argues that the current laws do not account for the covert ways in which new social media platforms function, such as facilitating the sale of pornographic material and promoting obscene content. As such, the 20th-century understanding of intermediary liability is no longer sufficient to address the activities of these platforms. The discussion begins by exploring the legality of such platforms in India, followed by an analysis of the strategies these platforms use to exploit gaps in the law for financial gain. Finally, the article advocates for the adoption of a facilitative threshold for online intermediary liability to address these modern tactics and ensure that Indian legislation is up to date with the current digital landscape.

How Pervasive Are These Platforms?

The 2020 ban on TikTok left a significant gap in India’s video-sharing and communication app market, one quickly filled by platforms promoting sexually explicit content. Platforms such as Bigo Live and Chingari, while benefiting from the legal protection of Section 79, actively encourage the dissemination of obscene content in violation of the country’s obscenity laws.

For example, a recent investigation by the Economic Times revealed that some of the top-grossing apps on Google Play operate on a model where content creators post videos and photos in exchange for fees or ‘gifts’ from their audience. Bigo Live, a video-chatting app rated for users aged 12 and above, has been found to host illicit videos. Similarly, Chingari, an Indian platform, has been criticized for enabling digital prostitution. These platforms often lack proper age restrictions, allowing minors to access explicit material with disturbing ease.

Another example is OnlyFans, which has gained international notoriety for hosting sexually explicit content. In India, it operates in a legal grey area, functioning as a pornographic website while presenting itself as a legitimate platform for content creators. Telegram has likewise been implicated in the dissemination of child sexual abuse material (CSAM) through encrypted channels, and reports indicate it has been used to spread rape videos, with some channels offering incentives for individuals to engage in sexual violence.

These platforms remain unregulated within India’s domestic market and capitalize on the absence of clear liability criteria. They encourage a profitable trade in sexual content under the guise of ‘content creation,’ often bypassing legal scrutiny through loopholes in the current laws.

Determining the Greyness of These Online Intermediaries

Under Indian law, the test for obscenity was laid down in Aveek Sarkar v. State of West Bengal, in which the Supreme Court held that “the question of obscenity must be seen in the context in which the photograph appears and the message it wants to convey.” Applying this community-standards test, content uploaded on platforms like Bigo Live, Chingari, and OnlyFans is inherently pornographic: unlike artistic or cultural expression protected by the freedom of speech, it is produced explicitly for sexual gratification and financial gain, and therefore violates obscenity laws.

Section 292 of the Indian Penal Code (IPC) further defines material as obscene if it “is lascivious or appeals to the prurient interest.” The nature of the content on these platforms, combined with their aggressive marketing tactics such as provocative push notifications and advertisements like “a sexy girl is waiting for you,” clearly fits this definition. These platforms not only host explicit content but actively promote it through advertisements, crossing the line from merely hosting illegal material to actively marketing it.

The Department of Telecommunications (DoT) issued a directive on July 31 reaffirming the Indian government’s stance on adult content and labeling it a violation of decency norms. While that order primarily concerned pornographic material, it would logically extend to these intermediary platforms, which serve as substitutes for traditional pornographic websites. The government’s position suggests that the operation of such platforms in India constitutes a violation of obscenity laws.

The crux of the problem lies in the tactics these platforms employ to avoid scrutiny while normalizing harmful behaviors. These platforms often conceal explicit content behind vague advertising or direct users to private paywalls, making it difficult to establish clear-cut illegality.

Evasive Advertising Strategies

One of the most insidious aspects of these platforms is their use of evasive advertising strategies. They often exploit other, relatively benign social media platforms to promote their services. Influencers with large followings act as informal marketers for these platforms, violating government guidelines on influencer advertising. A common strategy, known as “breadcrumbing,” uses legally compliant platforms to direct users to adult content sites such as OnlyFans, Bigo Live, and Chingari. This tactic allows these platforms to bypass direct regulatory oversight while expanding their user bases through seemingly legitimate channels.

Reports have shown that platforms owned by Meta play a role in surfacing child sexual abuse material (CSAM) and other sexual content, which in turn funnels users toward illicit platforms like Telegram and OnlyFans. Stanford University research has revealed that Instagram’s recommendation algorithms connected paedophilic accounts to accounts selling underage sexual content, further enabling the growth of these illegal markets.

Additionally, these platforms’ advertising practices often blur the line between legality and illegality. They violate the Advertising Standards Council of India (ASCI) Code, which prohibits indecent or vulgar content in advertisements: Chapter II of the Code requires that advertisements contain nothing that offends prevailing standards of decency, and guideline 7 prohibits advertisements featuring the sexual objectification of persons. Yet these platforms routinely run advertisements promoting adult content alongside corporate advertisements, flagrantly violating these guidelines.

After installation, these video-chat apps engage users with sexually explicit notifications such as “A sexy girl is waiting for you, go match!” and bombard them with explicit imagery. Such notifications, coupled with the obscene content the apps provide, clearly satisfy the thresholds for obscenity yet evade regulatory scrutiny, because this advertising is visible only to users who have already consumed explicit content, making it harder to identify and regulate.

Facilitative Thresholds

The current classification of online intermediaries under Section 2(w) of the IT Act presumes them to be neutral facilitators, overlooking the substantial control they exercise in publishing and promoting pornographic content. This classification has proven inadequate in addressing the active role these platforms play in the promotion of illegal content. As a result, they should no longer be considered passive third parties but rather held accountable for their involvement in facilitating such illegal activities.

While digital activities on these platforms would typically fall under the purview of Sections 67, 67A, and 67B of the IT Act, determining intermediary liability is increasingly difficult due to the vast amount of data involved and the inconspicuous nature of these activities. The broad definition of social media intermediaries and the legal protection they enjoy under Section 79 of the IT Act create a regulatory gap, allowing platforms like OnlyFans to claim ignorance or passivity in the face of illegal content.

One potential solution is to adopt a facilitative liability threshold modeled on the EU e-Commerce Directive, holding intermediaries accountable according to their level of activity and involvement in the promotion of illegal content. The Directive exempts a hosting provider from liability on two conditions: (1) the provider must not have actual knowledge of the illegal activity, and (2) upon obtaining such knowledge, it must act expeditiously to remove or disable access to the content. This “actual knowledge” principle allows a more proactive approach to regulating online intermediaries, ensuring that they are held accountable for their role in promoting illegal content.
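To make this two-condition test concrete, its decision logic can be sketched in a few lines of Python. This is a minimal illustrative model, not a statement of law: the TakedownNotice record, the safe_harbour_applies function, and the 36-hour stand-in for “expeditiously” are all assumptions, since the Directive itself fixes no numeric deadline.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    # Hypothetical record of a takedown notice served on an intermediary.
    @dataclass
    class TakedownNotice:
        received_at: datetime           # when the provider gained "actual knowledge"
        removed_at: Optional[datetime]  # when the content came down, if at all

    # Assumed grace period standing in for "expeditiously"; the 36-hour
    # figure is borrowed here from the takedown window in India's IT Rules, 2021.
    EXPEDITIOUS_WINDOW = timedelta(hours=36)

    def safe_harbour_applies(notice: Optional[TakedownNotice]) -> bool:
        """Two-condition exemption modeled on the EU e-Commerce Directive."""
        if notice is None:
            return True   # Condition 1: no actual knowledge of the illegal activity.
        if notice.removed_at is None:
            return False  # Knowledge was obtained, but the content never came down.
        # Condition 2: removal must follow knowledge "expeditiously".
        return notice.removed_at - notice.received_at <= EXPEDITIOUS_WINDOW

A facilitative threshold would add a third condition to this logic: the exemption would also be lost where the platform’s own advertising or recommendation systems actively promoted the illegal content, independent of any notice.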

By adopting this model, India can align its regulatory framework with global standards and ensure that emerging platforms are effectively governed. It would shift the burden of responsibility from the state to the intermediaries, making them more accountable for the content they host and promote.

Conclusion

Modern intermediary platforms exploit ambiguities in India’s legal framework to perpetuate and profit from illegal activities, particularly the commodification of obscene content. Through their evasive advertising strategies and manipulation of social media influencers, these platforms create an illusion of legitimacy while systematically fostering and monetizing illegal content. The root cause of this problem lies in the passive liability framework under the IT Act, which shields platforms from responsibility unless they fail to act upon receiving notice.

To address this issue, India should adopt a facilitative liability threshold similar to the EU Directive, which holds intermediaries accountable for their active or negligent involvement in promoting illegal content. Until such a framework is implemented, platforms that promote sexual content will continue to evade scrutiny, posing a serious risk to public decency and safety.
