An Unpleasant Present and a Troubled Past: The Need to Effectively Regulate TikTok

Introduction

While hearing an application for bail under S.439 of the Criminal Procedure Code (“CrPC”), the Odisha High Court on 28th May 2020 called for stricter regulation of the widely popular Chinese video-sharing application TikTok in the following terms:

“TikTok Mobile App, which often demonstrates a degrading culture and encourages pornography besides causing paedophiles and explicit disturbing content, is required to be properly regulated so as to save the teens from its negative impact. The appropriate Government has got the social responsibility to put some fair regulatory burden on those companies which are proliferating such applications.”

The petitioner in the instant case was a co-accused in a case involving the offence of abetment of suicide under S.306 of the Indian Penal Code. The husband of the petitioner had committed suicide after watching certain intimate TikTok videos involving the petitioner and her ex-lover (the other co-accused). Highlighting the role of the TikTok videos in the suicide, the Court made the aforementioned remark.

This article furthers the argument for stricter regulation of the video-sharing application, considering its troubled past. In doing so, it examines the various available models for the regulation of content, the insufficiency of Indian law, and outlines a way forward.

Transnational Popularity

Before delving into the past controversies surrounding the app, it is important to look at the reasons behind its significant growth in India and around the world. To put it statistically, as of April 2020, TikTok had about 800 million monthly active users; in 2019 alone, India accounted for 277.6 million downloads. How did the app transcend boundaries and emerge as a global hit? The foremost reason behind its success is its flexibility in adapting to user preferences. As the Harvard Business Review points out, “Since little translation is required, TikTok reaches well beyond other successful Chinese apps such as Tencent’s messaging app WeChat, which is ubiquitous in China but mostly used elsewhere among Chinese communities keeping in touch with people back home.”

Another reason is its user-friendly interface, which combines click-bait news and entertainment with powerful AI that matches users with content precisely, rather than merely recommending content based on their viewing habits and “likes”. In the Indian context specifically, the non-elitist nature and easy accessibility of the application have made it a huge hit. To put it in simpler terms, it appeals to the internal triggers of boredom and societal validation. By taking an action as simple as recording a video with a phone camera, people feel they can become “celebrities”. Consequently, for some, it is also a career choice, a path to financial success.

A Troubled Past

India

The Odisha HC’s criticism of the application is not the first instance of judicial reprimand. In April 2019, the Madras HC passed an order prohibiting the download and use of TikTok. The order was passed in response to a series of concerns surrounding the application raised in the petition. The Court noted that:

“Majority of the teens are playing pranks, gaffing around with duet videos sharing with split screen to the strangers. The children who use the said application are vulnerable and may expose them to sexual predators …. Without understanding the dangers involved in these kinds of Mobile Apps., it is unfortunate that our children are testing with these Apps.”

The Court also highlighted that the Indian legal framework is not equipped to deal with the situation, unlike the United States, which has comprehensive legislation in the form of the Children’s Online Privacy Protection Act to guard against the cyberbullying of minors. In that light, the Court directed the Government to consider enacting a similar statute. While the Court was right in highlighting the concerns surrounding the app and pointing out the insufficiency of Indian law to tackle the situation, it clearly went too far in its approach. The most astonishing part of the order was that the Court imposed a measure as extreme as banning the platform altogether without delving into the substantive legal principles governing free speech, intermediary liability, etc. [See Shreya Singhal vs. Union of India and Common Cause vs. Union of India, wherein the Court clarified that pre-broadcast or pre-publication regulation of content was not in the court’s domain and that the role of a court or a statutory authority comes in only after a complaint is levelled against a telecast or publication.]

TikTok vehemently argued against the ban, primarily contending that something which is legislatively allowed cannot be judicially banned. The potential impact of the order on free speech was also highlighted. To TikTok’s relief, the ban was lifted in the subsequent hearing.

Other Jurisdictions

Transnational popularity, however, comes at a cost. The app has faced legal problems in multiple jurisdictions. To elaborate on a few: the app faced multiple class-action lawsuits in the United States for allegedly breaching child privacy laws and collecting the data of its young users. For instance, in late 2019, a 25-page complaint was filed on behalf of two minors by their mother, excoriating TikTok’s data collection and dissemination practices, which have allegedly led to ramifications that include children “being stalked on-line by adults” [See: The Children’s Online Privacy Protection Act, 1998 (hereinafter referred to as COPPA), which expressly forbids developers of “child-focused” apps from collecting personally identifiable information from children under 13 without first securing verifiable consent from their parents].

Interestingly, in the aforementioned case, a settlement was reached between the plaintiffs and the defendant (the details of which have not been made public yet). Another lawsuit, centred around similar claims, was filed by a Californian student against TikTok. The sweeping proposed class action claims that the platform comes surreptitiously equipped with surveillance software that “vacuums” up user data and transfers the information to servers in China. In fact, the app has been the subject of a national security review in the United States.

In the very same year, the app faced similar troubles in the United Kingdom. Countries like Indonesia and Bangladesh have also imposed brief bans on the app in the past, amid public concern about illegal content such as pornography.

Regulating Models and Insufficiency of the Indian law

Evolution of the Debate

Considered in a broader context, the aforementioned ethical and legal problems with the app (and with social media in general) point towards a simple observation: the debate regarding the regulation of such applications has evolved.

Over the years, the debate has shifted from whether such applications should be regulated to how they can be regulated. The ‘should’ question stemmed from the straightforward tension between free speech and its regulation by the state. However, as mentioned earlier, the diverse ethical and legal issues raised by such platforms (ranging from violations of privacy to child pornography) have changed the context. The question now centres on the twin concerns of balance and efficacy, i.e. how to regulate such platforms so as to ensure an appropriate balance between free speech and the public interest, and how effective regulatory measures actually are in bringing down harmful content.

Existing models

To avoid a myopic view of the concerns surrounding platform/social media governance, a discussion of the broader regulatory concepts becomes extremely important. Regulatory models across the globe typically fall into three broad categories: self-regulatory, limited governance and comprehensive governance. Each of these models is judged on its ability to actually achieve policy goals. Specific considerations for the regulation of social media platforms typically include privacy protection, freedom of expression, and the maintenance of democratic institutions such as the electoral system.

As the name suggests, self-regulatory models encourage social media platforms to oversee their own operations and rectify their own shortcomings. Self-regulatory measures taken by TikTok include the introduction of a four-step moderation process to review posts. In January 2020, the app also released a set of new, more detailed rules on the content it allows and prohibits. In the aftermath of the temporary ban imposed by the Madras HC, the app has also worked on its data security features and privacy policy, adding features such as a Family Safety Mode, which allows parents to remotely control the digital well-being settings of their teenage children’s accounts, and device management features, which allow users to take control of their accounts and decide who can engage with their content. But is this model the best way forward?

The model is certainly better than the comprehensive governance model, which raises pertinent concerns about the stifling of free speech and expression. However, the authors believe that the third model, i.e. the limited governance model (which can also be termed the “collaborative model”), if drafted carefully and implemented efficiently, has the edge over the self-regulatory model.

Insufficiency of the Indian Law

The catchword in the previous paragraph was “efficiency”. Although India has rightly adopted the limited governance model, its framework is by no means sufficient to combat the legal and ethical concerns mentioned above. In the absence of any dedicated legislation, India has resorted to applying offence-specific provisions to content on social media platforms. Rules prohibiting content which is obscene, pornographic, libellous or invasive of another’s privacy [Rule 3(2)(b), Information Technology (Intermediary Guidelines) Rules, 2011], or which harms minors in any way [Rule 3(2)(c), Information Technology (Intermediary Guidelines) Rules, 2011], are a step forward in the social media regulation regime. However, owing to the safe harbour provision, i.e. Section 79 of the IT Act, intermediaries are not held accountable for unlawful user-generated content so long as they do not exercise editorial control over it. Further, as laid down by the Supreme Court in Shreya Singhal vs. Union of India, a platform can be made to take down a post or update only upon a court order or a notification from the appropriate government agency. There is an imminent need to increase collaboration between the Government and social media platforms like TikTok to regulate content effectively.

The Draft Intermediary Rules, 2018 provide potential solutions to the aforementioned drawbacks. For example, Rule 3(9) requires social media platforms to deploy “automated tools … to proactively identify, remove or disable public access to unlawful information and content.” Further, Rule 3(5) states that “intermediary shall enable tracing out of such originator of information on its platform as may be required.” Another proposed rule requires an intermediary to warn its users, at least once a month, of their need to comply with the intermediary’s terms of use. These rules, however, have been criticized for their extremely wide scope and the dangers they pose to freedom of speech and expression.

Interestingly, these drawbacks and criticisms are not exclusive to the Indian legal framework. Other jurisdictions following the limited governance model, such as Germany, which has enacted the NetzDG (Network Enforcement Act) to regulate social media, have faced similar criticism. The NetzDG essentially requires platforms to delete hate speech and other manifestly illegal content within twenty-four hours of receiving notification of its existence on the service, or risk financial penalties. Platforms must also report every six months on their processing of complaints. Unsurprisingly, given its potential for over-blocking content, the Act has been criticized for stifling free speech. Therefore, as mentioned in the preceding sections, the way forward lies in striking the right balance between increased regulation and the freedom of speech and expression.

Conclusion

Repeated incidents concerning TikTok make one thing clear: the app needs stricter regulation. At the same time, any discussion of increased regulation is incomplete without extensive consideration of its effect on the valued right of free speech. The answer lies in finding the middle path. There is an imminent need to increase collaboration between Government agencies and social media platforms to regulate content. Why? The reason is simple: the government does not have sufficient technical know-how to effectively tackle all issues on its own, and social media platforms cannot be expected to keep track of millions of complaints on their own either. Therefore, efficient co-regulation is the best step forward.


ABOUT THE AUTHORS

Anushka


Anushka is a third-year B.A. LL.B. (Hons.) student at Dr Ram Manohar Lohiya National Law University, Lucknow. She has a keen interest in the areas of Dispute Resolution, Competition Law and Corporate Law. She can be reached at anushkasingh30@gmail.com. She is the co-founding editor at The Contemporary Law Forum <https://tclf.in/>.

Shashwat Awasthi


Shashwat is a third-year B.A. LL.B. (Hons.) student at Dr Ram Manohar Lohiya National Law University, Lucknow. He has a keen interest in the areas of Intellectual Property, Competition Law and Dispute Resolution. He can be reached at shashwatbifa03@gmail.com. He is the Founding Editor at The Contemporary Law Forum <https://tclf.in/>.
