BusinessDay
Thursday, December 19, 2024

The case for stricter regulations: Holding online platforms accountable for user-generated content

The digital age has brought unprecedented connectivity and freedom of expression, enabling billions of people to share ideas and content globally. However, this freedom has also turned online platforms into conduits for illegal activity, including child pornography, terrorism, and hate speech. The ongoing criminal proceedings against Telegram CEO Pavel Durov in Paris starkly highlight the challenges governments face in holding online platforms accountable for user-generated content. Durov, a Russian-born billionaire with French citizenship, faces criminal charges for allegedly allowing the spread of child pornography and other illegal content on Telegram, a messaging app with close to a billion users. The case has reignited the debate over whether it is legally and morally justifiable to hold online platform owners accountable for their users’ actions, and whether stricter regulations are needed to curb the misuse of these platforms.

The charges against Durov are significant not only because they target the head of a major global platform but also because they signal a shift in the approach to regulating online content. Traditionally, platform owners have been treated as intermediaries, providing the infrastructure for communication but not directly responsible for the content shared by users. However, as platforms like Telegram have grown, the argument that they are merely neutral hosts has become increasingly untenable. The French authorities’ decision to prosecute Durov personally reflects a growing recognition that platform owners must take greater responsibility for preventing the spread of illegal content, especially when it comes to protecting vulnerable individuals, such as children.

This case also underscores the broader international dimension of platform accountability. The European Union (EU) has been at the forefront of efforts to regulate digital platforms, seeking to balance free speech with the responsibility to prevent harm. The EU’s Digital Services Act (DSA), which became fully applicable in February 2024, is a comprehensive framework that holds platforms more accountable for the content they host. Under the DSA, platforms such as Telegram are required to implement measures to prevent the spread of illegal content, including child pornography, and face significant penalties for non-compliance. The Act also emphasizes transparency, requiring platforms to disclose their content moderation practices and decisions. The case against Durov can be seen as a precursor to the kind of enforcement that may become more common under the DSA, where platform executives are held accountable not just as representatives of their companies but as individuals.

In contrast, the legal landscape in Nigeria is still evolving, particularly on the question of holding platform owners accountable for user-generated content. The Nigerian Cybercrimes (Prohibition, Prevention, Etc.) Act of 2015, amended in 2024, is the primary legislation addressing cybercrime in the country. While the Act imposes severe penalties for cybercrimes, including the distribution of child pornography, it does not explicitly hold platform owners liable for user-generated content. As the digital space in Nigeria continues to expand, however, there is a growing call for more robust regulations that address the responsibilities of platform owners. The Telegram case could serve as a catalyst for Nigerian lawmakers to revisit the Cybercrimes Act and consider amendments that would impose greater liability on platform owners for the content shared by their users.

The central question in this debate is whether it is legally and morally justifiable to hold online platform owners accountable for the actions of their users. Legally, the challenge lies in defining the extent of a platform owner’s liability. Should owners be held responsible for every piece of content uploaded by users, or only when they are aware of illegal activity and fail to act? Different jurisdictions have approached this question in different ways, but the trend is moving towards greater accountability. In the EU, for example, the DSA requires platforms to take proactive measures to prevent the spread of illegal content, shifting the burden of responsibility onto platform owners. In Nigeria, the focus has traditionally been on prosecuting the individuals who create or share illegal content, but there is growing recognition that platform owners also have a role to play in preventing harm.

Morally, the case for holding platform owners accountable is strong. Platforms like Telegram benefit enormously from user-generated content, which drives traffic and revenue. With this benefit comes the responsibility to ensure that their platforms are not used for harmful purposes. The argument that platforms are merely neutral hosts is increasingly difficult to sustain, particularly when the content in question involves egregious violations of human rights, such as child exploitation. Holding platform owners accountable could serve as a powerful deterrent against the proliferation of illegal content online, as executives would have a direct incentive to invest in the necessary safeguards.

The case against Pavel Durov also raises important questions about the ethical responsibilities of platform owners in the digital age. As the head of a platform with close to a billion users, Durov wields immense power and influence, and with that power comes a duty to protect the vulnerable and prevent the misuse of his platform. The argument that platform owners cannot control everything that happens on their sites is valid to an extent, but it does not absolve them of the responsibility to take proactive measures to prevent harm. This includes investing in technology to detect and remove illegal content, cooperating with law enforcement, and setting clear policies that deter users from engaging in criminal activity.

The push for stricter regulations and greater accountability faces challenges, notably the potential impact on free speech. Online platforms have traditionally been havens for free expression, but stricter regulations risk over-censorship, where platforms might remove not just illegal content but also controversial or sensitive material, potentially stifling legitimate discourse. The global nature of the internet adds complexity, as content can be shared across borders, creating jurisdictional challenges. Laws governing online content differ widely, complicating the enforcement of consistent standards, as illustrated by the case of Telegram’s Pavel Durov being prosecuted in France despite the platform’s global reach.

Despite these challenges, stricter regulations and greater accountability are necessary. The case against Durov underscores the need for a global approach to online content regulation, one that balances free expression with the responsibility to protect users from harm. The digital age demands new legal and ethical frameworks that reflect our interconnected world. As the debate over platform accountability progresses, collaboration among lawmakers, platform owners, and society is essential to creating a safer and more responsible digital environment.

This article was written by Ugochukwu Obi, Partner for ICT, Fintech, and Private Equity at Perchstone & Graeys, LP.
