NITDA Code of Practice – potential impact on online platforms and social media

On 13th June 2022, the National Information Technology Development Agency (NITDA) issued the draft Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries (“the Code”).
The objectives of the Code include setting out best practices for online platforms and making the digital ecosystem safer for Nigerians and non-Nigerians in Nigeria. The Code is also expected to set out measures to combat harmful online information and adopt a co-regulatory approach toward implementation and compliance. The Code thereafter sets out provisions across six parts to achieve these objectives.

According to NITDA, the Code was developed in collaboration with the Nigerian Communications Commission (NCC) and the National Broadcasting Commission (NBC), with input from interactive computer service platforms such as Twitter, Facebook, WhatsApp, Instagram, Google, and TikTok. NITDA further stated that the Code is aimed at “protecting the fundamental human rights of Nigerians and non-Nigerians living in the country, as well as defining guidelines for interacting in the digital ecosystem”.
Unsurprisingly, Nigerians have been distrustful of the Code, with many concluding that it is an attempt by the Nigerian government to regulate social media and quash freedom of expression. This is understandable considering the government's antecedents where social media platforms are concerned.
For instance, in 2019, the Social Media Bill before the National Assembly sought to curb the perceived excesses of social media users. That ill-fated Bill was closely followed by the Prohibition of Hate Speeches Bill (the “Hate Speech Bill”). When the public outcry against both Bills became resounding, they were stepped down.
Then, in 2021, following Twitter’s deletion of a tweet posted by the President of the Federal Republic of Nigeria, Twitter was banned for several months; users in Nigeria were unable to access the platform directly, and many resorted to virtual private networks (VPNs) to reach the microblogging site.
Also in 2021, the Nigerian government attempted to amend the National Broadcasting Commission Act to empower the NBC to regulate social media platforms.
It is against this background that we examine the provisions of the Code and determine whether it is indeed a tool designed to restrict free speech in Nigeria.

What entities are affected by the Code?
It is pertinent to determine at the outset who would be affected by the Code.
The following entities are expected to comply with the Code:
1) Interactive Computer Service Platforms (Platforms) – the Code defines these as “any electronic medium or site where services are provided by means of a computer resource and on-demand and where users create, upload, share, disseminate, modify, or access information, including websites that provide reviews, gaming platforms, and online sites for conducting commercial transactions”.

The inference drawn from this definition is that it covers online platforms such as company websites, fintechs, gaming companies, edtechs, healthtechs, e-commerce platforms, social media platforms, and other service providers that offer goods and services through their platforms.

2) Internet Intermediary (Intermediary) defined in the Code as including, “but not limited to, social media operators, websites, blogs, media sharing websites, online discussion forums, streaming platforms, and other similar oriented intermediaries where services are either enabled or provided and transactions are conducted and where users can create, read, engage, upload, share, disseminate, modify, or access information”.

This definition captures a number of companies already covered under the definition of interactive computer service platforms. It includes streaming platforms (such as Netflix and YouTube), social media platforms, internet service providers, e-commerce intermediaries, fintechs, and the like.

3) Large Service Platforms (Large Platforms) – defined as “an Interactive Computer Service Platform/Internet Intermediary whose users are more than one hundred thousand (100,000)”.
This simple definition indicates that Platforms and Intermediaries (collectively referred to in this article as “online platforms”) that have more than one hundred thousand users are classified as Large Platforms.

Commendable provisions

The Code contains some commendable provisions, such as the provision mandating the removal of non-consensual sensual content, the provisions addressing content harmful to a child, the provisions introducing a notice-and-take-down regime, and the provisions on online platform rules.

Item 1 of Part II also promotes equal distribution of information for Nigerian users.

Concerns with the Code

The Code contains certain provisions that may be used by an abusive government to curtail or infringe on free speech. Indeed, the three major areas of concern in this respect are the provisions allowing the Government to order the removal of content, the provisions mandating online platforms to proactively remove false information likely to cause public disorder, and the provisions requiring local incorporation of online platforms. These and other notable areas are examined below.

Mandatory incorporation of foreign online platforms: The Code imposes additional obligations on Large Platforms, including obligations to be incorporated in Nigeria, to have a physical contact address in Nigeria, and to appoint a liaison officer for communications with the government.

It is likely that Large Platforms that do not carry out the obligations set out above would be prevented from operating in Nigeria.

The first problem with this position is the sheer number of platforms that will be classified as Large Platforms. The 100,000-user threshold is extremely low; it can be contrasted with the threshold of 45 million monthly active users within the jurisdiction for a Very Large Online Platform under the EU Digital Services Act (DSA).

Similarly, the DSA does not require platforms to be locally incorporated or to have local addresses; the appointment of a legal representative typically suffices. It should also be noted that, in certain situations, NITDA may require an online platform with fewer than one hundred thousand users to comply with the obligations of a Large Platform.

Takedown of content: An online platform is required to take down content within 24 hours of receiving notice from an Authorised Government Agency (“Agency”) of the presence of unlawful content. Unlawful content is defined under the Code to mean any content that violates an existing law in Nigeria.

However, the problem is that the Agency is not required to specify how or why the content is unlawful, and the online platform is not given the time or avenue to verify the unlawfulness of the content, particularly where it is unclear whether the content is in fact unlawful.

The position under the Code can be contrasted with the position under the German Network Enforcement Act (NetzDG), where content must be manifestly unlawful, and the position under the French “Lutte contre la haine sur internet” (“Fighting hate on the internet”) law, where content must be patently illegal, before a takedown within 24 hours is required.

In addition, under the proposed EU framework, the relevant Agency or Court ordering a takedown is required, among other things, to provide a statement of reasons explaining why the information is illegal, by reference to the specific provision of the law infringed.

Removal of false information likely to cause public disorder: In addition to creating a general obligation to monitor, this provision is vague in ways that can easily be exploited by an abusive Government, as the Code defines neither “false information” nor “public disorder”. Consequently, for example, platforms like Twitter and Instagram may be sanctioned if they fail to proactively take down posts relating to the shootings at Lekki and other similar incidents.

This position can be contrasted with other frameworks, such as the EU E-Commerce Directive and the DSA, under which member states are prevented from imposing a general obligation to monitor, as well as a long line of case law requiring that any legislation attempting to restrict free speech be sufficiently precise to enable the citizen to regulate his conduct: he must be able, if need be with appropriate advice, to foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail.

Indeed, there are other areas of concern under the Code. For example, the Code does not contain provisions for a review of content taken down. In addition to requiring the takedown of unlawful content, the Code also requires all online platforms to take all reasonable steps to ensure that such content stays down.

Consequently, it is conceivable that an Agency could claim that content is unlawful, and the intermediary would be forced not only to take down the content but also to take steps to prevent it from resurfacing. The only option available to affected Nigerians would be to approach the Court for a declaratory judgement that the content was not in fact unlawful.

Similarly, the definition of prohibited material as content or information that is objectionable on the grounds of public interest, morality, order, security, or peace, or that is otherwise prohibited by applicable Nigerian laws, is another potential window for abuse. For purposes of clarity, the mere fact that content is objectionable (without being unlawful) should not be enough ground to remove it, especially in light of how loosely the term “objectionable” can be interpreted.

In addition, the absence of an internal complaint-handling system under the Code can be contrasted with the framework under the DSA, where users can lodge a complaint against a decision to remove content or suspend a user.

Finally, it appears the Government is trying to regulate five different categories of information, namely misinformation, disinformation, harmful content, unlawful content, and prohibited material. It is therefore very easy to conceive of a situation in which an overzealous government fits unfavourable content into at least one of these five categories and consequently takes steps to remove the content.

Recommendations
Amongst many other recommendations, we suggest that the problematic provisions identified above be either amended to introduce safeguards or removed outright, so as to obviate the possibility of abuse by an abusive government and to introduce a measure of protection for the general public and the affected Platforms.

We further recommend the following:
a) all provisions imposing general monitoring obligations be removed;
b) the introduction of a notice-and-action framework; and
c) the introduction of a suspension mechanism for persons/Agencies shown to frequently submit manifestly unfounded notices or complaints.

We also recommend that time be taken to study how other jurisdictions effectively manage social media and hate speech, so that we do not end up with a Code that does more harm than good to the digital ecosystem.