Friday, April 26, 2024

BusinessDay

Social media regulation: A look at NITDA’s Code

One of the objectives of NITDA’s Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries (“the Code”), as stated in the draft paper it recently issued, is to set out best practices that will make the digital ecosystem safer for Nigerians and non-Nigerians in Nigeria. Another objective of the Code is to combat online harms such as disinformation and misinformation. It is commendable that the government, in this regard, is conscious of its obligation to ensure the security and welfare of its people. However, such protective endeavours must stay within the bounds of the law.

The creativity and opportunities that social media has enabled are tremendous – people can share diverse content such as pictures, videos and writing in self-expression, and small businesses can connect with other users and attract followers and customers. Brands and marketers use social media channels as part of their campaigns, marketing and customer engagement. However, along with these benefits came less palatable activities such as online bullying, polarisation, disinformation and misinformation, hate speech, and foreign manipulation of national issues, among others. This has led to a rise in proposals by the governments of various countries to regulate not only the use of social media but, most especially, its operators.

One such proposal is the UK Online Safety Bill, introduced in 2020, which directs social media platforms, websites, apps and other services that host user-generated content to remove or limit the spread of harmful content or face a penalty of up to 18 million pounds (24 million dollars). In effect, this moves them from passive to active participants in the protection of users.

Also, the government of India, in 2021, published a Code of Ethics for social media and streaming platforms – Intermediary Guidelines and Digital Media Ethics Code Rules 2021. This stipulates that social media platforms are not allowed to host, store, or publish any information prohibited by law in relation to the interests of the sovereignty or integrity of India, among other stipulations.

On the flip side, countries like Poland and Hungary have proposed regulations for social media operators that would make it harder for social media companies to delete controversial content or user accounts. In Poland, the law would enable a government-appointed council to order a platform to restore user accounts and content that the platform has deleted for political reasons; failure to comply would result in fines of up to 50 million zloty ($13.4 million).
In Nigeria, the new draft of the Code issued by the Agency intends to regulate the operations of social media platforms and intermediaries, to ensure they take measures against disinformation and misinformation.


This rise in social media regulation is also intensifying scrutiny of the societal impact of social media. Traditionally, the media was made up of newspapers, radio and television. Now social media platforms make up part of the media fabric and although these other media are largely regulated by laws, the regulation of social media is still controversial. This is because as much as its regulation is intended to curb excesses, finding the balance between that and upholding the rights of persons has proved difficult.

Another challenge in the regulation of social media is the fragmented nature of regulation around the world. Different countries have different rules with different criteria – a unified global approach to its regulation is unlikely, especially for geopolitical reasons. At best what we can have is a regional approach such as the European Commission’s proposed Digital Services Act – which aims to establish a consistent digital regulation regime across the European Union.

More than half of the world’s population uses social media for different purposes, including driving their businesses and expressing their views on various matters of social and political interest. A 2021 Digital Report by Datareportal states that there were 4.2 billion users globally as of January 2021 – delivering a year-on-year growth of more than 13 percent.

This is equivalent to more than 53 percent of the world’s total population. This data includes people from different parts of the world, different in their perceptions, sentiments and reality. This is why different countries strive to establish their own regulations to better suit their circumstances. And this is what the Nigerian Agency proposes.

While a good part of the provisions of the Code is commendable, certain provisions of this new Code must be readdressed to ensure that it is not seen as a measure to stifle or gag freedom of expression or erode privacy, and thereby ultimately affect the way people carry on enterprising activities.

The Code provides in Part One Section 5 that interactive computer service platforms and internet intermediaries (platforms) must disclose the identity of the creator of any information on their platforms when directed to do so by a court order.

An order of the court in this regard shall apply for purposes of preventing, investigating or prosecuting an offence concerning the sovereignty and integrity of Nigeria, public order, security, diplomatic relationships, felony, and incitement of an offence relating to any of the above or in relation to rape, child abuse, or sexually explicit material. Further, the Code provides that where the person who makes a post is located outside Nigeria, the first creator of that information in Nigeria shall be deemed the first creator.

While the intent of the provision may be to curb certain vices on social media, it may end up going after the wrong person – especially one whose intention is to bring such content to the awareness of the public for their safety.

People may no longer want to make certain posts, especially when such posts are culled from a person who is not located in Nigeria. Seeing also that engagement is driven by posts on which people can comment and express their opinions, these provisions may reduce the rate at which certain content is posted, even when it is for awareness or engagement purposes.

Although the Code requests this disclosure for the purposes of “preventing, investigating or prosecuting an offence concerning the sovereignty and integrity of Nigeria, among others,” the fate of a person whose identity is disclosed is quite precarious. The provision does not state that such a person would be deemed to have violated the Code, or that actions would be instituted against them before a tribunal or court.

However, the Code in Part IV Section 7 on Measures on Disinformation and Misinformation, says that users shall not be liable, without intent, for merely redistributing through intermediaries, the content of which they are not the author and which they have not modified. If this is so, should the Agency require the disclosure of the identity of a person who is not to be held liable?

And note that this exemption only applies to cases of misinformation and disinformation, not to posts or content based on real facts. This could mean that persons who repost content that can lead to misinformation or disinformation will not be held liable, while a person who reposts content based on facts could have their identity disclosed – and it could spell trouble for them, especially if the content was originally posted by a person outside Nigeria. This makes the whole affair a fearful one.

The uncertainty surrounding the government’s use of disclosed information, and the possibility of abuse, could create anxiety in users, limiting their ability to maximise the benefits of social media for their businesses and civil rights.

The Code defines harmful content as “content which is not unlawful but harmful”. The Code further defines online harm as “action or inaction with reasonably foreseeable risk of having an adverse physical or psychological impact on individuals”. These definitions and their application are ambiguous.

The Code does not take into consideration the differences in the sentiments or idiosyncrasies of people, as what may be harmful to one person may not be harmful to another. Besides, since such posts are not unlawful, people may still want to make them. If users are not clear on what the law requires, they may avoid certain posts altogether in order not to incur the wrath of the law.

The provisions of the Code are open to many different interpretations, and such a regulation, although intended to cure an ill, may bring unfavourable collateral damage, including stifling business. Where people do not feel safe about the things they post, their posts are limited, and they may fail to engage the large number of people that could bring about greater interactions for business or other purposes. Thus, these provisions need to be made clearer.

Further, the section does not say which court can issue such an order. A liberal interpretation of this provision would mean “any court”. This would mean that even a court that ordinarily lacks jurisdiction in matters regarding information technology could be drawn in by this provision to assume jurisdiction. The court empowered to give such an order must be explicitly defined by the Code. The court of competent jurisdiction, in this instance, may be the Federal High Court and/or the State High Courts.

Another provision to look at is Part III Section 5 of the Code. This provides that large service platforms (platforms with more than one hundred thousand users) are to, “on demand, furnish a user, or authorized government agency with information on the reason behind popular online content demand and the factor or figure behind the influence.” A similar provision, Section 6 of the same Part, also provides that any user or authorized agency shall be provided with a report of due process on their activities.

Again, these provisions are ambiguous and open a floodgate of interpretations. First, the Code defines a user as any person, registered or unregistered with a Platform, who uses, accesses, publishes, shares, transacts, views, displays, engages, downloads, or uploads any information on the Platform. This means that one does not have to be registered with a platform to make such a demand or request of it. Would this not overstretch the obligations of the Platforms to cover persons who are not registered on their platforms? Platforms may not want to make such disclosures, more so to persons not utilising their platforms. Although the Agency has revealed that the operators of these Platforms were involved in the establishment of the Code, one finds the provision muddling.

Conclusion
The Code seems ambiguous in many of its provisions. These ambiguities can be resolved before the Code comes into effect through the express definition of terms and more explicit provisions that better drive home the intention of the Agency.

It is true that we cannot allow disinformation, hate speech and the rest to go unchecked but regulation should not be seen to have the intent of limiting the right to freedom of expression of persons to a detrimental level. These platforms enable greater knowledge and wider participation in political and social activities. It is important that people feel safe in expressing themselves.

The Agency must also ensure that the regulation is not so rigid as to chase away Platform operators – especially since technologies such as Virtual Private Networks (VPNs) can be used to bypass a government ban of the platforms in the event that a ban occurs.

The aim, therefore, should be to achieve a democratic social media regulation – a democracy that goes beyond regular multiparty elections – in which users feel able to, and have the resources to, challenge those in power, and to make political and business decisions based on reason rather than being swayed by power, bias or fear. Certain collateral damage may come with the regulation of various sectors, but it should not be detrimental to the very things that are fundamental to persons – their rights.