It is about time online misinformation was taken more seriously. The political, social and economic effects of misinformation can be formidable.
Online social media has brought together billions of people around the world. The impact of diverse platforms such as Facebook, WeChat, Reddit and LinkedIn has been transformational. The number of active users of the six most popular online social networks combined is estimated at about 10 billion. The World Wide Web (the Web) is a place where online content can be created, consumed and diffused without any real intermediary. This empowering aspect of the Web is generally a force for good: people are better informed and participation in online discussion is more inclusive, since barriers to participation are reduced.
But researchers have increasingly highlighted the darker side of online social media. The general absence of intermediaries online (journalists play that role in traditional print media) allows a free-for-all direct path from producers of questionable content to consumers, and malicious actors take advantage of this gap. Two disturbing trends have been highlighted: ‘information disorder’ and ‘echo chambers’. Misinformation (false or misleading information shared regardless of intent), disinformation (false or misleading information shared with intent to deceive) and malinformation (genuine information shared with intent to cause harm) are the common types of information disorder.
The World Economic Forum has highlighted information disorder as a threat to society. Online echo chambers are spaces where people are exposed only to content created by like-minded users of a platform. Closely related are filter bubbles, which algorithms create artificially by using a user’s online history to suggest or recommend yet more similar content. Echo chambers and filter bubbles reduce the quality of online discourse and can, directly and indirectly, lead to the creation and diffusion of biased and unsubstantiated content.
Researchers have studied the spread of conspiracy theories within echo chambers and filter bubbles. One study found that on Twitter, false content was retweeted faster and by more people than truthful content, being about 70 percent more likely to be retweeted. The same research revealed that false content reached 1,500 people roughly six times as fast as truthful content reached the same number. Some analysts see online misinformation as a national security challenge; equipped with cybersecurity tools, they track misinformation the way they would track malware, aiming to prevent “the hacking of people’s beliefs.”
Facebook acknowledged years ago that “bots” had invaded its platform, with an estimated 60 million such accounts. Bots are automated fake accounts, many of them dedicated to spreading false content. Twitter is not spared: up to 15 percent of Twitter accounts have been estimated to be bots rather than human users.
In an article, “Fake News: understanding the scourge in Nigeria”, Raji Raski explains the misinformation ecosystem in the country: “‘fake news’ thrives in Nigeria in its different variants. These variants include misinformation, disinformation and mal-information…the nation’s culture of ‘closed’ (as opposed to open) governance, which thrives on official secrecy and dearth of timely official information is a recipe for the scourge to spread…Nigeria’s [increasing] population on social media and other digital space is an escape route from muffled voices in the mainstream; an avenue to create, share and distribute contents of all sorts, many of which populate the misinformation ecosystem in Nigeria.”
A conspiracy theory can be considered an attempt to simplify reality: a dangerous shortcut. Framers of conspiracy theories (and their believers) try to simplify the causes of complex social or political realities, a kind of mental impatience. Research led by Viren Swami of the University of Westminster revealed that thinking dispositions influence people’s belief in conspiracy theories. Swami’s examination of the link between measures of thinking disposition and belief in conspiracy theories found that “a stronger belief in conspiracy theories was significantly associated with lower analytic thinking and open-mindedness and greater intuitive thinking.” Other researchers have shown that it may be possible, through training in analytic thinking, to “inoculate” people against having their beliefs hacked by conspiracy theories.
The time has come for governments and civil society to prioritize online literacy and analytical thinking.
Omoregie is a research analyst and a fellow of the Institute of Management Consultants.
@UyiosaOM (Twitter) [email protected]