A typical day in our digital space often starts or ends with people bantering over trending news on social media. For some, these platforms are places to learn, teach, contribute, engage, or even decompress after the hard tasks of the day. For others, they are tools for spreading false agendas. With all the wealth of information accessed from the internet, and the amount of time spent there, one can say that new media solidifies or erodes our beliefs through the narratives we consume regularly.
What often starts as a forwarded WhatsApp message from an unknown or pseudonymous source, or as unverified social media content, does not just reveal our levels of health and news literacy; it harms specific individuals and communities at large. These false narratives and images have fuelled online wars and led to disagreements, preventable deaths, and bloodshed, especially among vulnerable populations and in areas already prone to conflict.
During the Ebola Virus Disease outbreak, there was a rise in the number of people drinking and bathing in salt water, following a hoax that claimed it offered protection. The hoax resulted in the deaths of two Nigerians, both known hypertensive patients. With the COVID-19 pandemic, we simply witnessed an amplified version of what had long existed: the infodemic.
An infodemic is the spread of false and misleading information: rumours, fake news, hate speech, misinformation, and disinformation. Unverified narratives do not only have short-term effects, such as harm or death; they also have long-term ones. They tear apart the ethno-religious fabric of a society, fuel terrorism, threaten our peace and existence, and erode trust in the (public health) systems that genuinely intend to protect citizens. There are stories of individuals who refused to speak with, or receive help from, non-governmental organizations for fear that they might be disguised security officials.
What lessons can we learn about improving health, news, and digital literacy, and about addressing these rumours and misinformation going forward?
One approach is to understand what category of infodemic is being spread on these platforms. Terms like rumour, misinformation, disinformation, and fake news are closely related and often used interchangeably, but they do not mean exactly the same thing. The major difference lies in the intention of the spreader. Misinformation is false information shared or communicated regardless of any intention to deceive. Disinformation is misinformation deliberately shared in order to deceive. Rumours are misinformation passed on, often without the intention to mislead; they are unverified and doubtful stories that people make up during crises as a way of coping with their anxieties. Fake news is disinformation that is intentionally fabricated, sensational, emotional, and misleading, and that is distributed in a way that mimics mainstream news. In a nutshell, these stories lack accuracy, balance, and credibility.
Most importantly, how we classify these stories depends largely on the intention of the spreaders. Camille François's ABC framework for identifying misinformation helps us trace the roots of most viral deception. Its components are:
A- Manipulative Actors are people who engage knowingly, and with clear intent, in viral deception campaigns. Their aim is to disrupt public health or humanitarian efforts while keeping their identity and intentions obscure. Identifying the actors who manipulate public discourse through their often fake digital profiles and footprints, and enforcing laws that keep them from misusing these platforms, gives us an advantage.
B- Deceptive Behaviours are the tactics and techniques these manipulators use, which constantly evolve as they spread false information. They include the coordinated, inauthentic behaviour these actors use to exaggerate the reach, virality, and impact of their campaigns.
C- Harmful Content comprises narratives widely distributed to hurt or demean the efforts of individuals, organizations, or the public interest, and to influence or amplify public debates or crises. Content is the most visible component of misinformation, because it is what platform users actually see and form opinions on.
Addressing this misinformation is the second major step towards taking responsibility, as individuals and collectives primarily concerned with the physical and psychosocial wellbeing of the communities we live in. Some simple steps to make this happen:
Do a thorough search to map the digital footprints of these actors. Study their profiles across various online platforms. What kinds of information do they typically spread? Do they operate anonymously, mimic mainstream media, or use fake profile pictures? How frequently do they share this content? Do their accounts appear to be bots, periodically sharing scheduled fake and inflammatory stories? If they have profile pictures, do a reverse image search on Google to see what you find. What do their bios and their religious, political, or social affiliations show? What about their tone and choice of words? These profile details often shape the nature of the content published.
Does the content shared look too good to be true? Watch out for loopholes, inconsistencies, and the questions your gut instinct raises.
Search online for a corroborating story on the same topic. One easy way is to take the keywords or title of the story and include the term "fake news" in your search. What comes up on Google? Similar stories from mainstream media outlets? If not, the story may not be true.
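For readers comfortable with a little scripting, the search tip above can be sketched in a few lines of Python. This is only an illustration: the helper name `build_factcheck_query` and the example headline are hypothetical, and the snippet simply assembles a search-engine URL from a headline plus the term "fake news", which you would then open in a browser.

```python
from urllib.parse import quote_plus

def build_factcheck_query(headline: str) -> str:
    # Hypothetical helper: combine the story's keywords with the
    # term "fake news", as suggested above, into a search URL.
    query = f'{headline} "fake news"'
    return "https://www.google.com/search?q=" + quote_plus(query)

# Example (hypothetical headline): open the resulting URL and check
# whether mainstream outlets or fact-checkers have covered the claim.
url = build_factcheck_query("salt water cures virus")
print(url)
```

The same idea works with any search engine or fact-checking site: the point is to pair the story's own keywords with terms that surface debunks, rather than searching the headline alone.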
Bad publicity is still publicity. Do not amplify unverified news until relevant or trusted sources confirm that it is true. It is easier to change one's own mind upon eventually finding the truth than to change the tens, hundreds, or thousands of minds that may be too gullible to seek or find the truth long after the wrong information has been shared. Treat every piece of viral deception as an intentional decision by manipulative actors to cause harm.
In conclusion, stakeholders need to understand how and where people get their stories. Consumer behaviours keep changing as more people take advantage of the digital economy, and we see a shift in how people source, read, and share news with ease, leveraging digital technologies. Addressing misinformation on these platforms requires being present on them and meeting people where they are; one cannot effectively influence people or systems one does not engage with. Being present and countering fake news on these platforms, in a language the audiences understand, is one great step towards sharing viral health information that empowers users to tell the difference and amplify the truth within their own networks.