A Tool To Curb The Spread Of Misinformation And Disinformation Online.
In collaboration with the United Nations Development Programme (UNDP) Accelerator Lab Kenya and the Healthy Internet Project (HIP), incubated at TED (Technology, Entertainment and Design), the Busara Center for Behavioral Economics conducted a live behavioral science experiment demonstrating the HIP plug-in, an open-source web browser extension that allows users to flag content online anonymously. The tool is intended to help curb the spread of lies, abuse, and fear-mongering, as well as to uplift useful ideas on the internet.
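For illustration, a flag of this kind could be represented as a small structured record sent from the extension to a collection service. The minimal TypeScript sketch below is an assumption for explanatory purposes; the field names and endpoint are hypothetical and are not taken from the HIP codebase.

```typescript
// Illustrative sketch only: a minimal shape for an anonymous content flag,
// as a browser extension of this kind might submit it. Field names and the
// endpoint are assumptions and do not reflect the actual HIP implementation.
interface ContentFlag {
  url: string;        // page being flagged
  category: "misinformation" | "abuse" | "fear-mongering" | "worthwhile";
  note?: string;      // optional free-text comment from the user
  flaggedAt: string;  // ISO timestamp; no user identifier is attached
}

async function submitFlag(flag: ContentFlag): Promise<void> {
  // Hypothetical collection endpoint for flagged content.
  await fetch("https://example.org/api/flags", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(flag),
  });
}
```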
The experiment sought to understand potential users' motivations, experiences, and practices in using the platform to flag misinformation. Information pollution, in which facts and figures become a source of societal division, affects behavior, social cohesion, and public trust. Left unaddressed, misinformation and disinformation can undermine civic culture by promoting general mistrust and encouraging sub-optimal behaviors.
When information is false or misleading but spread without the intention of harm, it is misinformation. When it is spread deliberately to cause harm or to benefit particular interest groups, it is disinformation. Both are often served to audiences alongside, and with the same weight as, the truth. Audiences must critically evaluate any data or knowledge to identify and flag misinformation and disinformation successfully. However, given the abundance of information available, especially online, it is far easier for people to fall for false information than to sift through and analyze it objectively themselves.
Implementing robust media literacy programs for both adults and children, a need that emerged in the discussions, is necessary to combat the spread of misinformation effectively. Media literacy is a tool for preparing future generations to confront the rising tide of information warfare. Through its Digital Media and Information Literacy Handbook, The Youth Café seeks to equip young people with critical media literacy skills: critical thinking, fact-checking, online safety, social media verification, and quality assessment of online information and sources.
Now more than ever, The Youth Café needs to enhance young people's fact-checking skills to restore trust eroded by fake news, improve their civic online reasoning, and encourage responsible social media use. These skills are critical to reducing political incitement, political strife, tarnished political images, and hate speech in the electoral context, and they are essential to restoring and consolidating democracy in Kenya.
Starting with a quantitative live experiment, the Busara Center for Behavioral Economics observed the natural behaviors (such as user experience, motivations, accuracy, and demographic trends) of 128 users on the platform, followed by a qualitative exercise with 44 of these users. Respondents were drawn from five counties (Kajiado, Kiambu, Machakos, Murang'a, and Nairobi) and represented diverse age groups, ethnic groups, and levels of education. The qualitative exercise sought to capture context-specific insights into user motivations through in-depth interviews and a focus group discussion. Participants were then classified as active, moderate, or low users based on how frequently they used the platform. Most study participants were low users, while only three were deemed active users.
Key findings from the experiment revealed that most participants deemed the Healthy Internet Project (HIP) an appropriate tool for stopping the spread of misinformation. However, internet access challenges and infrequent encounters with harmful content were cited as contributors to the platform's low usage. Participants also mentioned the lack of feedback on their flagged content, not having a computer on which to access the HIP tool, and infrequent internet use.
Most people rely on judgment or intuition to determine whether the information they come across is harmful, or they consider whether it could harm them or others in society. Interestingly, even though the tool is intended to stop the spread of misinformation, 75% of participants used it to flag helpful content. This was due to concerns that flagging negative content was more subjective, might lead to harmful repercussions for those flagged, and was personally risky, especially where political content was concerned.
Naturally, anonymity became a concern. Users feared being identified through their use of the platform, increasing skepticism and aversion to using the Healthy Internet Project (HIP) despite assurances that all flagged content would remain anonymous. To assess user accuracy, PesaCheck, Africa's largest indigenous fact-checking organization, was engaged to validate a sample of the claims flagged during the study. Much of the content flagged as misinformation reflected negative sentiment, such as a dislike for a topic, rather than actual misinformation.
Additionally, it was difficult to determine what constituted misinformation among the flagged content because users rarely specified what was misleading about the websites they flagged. Finally, only 40% of the 128 study participants flagged more than one item using the Healthy Internet Project (HIP) plug-in, which limited the diversity of the data. Given these limitations, the results cannot be generalized.
To improve the use and functionality of the Healthy Internet Project (HIP), the recommendations arising from the experiment are:

- Provide more detail to convince users of their anonymity, addressing the risks they perceive in reporting misinformation.
- Include a detailed description of misinformation to increase the accuracy of user reports.
- Remove the "worthwhile" flag to solidify the purpose of the plug-in, and offer more flagging options with simplified definitions for each, such as "cruelty, violence or intimidation" instead of "abuse or harassment."
- Add a required "misinformation identification" field for easier fact-checking, so that users specify the content they regard as misinformation, for example a particular phrase or sentence rather than a link to a complete article (a hypothetical sketch of such a record follows this list).
- Develop a phone version to improve the tool's responsiveness, incentivize active users, enable social media flagging, and translate the tool into other languages.
- Provide a simple system that shows how feedback is actioned, to increase usage of the tool and demonstrate that user behaviors make a difference. This could be done by connecting fact-checkers to the database of flagged content and feeding their findings back to users.
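As a rough illustration of how these recommendations might reshape the data a flag carries, the TypeScript sketch below assumes hypothetical field names (for example, flaggedText standing in for the required "misinformation identification" field); it is not based on the actual HIP schema.

```typescript
// Illustrative sketch only: a revised flag record reflecting the
// recommendations above (no "worthwhile" option, plainer category wording,
// and a required misinformation-identification field). Names are assumptions.
type FlagCategory =
  | "false or misleading claim"
  | "cruelty, violence or intimidation"
  | "fear-mongering";

interface RevisedContentFlag {
  url: string;
  category: FlagCategory;
  // Required: the specific phrase or sentence the user regards as misinformation,
  // so fact-checkers can review the claim directly rather than a whole article.
  flaggedText: string;
  language?: string;   // to support translation into other languages
  flaggedAt: string;   // ISO timestamp; submissions remain anonymous
}

function isReviewable(flag: RevisedContentFlag): boolean {
  // A flag is only useful to fact-checkers if it identifies the content in question.
  return flag.url.length > 0 && flag.flaggedText.trim().length > 0;
}
```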
The Youth Café works with young men and women around Africa as a trailblazer in advancing youth-led approaches toward achieving sustainable development, social equity, innovative solutions, community resilience, and transformative change.
Contact us for any comments or suggestions.