“Plz RT”: A Social Diffusion Model of Misinformation and Disinformation for Understanding Human Information Behaviour
Natascha A. Karlova and ...
Proceedings of the ISIC2012 (Tokyo)
Additionally, disinformation (as well as misinformation) may reveal the ignorance of the speaker. Disinformation may imply partial disclosure or a false state of the world. For example, imagine that Alice is an expert on giraffes and Erik, perhaps unaware of the extent of her expertise, confidently tries to convince her that giraffes are officially listed as an endangered species. From this exchange, Alice might: 1) suspect that Erik is trying to deceive her and begin questioning his intent, and/or 2) believe that Erik is simply misinformed about the state of the world; both responses are equally possible. These hypothetical examples suggest that misinformation and disinformation may provide different levels of informativeness, depending on the situation.
Model

The field of information behaviour has a strong tradition of model-building to help explain ideas (e.g., Fisher, Erdelez, et al., 2005), and we harness this tradition to introduce a social diffusion model of misinformation and disinformation. To describe misinforming and disinforming accurately as information behaviour, the model (Figure 1) depicts information, misinformation, and disinformation as products of social processes, illustrating how they are formed, disseminated, judged, and used in terms of key elements, beginning with milieux.
Figure 1. Social diffusion model of information, misinformation, and disinformation

Milieux

Information does not form in a vacuum.
Our model seeks inclusivity and context-awareness. Social, cultural, and historical aspects may influence how information, misinformation, disinformation, cues to credibility, and cues to deception are perceived and used. For example, the misinformation and disinformation diffusing throughout Europe about Germany’s rising economic influence may stem from Germany’s history. This example illustrates how information can be perceived as either misinformation or disinformation, depending on the social, cultural, and historical contexts. In this model, information, misinformation, and disinformation are socially-, culturally-, and historically-mediated.
Diffusion

Given these elements, personal and professional social networks, involving positive, negative, and latent ties of varying strengths, are leveraged to diffuse information, misinformation, and disinformation over time. People (as well as governments and businesses) share information even when they do not believe it themselves, and they may not recognize it as inaccurate or deceptive. Naturally, as information diffuses, cues to inaccuracy or deception may change, disappear, or emerge.
Diffusion may be rapid, as in an emergency situation (e.g., an earthquake) or a political mobilization (e.g., #TahrirSquare, #ows); it may adopt a leisurely pace (perhaps because it is low-impact or inconsequential); or it may take a much longer time to diffuse, perhaps due to variabilities such as relevance, value, etc. Information, misinformation, and disinformation also diffuse across geographies, as they travel through social groups across the globe. Social media technologies, such as Twitter and Facebook, have made the diffusion of information, misinformation, and disinformation easier and faster.
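The diffusion dynamics described above can be illustrated with a minimal probabilistic simulation. The network, the tie strengths, and the rule that a tie passes a message on with probability equal to its strength are purely hypothetical assumptions chosen for illustration; none of them are part of the model itself.

```python
import random

# Hypothetical social network: directed ties with strengths in [0, 1].
# Names and strengths are invented for this sketch.
ties = {
    "alice": [("bob", 0.9), ("carol", 0.4)],
    "bob":   [("dave", 0.7), ("erin", 0.2)],
    "carol": [("erin", 0.8)],
    "dave":  [],
    "erin":  [("frank", 0.5)],
    "frank": [],
}

def diffuse(seed, steps, rng=random.random):
    """Simulate diffusion of a message (which may be information,
    misinformation, or disinformation) outward from a seed node."""
    reached = {seed}
    frontier = [seed]
    for _ in range(steps):
        next_frontier = []
        for person in frontier:
            for neighbour, strength in ties.get(person, []):
                # Illustrative simplification: a tie passes the message
                # on with probability equal to its strength.
                if neighbour not in reached and rng() < strength:
                    reached.add(neighbour)
                    next_frontier.append(neighbour)
        frontier = next_frontier
    return reached

print(sorted(diffuse("alice", steps=3)))
```

Varying the number of steps mimics rapid versus leisurely diffusion, and the `rng` parameter makes the sketch easy to test deterministically.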
Unknowns

Information, misinformation, and disinformation are diffused by people, governments, and businesses. The intents behind such diffusion, however, are unknown because they cannot be known. Human intentionality is typically vague and mercurial; it is difficult to know, with any level of certainty, the precise intent of another human being at any given moment. Of course, the presence of intent in communication cannot be denied. But even when asked about their intents, people may be unwilling to express their true intents, unable to recall their original intents, or unable to articulate them. The diffusion of inaccurate and deceptive information may be motivated by benevolent or antagonistic intents, but the nature or degree of the intent cannot be determined solely from behaviours or discourse. Information, misinformation, and disinformation may be diffused without being believed by the speaker. Even when a statement of belief is expressed, the intents behind that statement are unclear. A speaker may diffuse such information as an expression of identity or of relationship within a community, or as a result of perceived social pressure. Some information, misinformation, and disinformation may be believed sometimes by some people, governments, and businesses, but the reasons for belief are as unknowable as the nature of human intentionality.
Deception

After the receivers’ and diffusers’ unknowns in the model depiction, the process usually produces information, misinformation, and/or disinformation. In the production of disinformation, deceivers attempt to deceive. They can only attempt because, even when intent to deceive is present, deception does not guarantee the accomplishment of goals, whether personally or socially motivated. Rubin (2010) cited Walczyk et al. (2008), who argued that deception allows the accomplishment of goals both malevolent (such as suggesting that a co-worker has been embezzling money) and benevolent (such as lying about a surprise party for a friend). People often disinform in the service of socially acceptable expectations, such as the performance of community membership, adherence to cultural values, or avoidance of an argument. In these cases, it seems inappropriate to describe people’s motivations as antagonistic, yet neither do they seem obviously benevolent.
Such a variety of goals illustrates why deception is so complex, and why the nature of intent is often unknowable. For example, if deception is occasionally socially acceptable, then framing intent as either antagonistic or benevolent becomes a false dichotomy and calls into question whether these are the appropriate views on intent.
Therefore, it may be best to view cues to deception as context-dependent or relationship-dependent, such that there might be different sets of cues for different contexts or relationships.
Judgement

Regardless of whether diffusers are attempting to deceive, receivers make judgements about their believability using cues to credibility and cues to deception.
Deceivers use cues to credibility to achieve deception. For example, phishing emails purport to be from legitimate companies (e.g., eBay, PayPal, Facebook, Twitter, etc.) in order to obtain personal information. These emails often use a believable domain name (e.g., email@example.com), the company’s logo and font, and the company’s physical mailing address as cues to credibility. As a defense against such deception, receivers may rely on cues to deception. For example, phishing emails may include header information indicating the email’s true origin, egregiously incorrect spelling and grammar, and external hyperlinks that, when a mouse cursor rolls over them, reveal a suspicious or bogus URL (e.g., hottgirlz.com). But, as described earlier, much of the interpretation of misinformation and disinformation can be influenced by social, historical, and cultural factors.
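One of the cues to deception mentioned above, a hyperlink whose visible text names one domain while the underlying URL points elsewhere, can be checked mechanically. The function below is a simplified illustrative heuristic, not a complete phishing detector; the function name and the example inputs are our own.

```python
from urllib.parse import urlparse

def link_mismatch(display_text, href):
    """Flag a common cue to deception: the visible link text names one
    domain, but the underlying URL points somewhere else entirely.
    A simplified heuristic for illustration only."""
    shown = display_text.strip().lower()
    actual = urlparse(href).hostname or ""
    # The link is suspicious when the actual host is neither the shown
    # domain nor a subdomain of it.
    return not (actual == shown or actual.endswith("." + shown))

print(link_mismatch("paypal.com", "https://paypal.com/login"))     # False
print(link_mismatch("paypal.com", "http://hottgirlz.com/paypal"))  # True
```

As the text notes, such surface cues are fallible in both directions: legitimate mail can trip the check, and careful deceivers can pass it.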
Thus, while cues to credibility may be used by deceivers to deceive, and cues to deception may be used by receivers to defend against deception, neither set guarantees success in deception or defense. Much depends on the degree to which receivers suspect misinformation or disinformation, or the degree to which certain aspects of messages strike them as suspicious, bogus, or benign. Significantly, this section of the model represents the convergence of the information literacy behaviours that occur constantly and simultaneously throughout the entire course of the model.
Use

As receivers use their information literacy skills to make judgements about information, misinformation, and disinformation, that information is used by people, governments, and businesses to make decisions and take action. When recognized as such, information, misinformation, and disinformation can be valuable to people, governments, and businesses. Correcting inaccurate information can present opportunities for meaningful engagement, public awareness and education, and commercial information service provision. People can use disinformation to harness influence over others (e.g., by insinuating knowledge of personal information).
Governments can use disinformation to exercise control over a populace. Businesses can use disinformation to maintain or repair their own reputation or to damage the reputation of a competitor. These examples suggest only a few of the ways that disinformation can be used. Misinformation and disinformation, if recognized, can also be sold or traded, diffused out through social groups, used to attempt to deceive, etc.
Misinformation and disinformation may be used immediately in a situation, soon after such information is received, or they may be kept dormant for later use or verification.
Changes in context may influence how or whether misinformation or disinformation is used. Because the world can change so rapidly, information that receivers had judged as misinformation or disinformation can quickly become information, misinformation, or disinformation. For example, after the earthquake in Japan in March 2011, a Twitter user tweeted that his friend needed to be rescued and asked others to retweet the message. The friend was rescued the following day, but people continued to retweet. This example illustrates how true, accurate information can become misinformation due to a change in context.
Discussion – Implications for information literacy

Information, misinformation, and disinformation develop in social, cultural, and historical milieux. From there, they evolve over time and are diffused by diffusers.
The intents and beliefs of diffusers and receivers, however, are unknown. Diffusers may attempt deception, and, in response, receivers will exercise judgement by looking for cues to credibility and cues to deception. Finally, both diffusers and receivers may use information, misinformation, and disinformation. This process highlights the need for critical analysis of diffusion and information sharing, and of cues to credibility and their usage. These are elements of information literacy, activities in which people engage as they ferret out the nuances of misinformation and disinformation. In this section, we discuss the implications of our proposed model for information literacy.
Since Zurkowski coined the term in 1974, information literacy has been a core service of libraries, with definitions and competencies adopted by organizations worldwide (AASL/AECT, 1998; ACRL, 2008; Garner, 2006), several guides (Lau, 2006; Sayers, 2006), and models for basing services in K-12 schools, colleges, and the workplace, e.g., radical change (Dresang, 2005), Big6 (Lowe & Eisenberg, 2005), seven faces (Bruce, 1997), seven pillars (SCONUL, 2011), the empowering eight (Wijetunge & Alahakoon, 2005), and the information search process (Kuhlthau, 2005). At its essence, information literacy refers to being able “to identify, locate, evaluate, and effectively use information” (Garner, 2006).
Despite observations on the growing importance of context (corporeal and social sources) in understanding and promoting information literacy (see Courtright, 2008, for a good review), however, the prevailing paradigm focuses on individual users engaged in learning about tools and problem-solving processes on their own behalves, sometimes for life-long learning. This orientation reflects Tuominen, Savolainen and Talja’s observation that “Information Literacy thus far has been more of a practical and strategic concept used by librarians and information specialists rather than the focus of empirical research” (2005, p. 330). Regarding public libraries, Lloyd and Williamson (2008, p. 7) concluded, “information literacy research is still in its infancy, with very little research pertaining to community perspectives of information literacy being reported,” adding, “in community and cross-cultural settings, IL [information literacy] may also take on a different shape that cannot be accommodated by library-driven frameworks and standards” (p. 8). Harding (2008) and Walter (2007) also lament the lack of direction for public library participation in information literacy delivery, citing a higher onus of expectation and responsibility.
These calls in the literature for a wider perspective on information literacy, understanding community perspectives, and building greater empirical understanding support our observation that conceptual and empirical investigations must include misinformation and disinformation to reflect the complexities of modern life. Two specific areas in which misinformation and disinformation may be regarded within information literacy are diffusion and sharing, and cues to credibility, discussed below.