Proceedings of the ISIC2012 (Tokyo) 1 of 17
“Plz RT”: A Social Diffusion Model of Misinformation and
Disinformation for Understanding Human Information Behaviour
Natascha A. Karlova and Karen E. Fisher
The Information School
University of Washington
Seattle, WA 98195
Introduction. People enjoy sharing information, even when they do not believe it.
Thus, misinformation (inaccurate information) and disinformation (deceptive information) diffuse throughout social networks, as misinforming and disinforming are varieties of information behaviour. Social media have made such diffusion easier and faster. Many information behaviour models, however, suggest a normative model of information as true, accurate, and complete, despite the ubiquity of misinformation and disinformation.
Analysis. Misinformation and disinformation are defined and we show how they extend the concept of information through their informativeness. Table 1 summarizes the features of information, misinformation, and disinformation. Figure 1 illustrates the social diffusion process by which misinforming and disinforming function as types of information behaviour.
Conclusion. Misinformation and disinformation are closely linked to information literacy, especially in terms of how they are diffused and shared and how people use both cues to credibility and cues to deception to make judgements. Misinformation and disinformation present both challenges and opportunities for individuals, businesses, and governments. Future work in immersive, 3D virtual worlds takes a naturalistic approach to understand the principal elements of cues to misinformation and disinformation.
Keywords. Deception, Disinformation, Information Behaviour, Information Diffusion, Information Literacy, Information Sharing, Intentionality, Misinformation, Rumour, Social Network

Introduction

In Japan after the March 2011 earthquake, radiation leaked from the Fukushima nuclear power station. The Japanese government declared that it is working on a cleanup effort, but people question whether this is true, and are uncertain about the safety of returning to their homes.
In spring 2012, European economies are struggling, and German Chancellor Merkel is making political moves to bring other countries in line with German fiscal policies. Consequently, there are growing concerns about Germany’s power in the European Union, and people are wondering about the implications of strong German influence on Eurozone economic policies.
These examples demonstrate possible consequences of inaccurate and deceptive information: suspicion, fear, worry, anger, and decisions resulting from these consequences. As gossip and rumours abound, it is difficult to distinguish among information, misinformation, and disinformation. People enjoy sharing information, especially when it is ‘news’. Although they may not believe such information themselves, they take pleasure in disseminating it through their social networks. In this way, misinformation (inaccurate information) and disinformation (deceptive information) easily diffuse, over time, across social groups. Social media, such as Twitter and Facebook, have made dissemination and diffusion easier and faster. High-impact topics, for example, health, politics, finances, and technology trends, are prime sources of misinformation and disinformation in wide-ranging contexts, for example, business, government, and everyday life.
Despite the plethora of inaccurate and misleading information in the media and in online environments, traditional models of information behaviour seem to suggest a normative conception of information as consistently accurate, true, complete, and current, and they neglect to consider whether information might be misinformation (inaccurate information) or disinformation (deceptive information). To better understand the natures of misinforming and disinforming as forms of information behaviour, we build from such normative models and propose our own model of misinformation and disinformation. Our model illustrates how people create and use misinformation and disinformation. In our discussion, we argue that misinformation and disinformation need to be included in considerations of information behaviour, specifically elements of information literacy, because inaccuracies and deceptions permeate much of the world’s information.
The purpose of our paper is thus to: 1) demonstrate how misinformation and disinformation are most usefully viewed as forms of information; 2) illustrate the social diffusion process by which misinforming and disinforming function as types of information behaviour; and 3) elucidate how misinforming and disinforming can be modelled to account for vivid examples in different domains.
Extending information

Information scientists have long debated the nature of information: what it is, where it comes from, the kinds of actions it affords humans, and so on. Misinformation and disinformation remain limited and understudied areas in efforts to understand the nature of information (Rubin, 2010; Zhou & Zhang, 2007). From its earliest stages, information science has sought to define information, beginning with Shannon and Weaver’s (1949) idea that information can be quantified as bits of a signal transmitted between one sender and one receiver. This model does little to clarify misinformation and disinformation, because they may carry multiple, often simultaneous levels of bits and signals (as opposed to one signal), and because describing misinformation and disinformation as merely ‘noise’ ignores their informativeness (discussed below). Later, Taylor (1962) argued for the need to study “the conscious within-brain description of the [information] need” (p. 391). Belkin and Robertson (1976) notably advocated for a view of information as “that which is capable of transforming structure” (p. 198) inside a user’s mind. Dervin and Nilan (1986), in a landmark ARIST paper, contended that information ought to be viewed “as something constructed by human beings” (p. 16).
Tuominen and Savolainen (1997) articulated a social constructionist view of the nature of information as a “communicative construct which is produced in a social context”. They focused on discursive action as the means by which people construct information. A constructionist view of information is useful when discussing misinformation and disinformation because it emphasizes social context and conversations among people as ways of determining what information is and what can be informative. Misinforming and disinforming are information behaviours which may occur in discourse between people, and so, through this conversational act, misinformation and disinformation can be information people may use to construct some reality. In this way, misinformation and disinformation are extensions of information.
Misinformation

Unfortunately, misinformation does not seem to earn the attention it deserves.
While the Oxford English Dictionary defines misinformation as “wrong or misleading information”, few authors have discussed the topic in detail. Authors commonly cite the OED definition without further analysis or discussion (e.g., Bednar & Welch, 2008; Stahl, 2006). Fox’s (1983) pioneering work on misinformation clearly delineated the relationship between information and misinformation. Fox (1983) stated that “information need not be true”; that is, there is no reason that information must be true, so misinformation may be false. He wrote that “misinformation is a species of information” and thus drew the relationship clearly: misinformation, albeit false, is still information and, therefore, can still be informative.
In an article about the nature of information, Losee (1997) stated that misinformation may simply be information that is incomplete. Zhou and Zhang (2007) added to this discussion with additional types of misinformation, including concealment, ambivalence, distortion, and falsification (because they do not disambiguate between misinformation and disinformation). However, incomplete and even irrelevant information may still be true, accurate, current, and informative, and, therefore, meet many of the same qualifications commonly accepted for information.
Karlova and Lee (2011) added that misinformation may also be inaccurate, uncertain (perhaps by presenting more than one possibility or choice), vague (unclear), or ambiguous (open to multiple interpretations). Information that is incomplete may also be a form of deception, which frequently qualifies as disinformation.
Disinformation

The OED describes disinformation as “deliberately false information”, and states that the term comes from the Russian dezinformacija, coined in 1949. Given the political and cultural milieu in the Soviet Union at that time, the strong association between disinformation and negative, malicious intent probably developed as a result of Stalinist information-control policies. Since the term was coined relatively recently, perhaps it is not surprising that little work has explored the concept. Authors typically treat disinformation as a kind of misinformation (Losee, 1997; Zhou, Burgoon, et al., 2004). Fallis (2009), however, analyzed disinformation to uncover sets of conditions under which disinformation may occur. He concluded that “…while disinformation will typically be inaccurate, it does not have to be inaccurate. It just has to be misleading. So, disinformation is actually not a proper subset of inaccurate information [misinformation]” (p. 6). Fallis argued that disinformation can be misleading in the context of a situation. His analysis of disinformation builds further support for a subjective, constructionist view of information, as articulated by Hjørland (2007).
Although disinformation may share properties with information and misinformation (e.g., truth, accuracy, completeness, currency), disinformation is deliberately deceptive information. The intentions behind such deception are unknowable, but may include socially-motivated, benevolent reasons (e.g., lying about a surprise party, adhering to cultural values, demonstrating community membership, etc.) and personally-motivated, antagonistic reasons (e.g., manipulating a competitor’s stock price, controlling a populace, ruining someone’s reputation, etc.). Since misinformation may be false, and since disinformation may be true, misinformation and disinformation must be distinct, yet equal, sub-categories of information.
Informativeness of misinformation and disinformation

How can we be informed by misinformation and disinformation?
Buckland (1991) wrote that “[b]eing “informative” is situational” (double quote marks in original). In this sense, “informativeness” depends on the meaning of the informative thing (e.g., a sentence, a photo, etc.). Different situations imbue different things with different meanings, and these meanings may depend on the knowledge of the receiver.
Buckland’s idea illustrates why misinformation can be difficult to define and to identify:
what is misinformation in one situation might not be in another because the meanings might be different. The act of disinforming may be weakly situation-dependent compared to misinforming because the intent of the speaker is a constant, even if the speaker does not act on that intent. A deceiver will intend to deceive, regardless of the situation, but someone who simply misinforms may not intend to do so.
However, the success, or failure, of the deceiver may be strongly situation-dependent if some aspect of the world changes, unbeknownst to the deceiver, between the time that he speaks and the time that the receiver acts upon the disinformation. For example, Jack wishes to deceive Sarah (for unknown reasons) and tells her that the movie starts at 15:30, even though he knows that it starts at 15:00. However, Jack is unaware that the movie theater projector is broken and that the start of the movie is delayed by 30 minutes. When Sarah arrives in time for a 15:30 showing, she may not realize that Jack made either a false or an inaccurate statement. This case illustrates two important aspects of disinformation: the deceiver failed to disinform, despite intending to do so, and the informativeness of (dis)information may depend on the situation.
In his influential article, Buckland (1991) advocated the view that, depending on the situation, information is a thing, a process, and knowledge, because his focus was on understanding informativeness. Misinformation and disinformation may also be things, processes, or knowledge, and therefore informative, by implying or revealing information. The speaker of misinformation may reveal information (perhaps accidentally) or may imply information or a state of the world. Misinformation tends to be accidental, but its informativeness may depend on the relationship between the speaker and the receiver. Disinformation may be more informative than misinformation, perhaps because any revelation or implication may be deliberate.
Consider an instance in which a speaker provides partially distorted information to the receiver (e.g., “The new phone comes out next year,” when, in fact, the new phone comes out this year). In this case, the receiver is partially informed about the fact that a new phone is coming out. Disinformation may reveal the malicious intent of the speaker. If the receiver happens to know that the new phone comes out this year, she might suspect that the speaker is intending to deceive her. Here, the receiver is informed about the potential intent of the speaker, which is external to the message actually being delivered.