
“Plz RT”: A Social Diffusion Model of Misinformation and Disinformation for Understanding Human Information Behaviour
Natascha A. Karlova and ...


Diffusion and sharing

As discussed, people enjoy sharing information and are naturally inclined to do so, especially when it piques their interest or when they think a friend would benefit (Coward & Fisher, 2010; Shibutani, 1955). In this way, people diffuse information through networks over time. Not all networks, however, are connected. Burt (1992, 2004) uses the term “structural holes” to describe these disconnections, the empty spaces between networks. By filling these holes and linking sets of networks, information brokers play a crucial role in the diffusion of information. Because brokers connect sets of networks, they are enormously powerful and may easily diffuse misinformation and disinformation with limited consequence to their reputation within any one network. Further, members of disparate networks may be unable to verify information received from their network’s broker because that broker may be the sole contact between networks. Again, the ambiguity of human intentionality clouds the motivations behind a broker’s decision to diffuse a piece of information between otherwise disconnected networks.

As information diffuses through networks, it can reach a saturation point, such that most or all of the people in the network are aware of the information, misinformation, or disinformation. They may not, however, recognize misinformation or disinformation as such. In her work on the diffusion of employment-related information among low-income workers, Chatman (1986) found that “information has limited utility when diffused” (p. 384). Here, Chatman described a complete saturation of information in a (relatively) small network with few connections to other networks. In terms of misinformation and disinformation, however, diffusion saturation may not reduce utility.
For example, misinformation may be incomplete or inaccurate, yet become useful even after complete saturation, when members of a network find other information to complete or correct it. Conversely, when disinformation has diffused throughout a network, an outsider may use that disinformation to deceive some members of that network.
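The broker dynamic described above can be made concrete with a toy diffusion model. The sketch below is illustrative only and not part of the original study; the network, the node names, and the push-style spreading rule are our own assumptions. It shows two otherwise disconnected networks that reach full saturation only because a single broker node bridges the structural hole between them:

```python
def diffuse(adjacency, seed, steps):
    """Push-style diffusion: at each step, every informed node
    shares the message with all of its neighbours."""
    informed = {seed}
    for _ in range(steps):
        informed |= {nbr for node in informed for nbr in adjacency[node]}
    return informed

# Two small networks (A* and B*) joined only through broker "X";
# removing X would leave a structural hole between them.
adjacency = {
    "A1": ["A2", "A3"], "A2": ["A1", "A3"], "A3": ["A1", "A2", "X"],
    "B1": ["B2", "B3"], "B2": ["B1", "B3"], "B3": ["B1", "B2", "X"],
    "X":  ["A3", "B3"],
}

# A message seeded in network A saturates network B only via the broker.
print(sorted(diffuse(adjacency, seed="A1", steps=4)))
# ['A1', 'A2', 'A3', 'B1', 'B2', 'B3', 'X']
```

Note that in this toy setup the B-side nodes have no path to the message independent of X, which mirrors the verification problem raised above: members of one network cannot check what their sole broker tells them.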

Cues to credibility

In networks, misinformation and disinformation may spread easily because cues to deception and cues to credibility may shift in meaning, relevance, and context. A cue to credibility (or deception) in one network may function as a cue to deception (or credibility) in another. For example, wearing a business suit may signal credibility among business executives but cue deception among artists. Information users look for cues to credibility when making judgements about information. In these situations, cues to credibility are necessary tools, both for users and for creators of information: they communicate legitimacy and trustworthiness to an audience. Deceivers, however, also rely on cues to credibility, often for the same goals as non-deceivers (e.g., trust, believability).

For example, deceptive political advertising may feature actors dressed as firefighters and police officers because these people have influence and respect in the community.

The advertisement, in this case, leverages the community’s esteem for these people as a cue to credibility. Because cues to credibility can be used in deceptive ways, their utility becomes questionable. Perhaps common cues to credibility have become too malleable. For example, ordinary consumers may not know whether an item on eBay is authentic or counterfeit when sellers use common cues to credibility (e.g., official logos, photos). Deceivers, however, also often reveal, or leave behind, cues to deception.

Cues to deception can include physical cues (e.g., dilated pupils, elevated heart rate; DePaulo, Lindsay, et al., 2003), verbal cues (often influenced by the type of relationship; Buller & Burgoon, 1996), and textual cues (e.g., excessive quantity, reduced complexity; Zhou, Burgoon, et al., 2004). Online environments (e.g., Facebook, Twitter, World of Warcraft) may offer additional or alternate sets of cues; future research may help uncover them. While cues to credibility can be used by both deceivers and non-deceivers to influence receivers of information, we argue that cues to deception may be more useful to ordinary consumers, because deceivers generally do not use cues to deception to convince others. Receivers can perceive cues to deception and use them as a defense against deception and as a basis for judgements about its likelihood. Nonetheless, because consumers may use a combination of cues to credibility and cues to deception to form judgements about information, information literacy efforts should include ways of recognizing misinformation and disinformation.
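As an illustration only (the feature names and measurements below are our own simplifications, not the linguistic features used by Zhou, Burgoon, et al., 2004), textual cues such as excessive quantity and reduced complexity can be approximated with very simple word-level statistics:

```python
def textual_cues(message):
    """Approximate two textual cues from the deception literature:
    quantity (word count) and complexity (average word length and
    type-token ratio, i.e. vocabulary diversity)."""
    words = message.lower().split()
    if not words:
        return {"quantity": 0, "avg_word_len": 0.0, "type_token_ratio": 0.0}
    return {
        "quantity": len(words),
        "avg_word_len": sum(len(w) for w in words) / len(words),
        "type_token_ratio": len(set(words)) / len(words),
    }

# A repetitive, padded denial scores high on quantity but low on
# type-token ratio (one simple proxy for reduced complexity).
cues = textual_cues("I never did it I never ever did it I swear I never did")
print(cues["quantity"], round(cues["type_token_ratio"], 2))  # 14 0.43
```

No single measurement like this identifies deception on its own; the point is only that textual cues, unlike physical ones, survive into text-only channels and can be computed by a receiver.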

Conclusions

Misinforming and disinforming are forms of information behaviour, specifically regarding information literacy. Misinformation and disinformation extend the concept of information through their informativeness. Table 1 summarizes the features of information, misinformation, and disinformation, and Figure 1 illustrates how the three evolve through social diffusion processes. In this model, information, misinformation, and disinformation are generated in various cultural, historical, and social milieux. Through processes of social diffusion, people come to make judgements (and deceivers may seek to deceive), influenced by their degree of suspicion, and either use or do not use the information, misinformation, or disinformation. Because misinforming and disinforming are types of information behaviour, their links to information literacy are numerous, including diffusion and sharing and cues to credibility.

In this paper, we are not trying to determine or describe how people make judgements about information or about cues to deception or credibility. We are, however, suggesting that these topics demand further attention from information science. If information science is to understand the nature of trustworthiness, credibility, cognitive authority, and related topics, then misinformation, disinformation, and cues to deception must be included in the research agenda. We anticipate that this model will open additional avenues for further research.

Additionally, the truth or falsity of information cannot be determined ‘objectively’. We hope, however, that information literacy efforts are sufficiently effective that individuals and teams can make situational decisions about the degree of truth or falsity of information. That is, regardless of whether truth or falsity can be determined ‘objectively’, people still need information and make decisions about it based on their subjective determinations of truth or falsity.

Misinformation and disinformation can have serious consequences for governments, people, businesses, information professionals, and user experience designers, among other groups. Misinformation is problematic largely because it can create confusion and mistrust among receivers and can make information difficult to use. For example, receivers may feel uncertain about the information, and therefore uncertain about whether they can take action or make a decision. If receivers recognize the errors, they may seek another information source, repeat their previous work, or compensate in some other way. For librarians, information architects, and other information professionals, misinformation in metadata may cause web pages, for example, to be incorrectly indexed and absent from appropriate search results. User experience designers should understand that confusing or even conflicting information can ‘break’ the user experience by disrupting the flow of use.

Misinformation can cause credibility problems as well. When governments or companies provide erroneous information, receivers may question whether the government or company is a legitimate and authoritative source. Receivers may also begin to suspect that information that appears to be inaccurate is actually misleading, that is, disinformation. Misinformation, however, is not always easily detected. An exploration of how people determine and use cues to misinformation can illuminate methods of detection. But the difficulty of detection is only one aspect of misinformation.

Misinformation offers opportunities for users to leverage their experiences to improve available information. For example, when e-government initiatives use crowdsourcing, misinformation can be corrected in datasets, bus schedules, city council meeting notes, voters’ guides, and other information produced and disseminated by governments at all levels. When the public is invited to improve or correct misinformation about a product, company, or service, such as errors in books or users’ manuals, misinformation can offer opportunities for engagement and create lasting and meaningful experiences for users and consumers. Because misinformation may result from accidental errors, experts, such as medical doctors, scientists, and other professionals, can seize the opportunity to educate information users. Thus governments, companies, and professionals can harness misinformation as an opportunity for crowdsourced corrections, for meaningful engagement, and for education.

Disinformation can have significant consequences for individuals, governments, and companies. When individuals believe deceptive information, it can influence their actions and decisions. For example, if her accountant lies to Kelly by telling her she owes more in taxes than she actually does, Kelly may decide to seek a second opinion, or she may give her accountant additional money. In this example, the accountant disinforms Kelly by leading her to believe a false state of affairs.

Governments may also be susceptible to disinformation. For example, a gang running drugs may try to deceive law enforcement about their location by discussing their location over the phone, knowing it has been tapped. In this case, the gang disinforms law enforcement by leading them to believe a false situation. Disinformation can cause negative effects for businesses as well. A company’s reputation may be damaged, perhaps by a competitor, market speculators, or industry-wide struggles. For example, a rumor about a possible bank failure may cause a run on the bank, as happened in Latvia in December 2011. In a state of information uncertainty, people queued for hours to withdraw their funds because they did not trust the bank.

Disinformation also provides business opportunities. Online Reputation Management (ORM) firms, such as Metal Rabbit Media and Reputation.com, rely on disinformation to serve their clients. In ORM, people try to control the information about them that is available online, particularly via search engines such as Google. Some people find the task sufficiently daunting that they hire professional ORM firms to manage their online reputation. In the service of their clients, these companies can provide websites, portfolios, Twitter streams, blogs, Flickr accounts, Facebook pages, and so on. The extent of the content depends on the level of service for which the client pays.

ORM firms can also leverage Search Engine Optimization (SEO) techniques to ensure that their content appears near the top of a search result list. These firms’ services exemplify disinformation because the information they provide is often true, accurate, and current, yet deceptive: it is intended to show the client in a different light. By studying highly nuanced cues to disinformation, these firms can improve their services and leave smaller, less noticeable cues. The field of marketing offers further opportunities to harness misinformation and disinformation. Guerrilla, undercover, and viral marketing may use deceptive techniques (e.g., evasion, exclusion, vagueness) to market products and services to often unsuspecting consumers. Coolhunting is a type of marketing dependent on rumor tracking, and it is strongly subject to misinformation and disinformation because the aesthetic of cool often requires secrecy and because rumors may be unreliable information sources.

In future work, we are adopting a naturalistic approach to observe and capture the richness and dynamism of misinforming and disinforming as forms of information behaviour and literacy in real-life settings, such as 3D, immersive virtual worlds (e.g., Second Life, World of Warcraft, Star Wars: The Old Republic). These environments present the challenges of computer-mediated communication, such as a lack of physical cues, but may offer opportunities for users to use other cues to disambiguate between misinformation and disinformation, and thus serve as excellent candidates for exploring the concepts discussed in this paper. While work on textual cues in email exchanges (Zhou, Burgoon, et al., 2004) begins to address the lack of physical or verbal cues, work in 3D, immersive worlds may provide additional or alternate sets of cues. These new sets may prove useful as immersive environments become increasingly common in classrooms and boardrooms. For example, there is some evidence to suggest that as players develop their skills, they develop “avatar literacy”: they can visually “read” an avatar and learn a great deal of information. Teamwork, moreover, plays a key role in many online games. Players self-organize into distributed, often asynchronous teams. Recognizing the influence of team dynamics, a few questions arise: How might such teams collaboratively distinguish cues to misinformation from cues to disinformation? How might a team agree, to some extent, on what constitutes a “cue”? How might they respond to or use such cues? Much of the literature on misinformation, disinformation, and cues has focused on individual deceivers or pairs of conversants. By focusing on teams, we can better understand the core elements of misinformation and disinformation: relationships and context.

Proceedings of the ISIC2012 (Tokyo) 15 of 17


