Thursday 3 May 2018

Social construction of dis/misinformation

The presence of dis/misinformation in the knowledge economy presents several social issues with untoward effects for public health, industry, and political discourse. As I've said before, it's my observation that the dissemination and consumption of dis/misinformation is a social act.



Misinformation about health and medicine abounds in the digital environment. Examples include antivaccination campaigns, the popularity of homeopathy despite scientific consensus, and sensationalist claims about the efficacy of unorthodox treatments. In a study of Pinterest, Guidry, Carlyle, Messner, and Jin (2015) found that the majority of information shared about vaccination on the platform was misleading. In an analysis for The Independent, Forster (2017) found that the vast majority of the most popular health-related stories shared on Facebook were not only inaccurate or misleading, but also attempted to undermine trust in more authoritative sources of medical information, such as doctors and scientists. The end result of health misinformation is a loss of trust in established healthcare and a decline in rates of reporting, diagnosis, treatment, and, most disturbing of all, recovery.

Health information is also susceptible to disinformation when commercial interests are involved. As Kurland (2002) says, “We have seen this with the tobacco, lead paint, petroleum, pharmaceutical, and asbestos industries” (p. 499). This is part of an increasingly disturbing trend where information is deliberately suppressed, misrepresented, or outright fabricated by industry. Capitalism demands that we take a potentially unlimited resource, such as knowledge or information, and make it conform to the rules of scarcity. Freely accessible and accurate information is a valueless enterprise in a capitalist society. Marshall, Goodman, Zowghi, and da Rimini (2015) discuss how, in a capitalist environment, it is only when information becomes hidden, restricted, or privileged (regardless of accuracy) that it possesses value. And so through the mechanisms of subscriptions, authentication, Digital Rights Management, and the application of intellectual property to information, we find a means for profit. The knowledge economy necessitates that information become an economic product, and as a product it becomes susceptible to manipulation for increased profit. Working within this system, the problem of dis/misinformation could be partially addressed if value were placed not on information as such, but on accurate information.

Elsewhere, the decrying of ‘fake news’ and ‘alternative facts’ is an ongoing conversation. In an opinion piece for Global Policy, Al-Rodhan (2017) suggests a greater public presence from scientists as one of several possible countermeasures. What about a public presence of information professionals? Tarango and Machin-Mastromatteo (2017) argue that information professionals can contribute to society by becoming knowledge producers in their own right. We might start by producing quality information that counters dis/misinformation in the social sphere. But I should stress that there are limits to the kinds of information we can safely and authoritatively comment on, and it should be recognised that not all of us possess the scientific competencies or vocational opportunities necessary for such work.


Photo by Kim Gorga on Unsplash

Disinformation from industry has also affected political discourse, as with the global disinformation campaigns waged by oil and gas corporations around climate change (Brown, 2017). Van der Linden and colleagues (2017) were able to inoculate against dis/misinformation about climate change by pre-emptively highlighting false claims and refuting potential counter-arguments, creating “attitudinal resistance.” It might be possible to use this method at the point of service, during information enquiries, by first drawing attention to the prevalence of dis/misinformation related to the query. However, I’m skeptical of the practical value of such an intervention, and of whether it would be well received. We have an ePrivacy Directive (2002); why not an eLiteracy one? What if websites and news media were required to display a message stating that some of the information contained therein may be economically motivated, personal opinion, inaccurate, or misleading?

Political discourse is especially susceptible to dis/misinformation, despite efforts to curb it. In the US, fact-checking organisations have been used to assess the accuracy of political statements since the late 1980s. Fact-checking and mythbusting organisations are useful tools for information professionals to employ when identifying socially mediated dis/misinformation, and they have for some time been effective at highlighting dis/misinformation in social and political spheres, often in real time. However, Poulsen and Young (2018) note that fact-checking organisations failed in the face of misstatements (deliberate or otherwise) during the 2016 American electoral campaign because they could keep up with neither the speed nor the volume of dis/misinformation at the time. Information literacy is “about taking time” to make discreet, informed decisions. The key takeaway is that if we want to help in the identification and avoidance of dis/misinformation in political discourse, we will first need to address these new obstacles of scale and speed. In intelligence organisations, political disinformation falls under PSYOPS (psychological operations), while military think tanks have christened the rise of digital disinformation the ‘weaponized narrative’ (Allenby, 2017). The weaponized narrative as a tool for use in the political sphere is chillingly summarised by Carole Cadwalladr (2018):

In 2014, Steve Bannon – then executive chairman of the “alt-right” news network Breitbart – was Wylie’s boss. And Robert Mercer, the secretive US hedge-fund billionaire and Republican donor, was Cambridge Analytica’s investor. And the idea they bought into was to bring big data and social media to an established military methodology – “information operations” – then turn it on the US electorate.

This type of disinformation is difficult to identify because, supported by powerful organisations and elaborate systems, it has been designed to appear legitimate. It appears so genuine that its “ability to publish untruths and half-truths in peer-reviewed journals is destructive of the public’s capacity to make informed judgement” (Kurland, 2002, p. 499). Of the literature I’ve come across, none has attempted to describe how this particular type of disinformation can be countered by anyone, let alone librarians. But I think we should recognise that the consumption of dis/misinformation of any kind is a social activity, and any attempt to remedy it should be an equally social undertaking. Apportioning blame and responsibility solely to infrastructure, systems, and platforms overlooks this completely.




Allenby, B. R. (2017). The age of weaponized narrative: Or, where have you gone Walter Cronkite? Issues in Science and Technology, 33(4), 65–70.
Al-Rodhan, N. (2017, July). Post-truth politics, the fifth estate and the securitization of fake news. Global Policy. Retrieved from https://www.globalpolicyjournal.com/blog/07/06/2017/post-truth-politics-fifth-estate-and-securitization-fake-news
Brown, D. A. (2017). The enormity of the damage done by the climate change disinformation campaign as the world struggles to implement the Paris Agreement. In L. Westra, J. Gray, & F. T. Gottwald (Eds.), The role of integrity in the governance of the commons: Governance, ecology, law, ethics (pp. 125–139). Cham: Springer. https://doi.org/10.1007/978-3-319-54392-5_8
Cadwalladr, C. (2018, March 18). ‘I made Steve Bannon’s psychological warfare tool’: Meet the data war whistleblower. The Guardian. Retrieved from https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump
Guidry, J. P. D., Carlyle, K., Messner, M., & Jin, Y. (2015). On pins and needles: How vaccines are portrayed on Pinterest. Vaccine, 33(39), 5051–5056. https://doi.org/10.1016/j.vaccine.2015.08.064
Kurland, J. (2002). The heart of the precautionary principle in democracy. Public Health Reports, 117(6), 498–500. https://doi.org/10.1016/S0033-3549(04)50194-2
Marshall, J. P., Goodman, J., Zowghi, D., & da Rimini, F. (2015). Disorder and the disinformation society: The social dynamics of information, networks and software. Abingdon: Routledge. https://doi.org/10.4324/9781315693460
Poulsen, S., & Young, D. G. (2018). A history of fact checking in U.S. politics and election contexts. In B. G. Southwell, E. A. Thorson, & L. Sheble (Eds.), Misinformation and mass audiences (pp. 232–248). Austin: University of Texas Press.
Tarango, J., & Machin-Mastromatteo, J. (2017). The role of information professionals in the knowledge economy. Amsterdam: Elsevier.
