Understanding and neutralising covid-19 misinformation and disinformation

  1. Yuxi Wang, research fellow1
  2. John Bye, independent researcher2
  3. Karam Bales, independent researcher and freelance journalist3
  4. Deepti Gurdasani, senior lecturer in machine learning4
  5. Adityavarman Mehta, PhD candidate5
  6. Mohammed Abba-Aji, research fellow6
  7. David Stuckler, professor of social and political science1
  8. Martin McKee, professor of European public health7

  1. Dondena Centre for Research on Social Dynamics and Public Policy, Department of Social and Political Science, Bocconi University, Milan, Italy
  2. Woking, UK
  3. National Education Union, London, UK
  4. The William Harvey Research Institute, Faculty of Medicine and Dentistry, Queen Mary University of London, London, UK
  5. University of Leeds, Leeds, UK
  6. Department of Epidemiology, Boston University School of Public Health, Boston, Massachusetts, USA
  7. Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine, London, UK

  Correspondence to: Y Wang yuxi.wang{at}unibocconi.it

Yuxi Wang and colleagues say that the public inquiry on covid-19 must look at who was opposing public health measures, and why, and should call on public health authorities to engage more effectively with the threat of infodemics

Key messages

  • Research on the political and commercial determinants of health points to the importance of understanding how evidence is generated and promulgated

  • During the covid-19 pandemic, several groups have been active in opposing evidence based public health measures

  • A rapid rise in misinformation and disinformation in digital and physical environments over a short period is called an “infodemic”

  • Active management of infodemics must form part of a comprehensive pandemic response

  • Further investigations into the social and public health effects of misinformation groups are needed to inform policy

Much rests on the public inquiry into the UK’s preparedness and response to the covid-19 pandemic (https://covid19.public-inquiry.uk/), with organisations and individuals scrutinised about the advice they gave and the decisions they made. The discussion will likely centre on the science, but it will also consider ideology, in particular the relation between individuals, society, and the state. When is it justifiable, for example, to impose restrictions on one group of people to protect others? Some people take the view that it hardly ever is.

Throughout the pandemic, some people have opposed almost all measures introduced by governments at Westminster and in the devolved administrations, from the initial lockdown to mask mandates and vaccination certificates. Their messages are similar to those promulgated by adherents to an extreme libertarian philosophy that is now prominent in some sections of society in the United States. Some benefit from generous funding from those opposed to what they term “big government,”12 and some of their messaging has been claimed to include evidence that is fabricated, distorted, or taken out of context.3

Inevitably, given the complex technical issues involved, differentiating fact from fiction can be difficult. One argument asserts that, because everyone has vested interests, including those promoting public health, all sources should be treated in the same way. This was set out in the Brussels Declaration, which was drafted with substantial input from the tobacco and alcohol industries.4 But there is now a large body of evidence from researchers working on the commercial determinants of health that contradicts this,5 emphasising the importance of seeing the full picture, including who says what and what is left unsaid.6

The covid-19 inquiry team has now reported on the consultation about its terms of reference. Those analysing the responses found that 15% of submissions were “campaigns and duplicates.”7 This raises the question of what a campaign is. When different groups submit versions of the same text, the connection is obvious. But other links are less obvious: for example, the BBC reports that UsForThem, which has attracted high level support from politicians in its campaign against restrictions in schools, has links with the Health Advisory and Recovery Team (HART), which in turn has worked on a campaign against children being vaccinated against covid-19.8910 HART, meanwhile, shares members with groups that have opposed vaccination, such as the UK Medical Freedom Alliance and the Children’s Health Defence.

Lady Hallett, an experienced judge and chair of the covid-19 public inquiry, will be accustomed to assessing the veracity and quality of evidence presented. But it can be extremely difficult to get a complete picture of how evidence has been generated, framed, and presented.11 Understanding how the tobacco industry has distorted science, for example, has only been possible by having access to a trove of internal documents released under court order in the United States.12

Leaked online chats among individuals affiliated with some of the groups cited above shed light on links between campaigners against interventions to tackle covid-19 and some politicians, journalists, and members of the scientific community.9 If the inquiry is to obtain a full picture of events during the pandemic, it would benefit from seeing these chats. Fortunately, they are now in the public domain.

Infodemics—a key part of pandemic management

There are two types of misleading information: misinformation and disinformation. They differ in terms of intent; the latter is created with the intention of deceiving. Without additional information, such as the tobacco industry documents mentioned above, it can be difficult to differentiate between them. Their spread—often referred to as an “infodemic”—is now widely acknowledged to be a threat to the global efforts towards ending the pandemic.13 In times of crisis, people are more susceptible to misinformation, disinformation, and conspiracy theories probably because their important psychological needs are unfulfilled, leading to frustration.14

Covid-19 related misinformation and disinformation spread through society from the top down and the bottom up. One study identified politicians, celebrities, and other prominent public figures as sources of covid-19 misinformation and disinformation.15 Even though these sources produced only about 20% of the misleading information, they accounted for 69% of total social media engagement.15 Further evidence on the critical role of politicians in driving covid-19 misinformation and disinformation comes from a comprehensive survey of the traditional and online media landscape.16 The authors concluded that Donald Trump was “likely the largest driver of the covid-19 mis/disinformation ‘infodemic,’” accounting for 37.9% of mentions in the content of identified news articles.16 The Center for Countering Digital Hate, a non-profit organisation based in the UK and the US, analysed over 812 000 posts from Facebook and Twitter in the first quarter of 2021 and identified 12 people responsible for 65% of covid-19 anti-vaccine content, whom they dubbed the “disinformation dozen.”17 These people include physicians who are alleged to have turned to pseudoscience, anti-vaccine entrepreneurs promoting alternative treatments, and organisations that have long opposed childhood vaccination.
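
Figures such as these come from aggregating the reach of misleading posts by who produced them. As a purely illustrative sketch of that kind of calculation (the dataset, field names, and numbers below are invented and do not reflect the methodology of the studies cited above), one might compute each source type's share of posts and of engagement like this:

```python
# Illustrative only: share of posts versus share of engagement by source type,
# from a hypothetical set of posts already flagged as misleading.
# All data and names are invented for this sketch.
from collections import defaultdict

# (source_type, engagement_count) for each flagged post
posts = [
    ("prominent_figure", 12000),
    ("prominent_figure", 9500),
    ("prominent_figure", 7000),
    ("ordinary_user", 300),
    ("ordinary_user", 800),
    ("ordinary_user", 150),
    ("ordinary_user", 220),
    ("ordinary_user", 50),
]

post_counts = defaultdict(int)
engagement_totals = defaultdict(int)
for source, engagement in posts:
    post_counts[source] += 1
    engagement_totals[source] += engagement

total_posts = len(posts)
total_engagement = sum(engagement_totals.values())

for source in post_counts:
    print(f"{source}: {post_counts[source] / total_posts:.0%} of posts, "
          f"{engagement_totals[source] / total_engagement:.0%} of engagement")
```

A small number of highly followed accounts can dominate engagement in this way, which is why the studies above distinguish the volume of misleading content from the attention it receives.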

Misinformation and disinformation manufactured and spread by the public can also generate substantial engagement,1518 so strategies aimed at tackling infodemics should target both top-down and bottom-up spread. In doing so, it is essential to understand the nature of any misinformation and disinformation being promoted as it has the potential to spread fear and possibly cost lives.19 A substantial majority (88%) of the false or misleading claims identified by Simon and colleagues were on social media platforms; television, news outlets, and other websites accounted for 9%, 8%, and 7%, respectively.15 The misleading content that received the highest engagement (29%) typically contained a small degree of accurate information that was re-contextualised and twisted; misinformation and disinformation that included doctored images and videos received the next highest (24%).

Evanega et al looked at 38 million traditional media news articles published in English worldwide. The three most prevalent topics of misinformation, disinformation, and conspiracy theory related to miracle cures for covid-19, conspiracies involving “deep state” actors paying prominent figures associated with the response to covid-19, and the US Democratic Party manufacturing covid-19 to coincide with Trump’s impeachment. Claims that the Wuhan laboratory was a secret bioweapons facility, that Bill Gates had foreknowledge of the pandemic, and that 5G technology has deleterious health effects were also mentioned.16

A national survey of US adults further investigated the popularity of different types of misinformation.20 It showed that conspiracy theories endorsed by visible partisan figures received higher levels of support, measured by participants’ belief in the misinformation, than non-partisan medical misinformation about the treatment and transmission of covid-19.20 This indicates that people are more likely to believe abstract theories about the nefarious motives of political figures than they are to believe potentially harmful but non-ideological health misinformation.20 Moreover, misinformation and disinformation with a higher degree of generalisability are more likely to gain traction than specific claims: 29% of Americans believe that the number of covid-19 deaths has been exaggerated, whereas only 13% support the claim that Bill Gates is responsible for the pandemic.20

What can be done?

The public inquiry must identify lessons that can be learnt before the next pandemic. One such lesson is likely to be the need to develop strategic approaches to tackling disinformation and conspiracy theories. Long before the existence of social media platforms, researchers investigated how to mitigate the effect of exposure to false information.2122 Traditional measures include exposure to corrective advertising through the mass media, labelling the accuracy of information on consumer products,23 and correcting misinformation and disinformation about public services.24

The advent of social media and online platforms has provided a fertile medium for disinformation to flourish. Recent studies have looked at the effectiveness of several types of intervention, including redirection, content labelling, content distribution and sharing, disinformation disclosure, disinformation literacy, advertisement policy, content or account moderation, and security and verification. One literature review examined studies on the effectiveness of different countermeasures against disinformation campaigns.25 Looking at outcomes such as beliefs, intended behaviour, knowledge, and observed behaviour, the studies indicate that fact checking can reduce the influence of exposure to false information on people’s beliefs as well as their propensity to share misinformation and disinformation.25 Among fact checking interventions, most of the included studies evaluated disinformation disclosure, in which the platform informs users that they have come into contact with, shared, or interacted with disinformation; many others studied content labelling using a fact checking, funding, or outdated tag, and some examined interventions that educate users to identify disinformation.25 Although most of these countermeasures proved effective, they are not the main interventions used by social media platforms in the real world, such as content moderation (removal or suspension of accounts or content).25

Using randomised experiments based on a hypothetical scenario in which information is later refuted, two studies in cognitive psychology identified a “continued influence effect” of misinformation.2627 Even after a retraction or warning that certain information was incorrect, the retracted information continued to stick in memory and shape how some people interpreted events.2627 Schmid and Betsch conducted six experiments to assess how to mitigate the influence of science deniers on an audience.28 Participants were randomly assigned to different rebuttal conditions after being exposed to a public discussion with a denier of vaccination or climate change science.28 An internal meta-analysis across all six experiments showed that not responding to science deniers worsens attitudes towards behaviours supported by science (such as vaccination) and reduces intentions to perform these behaviours.28 They also found that providing facts or uncovering rhetorical techniques, such as conspiracy theories, false experts, and impossible expectations, tends to be the most effective and universal tool for science advocates.29 But the risk of backfire effects—where correction of a falsehood can reinforce belief in it among those whose beliefs or political ideologies are threatened by the facts—must be considered.2430
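
For readers unfamiliar with the term, an internal meta-analysis simply pools the effect estimates from the individual experiments into a single weighted average. A generic fixed-effect, inverse-variance version (shown here for illustration; not necessarily the exact specification used by Schmid and Betsch) is:

```latex
\hat{\theta} = \frac{\sum_{i=1}^{k} w_i \,\hat{\theta}_i}{\sum_{i=1}^{k} w_i},
\qquad w_i = \frac{1}{\mathrm{SE}_i^{2}},
\qquad \mathrm{SE}(\hat{\theta}) = \frac{1}{\sqrt{\sum_{i=1}^{k} w_i}}
```

where each experiment i contributes an effect estimate (for example, the difference in attitudes between a rebuttal condition and no response) weighted by the inverse of its squared standard error, and k is the number of experiments (here, six).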

Another approach is psychological inoculation or “prebunking”—exposing people to a weakened dose of a persuasive but false argument to trigger the “immune system.”31 Studies have shown that inducing people to think about accuracy or inoculating against misinformation and disinformation can reduce susceptibility and sharing.30313233 When reading this literature, however, one must differentiate the effects on beliefs, intended behaviour, and knowledge.25 Moreover, the existing literature primarily reports on experimental designs in laboratory or survey settings, with relatively little research on real world behaviours.2634 Empirical studies on the nature of and countermeasures against groups promoting misinformation and disinformation that have gained political and social influence are still lacking.

Finally, governments are experimenting with legal interventions in the real world. A bill in California would allow regulators to punish doctors who spread false information about covid-19 vaccines and treatments by revoking their licence to practise.35 A separate bill seeks to require online platforms such as Facebook to disclose their content moderation algorithms publicly, so that it can be determined how disinformation is amplified.36 Given the platforms’ lack of transparency in allowing academic researchers to examine their potential harms, further regulatory action may be appropriate.

What should the inquiry focus on?

The public inquiry should do three things. Firstly, it should examine the extent to which groups promoting contrarian messages were able to influence policy. We think it unlikely that they were able to do so directly but, given their links to the media and influential politicians, they should be investigated. Secondly, it should inquire into how effective the government was in countering misinformation and disinformation and whether it drew on cognitive science to devise interventions. Data from the Association of School and College Leaders, for example, indicate that eight in 10 schools were targeted by anti-vaccine protesters.37 Anti-vaccine protests also targeted parents and students at school gates. The inquiry should examine whether steps were taken to mitigate the impact of these protests, such as disclosing the rhetorical techniques these groups employed to induce fear among parents. Thirdly, it should ask to what extent weaknesses in the messaging of the government and public health organisations (the UK Health Security Agency and the Joint Committee on Vaccination and Immunisation), for example around masks and childhood vaccines, left space for online misinformation and disinformation to take hold.

Discussion

Historically, science denialism has caused people to refuse preventive measures such as immunisation or life saving HIV/AIDS medication, distorting attitudes and resulting in years of severe illness and death.2838 Recent false or misleading covid-19 narratives promoted by some groups to discredit legitimate public health measures, in particular non-pharmacological interventions, may likewise have contributed to preventable illness and death, and those responsible must be held legally accountable. Children who could have been protected (as they were in many other European countries) have been unnecessarily exposed to a virus that can have long term effects on multiple organs in the body. Long covid has risen substantially in children and young people39 after consecutive waves of infection. The scientific community and government institutions are not immune to dangerous ideologies and influence operations.

We hope that the information we have included here—on the nature and activities of groups that have opposed measures to reduce transmission of covid-19 and what can be done to tackle them—will be of use to the public inquiry. Fact checking and labelling sources of information clearly have a role. Public health authorities should perhaps also do more to expose the methods used by groups promulgating misinformation and to devise more effective ways to counter their messaging. The Online Safety Bill, recently introduced to the House of Commons, should also explicitly list those who have benefited financially from the spread of covid-19 related misinformation and disinformation.40 Politicians and parliamentary committees seeking scientific advice must also be transparent about how advisers and experts are chosen, especially when partisan narratives are prominent.

Questions for the public inquiry

  • To what extent were groups promoting contrarian messages against scientific evidence able to influence policy?

  • How effective was the government in countering misinformation and disinformation campaigns (and did they draw on cognitive psychology and media studies)?

  • To what extent did weaknesses in public messaging leave space for online misinformation and disinformation to take hold?

Footnotes

  • Contributors and sources: The authors have all been involved in researching how denialism and misinformation groups influenced the debate on covid-19 mitigation policies. DS, MM, YW, JB, KB, and DG conceptualised and drafted this paper. AM, MA, and YW gathered academic literature to support the evidence. MM and YW revised the article in response to comments from reviewers. DS is the guarantor of the article.

  • Competing interests: We have read and understood BMJ policy on declaration of interests and have no conflict of interest to declare.

  • This article is part of a series commissioned, peer reviewed, and edited by The BMJ (https://www.bmj.com/covid-inquiry). The advisory group for the series was chaired by Kara Hanson, and included Martin McKee, although he was not involved in the decision making on the papers that he co-authored. Kamran Abbasi was the lead editor for The BMJ.

