Trust in science: Who lacks it and why?

[Image: a row of crossed-out chemistry beakers] Why do some people hold anti-science beliefs? Photo editing by Steve Kelly; image credit: Yulia Reznikov/Getty Images.
  • Researchers investigated the reasons behind why some people overlook scientific evidence when forming opinions.
  • They highlighted four underlying principles, alongside ways to overcome them.
  • They concluded that “scientists should be poised to empathize” with the people they try to reach to best communicate their ideas.

A poll from September 2021 suggested that 61% of Americans recognized COVID-19 as a major public health threat, meaning that nearly 4 in 10 did not.

Another recent poll of Americans found a much larger rise in climate concern among Democratic-leaning respondents (27%) than among Republican-leaning ones (6%).

Understanding why people may overlook scientific evidence when forming opinions could help scientists and science communicators better engage the public.

Recently, researchers highlighted four key reasons why people may overlook scientific evidence when forming opinions, alongside strategies to improve communication.

“The authors echo many of the important recommendations that science communication researchers and practitioners have promoted for a long time now,” Dr. Dietram A. Scheufele, distinguished professor at the University of Wisconsin-Madison, who was not involved in the study, told Medical News Today.

“Maybe most prominently: Communicate your messages in ways that respond to rather than ridicule things that are important to the people you try to reach,” he explained.

The study appears in PNAS.

For the study, the researchers connected contemporary findings on anti-science attitudes with principles from research on attitudes, persuasion, social influence, social identity, and acceptance versus rejection of information.

In doing so, they identified four principles that underlie the rejection of scientific evidence when forming opinions:

  • source of the scientific message — when sources of scientific information, such as scientists, are perceived as inexpert or untrustworthy
  • recipient of the scientific message — when scientific information activates a person’s social identity as a member of a group that holds anti-science attitudes, or that has been underrepresented in science or exploited by scientific work
  • the scientific message itself — when scientific information contradicts preexisting beliefs, what people consider favorable, or their preexisting sense of morality
  • mismatch between the delivery of the message and the recipient’s epistemic style — when information is delivered in ways that recipients do not conceptually understand, or that do not address their need for closure.

Dr. Bastiaan Rutjens, assistant professor of Social Psychology at the University of Amsterdam, not involved in the study, told MNT that “[i]t is important to appreciate that anti-science beliefs do not represent some monolithic entity but are rather diverse and […] reflect potentially very different attitude objects.”

“In some cases, scientific literacy is a more important antecedent, and so the principle pertaining to thinking style might be more important, whereas in other cases political ideology plays a key role, and yet in other cases religious or spiritual beliefs clash with scientific theories,” he noted.

To counteract the above principles, the researchers suggested several solutions. For “source of the scientific message,” they recommended:

  • improving the perceived validity of scientists’ work
  • conveying warmth and prosocial goals in science communication and using accessible language
  • conveying that the source is not antagonistic by portraying both sides of the argument.

To address “recipient of the scientific message,” they recommended activating a shared or superordinate identity when communicating science and engaging and collaborating with marginalized communities.

For “the scientific message itself,” the researchers recommended:

  • training in scientific reasoning
  • prebunking, i.e., preemptively addressing misinformation before people encounter it
  • strong arguments
  • self-affirmation
  • moral reframing
  • increasing the perceived naturalness and moral purity of scientific innovations.

Dr. Scott Morgan, associate professor of psychology at Drew University, not involved in the study, told MNT:

“The public may not always understand that science is a process of refining knowledge, and although errors happen, a scientist will update their beliefs in light of the best evidence. The public may come to believe that scientists ‘don’t know what they’re talking about’ when in fact, they are grappling with new, complex information and updating beliefs in light of new findings.”

For “mismatch between delivery and recipients’ epistemic style,” they suggested conveying information in a style that matches their way of knowing, such as “framing messages as approaching gains for promotion-focused recipients, but as avoiding losses for prevention-focused recipients.”

The researchers concluded that “scientists should be poised to empathize” with the people they try to reach to best communicate their ideas.

Dr. Scheufele added that while the study has very good intentions, it presumes that large groups of citizens are “anti-science.” He noted that, in his experience, “Americans trust science more than almost any other institution, other than the military.

“People can accurately report on what scientists consider ‘settled findings,’ but they draw very different conclusions about how that aligns with their political or religious values,” Dr. Scheufele added. “This is where the disconnects come from between the somewhat naïve sage-on-the-stage models of science communication […] and the realities of societal debates surrounding science.”

He pointed out that, while scientific studies can provide statistical evidence for different outcomes — be they public health-related or environmental — they cannot tell people whether they should act accordingly. This, he thinks, is instead a political question that is “informed, but not determined, by science.”

Dr. Scheufele also noted that citizens and policymakers might have different priorities than scientists and thus prefer different methods and outcomes. “That’s not people being anti-science, those are the realities of democratic science policy-making,” he told us.

Last year, Dr. Scheufele co-authored an article warning against scientists setting out to fix “public pathologies” and build as much buy-in to new science as possible.

In his view, “[a]rtificial intelligence, brain organoids, and other disruptive breakthrough science challenge what it means to be human. In those contexts, blind societal trust in science would be as democratically undesirable as no trust at all.”

“A public that critically engages with and continuously evaluates science is crucially important as we need to make difficult political, moral, and regulatory choices for many of these new areas of science. Simply dismissing anything that doesn’t align with the preferences of the scientific establishment as ‘anti-science’ is not only simplistic, it is inherently undemocratic,” he opined.

Yet he agreed with the authors of the current study who noted that “people with more scientific literacy are simply more sophisticated at bolstering their existing beliefs by cherry-picking ideas and information to defend their worldview.”

“Ironically, this diagnosis also describes what many scientists do when they bemoan anti-science sentiments among the public: Their complaints might be more of a reflection of their own worldviews than of what public audiences are really concerned about,” he concluded.

***
This article has been archived for your research. The original version was published by Medical News Today.