Royal Society report says there’s no silver bullet to fix ‘scientific misinformation’

There’s no quick and easy way to fix the problem of ‘scientific misinformation’, argues a new report released by the Royal Society.

As communication technologies have changed, so has the scale and speed with which misinformation spreads, says the report. ‘So the responses to it have to similarly be upgraded,’ report author Frank Kelly, a mathematician at the University of Cambridge, UK, tells Chemistry World. ‘Misinformation has always been present,’ Kelly adds. ‘It’s always been difficult to balance open communication with misinformation.’

Although scientific misinformation is available online, it’s not clear what its impact is, the report says, citing a Royal Society survey of the UK public that found that most people believe Covid-19 vaccines are safe, climate change is the result of human activity and 5G technology is not harmful to health. During the pandemic, however, there’s ‘no question’ that there was enough misinformation to cause harm, Kelly notes.

The report suggests that echo chambers – both online and offline – where people encounter information that reinforces beliefs they already hold are less common than previously thought. It also finds little evidence for the filter bubble hypothesis, which holds that algorithms result in people only coming across information that is in line with their thinking.

So far, social media platforms and tech firms haven’t successfully tackled online misinformation because it’s a moving target, Kelly says. To make progress on tackling scientific misinformation, action is needed from all players, including governments, researchers, journalists and tech companies, the report says.

But governments and social media platforms shouldn’t rely solely on content removal to tackle scientific misinformation, the report suggests. ‘We need a sensible and effective response, which means people follow public health advice, and simply banning content risks making this harder to achieve,’ Kelly adds.

Jevin West, an information scientist at the University of Washington in the US, agrees, noting that rampant content removal ‘continues to exacerbate these levels of distrust’. Taking content down sometimes fuels discussions about the conspiracy of censorship and ends up causing more damage, he says. 

The report recommends that the UK government should support fact checking to address ‘information deserts’, where few or no authoritative voices exist. To do this, the government can approach the country’s national academies and learned societies, Kelly adds. ‘The fact checking sector is one that is quite vulnerable.’

There’s no standard way to resolve the issue of misinformation, Kelly says. ‘It will take a lot of hard work.’ Those trying to tackle the problem ‘won’t solve it, they’ll just push the balance into a better place’. It’s especially difficult in science, he notes, because determining the veracity of research claims is a long process involving authors, reviewers, editorial boards and other stakeholders.

‘Ultimately, we will need to see legislation which can address the incentives of business models that shape the algorithms determining the spread of content,’ the report says. ‘Scientists will need to work with lawyers and economists to make sure that the particular sensitivities of scientific misinformation are considered when legislation is framed.’

*** This article has been archived for your research. The original version from Chemistry World can be found here ***