Combating COVID-19 misinformation: Brief infographic exposure may increase trust in science
A new study led by Indiana University has found that brief exposure to an infographic about the scientific process may strengthen people’s trust in science and, in turn, reduce the influence of COVID-19 misinformation.
The work, published Oct. 14 in the Journal of Medical Internet Research, reports that at least 60 seconds of exposure to an infographic about how scientific recommendations change in response to evidence appears to cause a small increase in a person’s trust in science. There was also some evidence that this increase in trust in turn reduced belief in COVID-19 misinformation, through what is called a mediation effect.

The work has the potential to strengthen efforts against misinformation about the coronavirus, a major concern to health experts across the globe because it contributes to lower vaccination rates and increased hospitalizations.

“Misinformation is not unique to COVID-19 but has become both common and pervasive during the pandemic,” said Jon Agley, associate professor at the IU School of Public Health-Bloomington, who led the study. “Our study laid some groundwork to further explore the idea that providing truthful and easily digested explanations of the scientific method may reduce the influence of misinformation.”

The study involved 1,017 adults recruited from the Prolific platform, comprising a U.S. sample that was nationally representative by age, sex and race. Participants were randomly assigned to view either an “intervention infographic” about the scientific process, which used the example of evolving health recommendations about butter and margarine, or a “control infographic.” Both infographics were created by the same graphic designer.

Researchers measured trust using a 21-item instrument that asked people how much they agreed with statements such as “when scientists change their mind about a scientific idea, it diminishes my trust in their work.” Because lower levels of trust have been associated with a higher likelihood of believing misinformation, the team examined whether explaining essential aspects of science, such as why scientists might change their minds in response to new evidence, might prevent someone from believing misinformation.
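The mediation effect described above means the infographic is thought to influence misinformation belief indirectly, by first shifting trust. As a rough, hypothetical sketch only (not the study’s actual analysis; the column names, the use of ordinary least squares and the product-of-coefficients approach are all assumptions), such an indirect effect could be estimated roughly as follows:

    # Illustrative mediation sketch, not the published analysis.
    # Assumed columns: 'infographic' (1 = intervention, 0 = control),
    # 'trust' (trust-in-science score), 'misinfo' (misinformation belief score).
    import pandas as pd
    import statsmodels.formula.api as smf

    def mediation_sketch(df: pd.DataFrame) -> dict:
        # Path a: does the infographic shift trust?
        path_a = smf.ols("trust ~ infographic", data=df).fit()
        # Paths b and c': does trust predict misinformation belief,
        # adjusting for which infographic was shown?
        path_bc = smf.ols("misinfo ~ trust + infographic", data=df).fit()
        a = path_a.params["infographic"]
        b = path_bc.params["trust"]
        return {
            "a (infographic -> trust)": a,
            "b (trust -> misinfo, adjusted)": b,
            "indirect effect (a * b)": a * b,
            "direct effect (c')": path_bc.params["infographic"],
        }

In this kind of setup, a negative indirect effect (a * b) would be consistent with the study’s finding that increased trust accounts for part of the reduction in misinformation belief.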
Their work built on an earlier study that found evidence that trust in science and scientists might be associated with how believable people find COVID-19 misinformation.

“There is a lot of ongoing research and many efforts to address misinformation, such as fact checking, but they often suffer from issues of scale and from uncertainty in cases where existing knowledge involves some level of ambiguity or nuance,” Agley said. “In contrast, our approach, if replicable, would potentially avoid those concerns by focusing more generally on trust and the scientific enterprise.

“The implication is that messaging to reduce the influence of misinformation may not need to address individual pieces of misinformation but could instead provide more general resistance to its influence by speaking to misperceptions of science and scientists that might otherwise reduce trust.”

There is still much work to be done, Agley said, but the study could lead to better ways of educating people about scientific processes around pandemics and other health topics, which could diminish the impact of misinformation on people’s choices.
This article is based on a press release from Indiana University.