Neil Eisberg | Editor
We live in an era when misinformation and so-called fake news are rife, not just in printed media like newspapers but, more particularly, on social media platforms such as Facebook and Twitter. And the problem concerns not just social issues but also scientific topics.
A 2016 survey by the Pew Research Center found that nearly 25% of US adults admitted to sharing inaccurate information on social media. Some observers suggest the true figure is likely to be much larger because of what is called social desirability bias: respondents tend to under-report behaviour they know is frowned upon.
Despite the explosion of misinformation, opinion surveys have also shown that public trust in science remains relatively high. The National Science Board’s Science & Engineering Indicators 2020, for example, showed that 44% of Americans said they had ‘a great deal of confidence’ in the scientific community, second only to the US military.
Currently, however, Covid vaccination has spurred an outpouring of conspiracy theories and plainly misleading statements about the risks of the vaccines now becoming widely available.
As scientists, we understand that any medical product or therapy carries some risk, just as driving a car, taking a plane trip or simply climbing a stepladder does. Unfortunately, there is always someone who believes they know better – that the risks are far greater than acknowledged and that the true facts are, for whatever reason, being concealed.
Anyone involved with science knows there is nothing new about this phenomenon, which has exercised many in the chemistry and pharmaceutical sectors for years. The chemical industry has faced consistent accusations that it puts profit above everything else, from human health to the environment – a charge levelled particularly at the pharmaceutical industry.
Companies have repeatedly been accused of secrecy, despite major efforts in recent years to demonstrate transparency and openness. The effectiveness of these actions remains debatable, in spite of major initiatives and the industry’s support for regulatory controls such as the EU’s REACH legislation.
Misinformation tends to be packaged in simplistic and emotional form, often presented as so-called ‘clickbait’, with attention-grabbing headlines designed to draw readers into scandalous or erroneous claims.
Emotion is usually the main driver behind the spread of untruths and unfounded accusations, and many supporters of industry have recommended that scientists use emotion to present the real facts, just as critics and opponents have done so successfully in the past. Yet strong emotions can also impair the audience’s ability to process scientific information rationally.
In addition, misinformation is usually presented in a simplistic way, while scientists, in an effort to be transparent, tend to overcomplicate the presentation of information by explaining everything in scientific terms.
Professors of communication Sara Yeo and Meaghan McKasy, at universities in Utah in the US, have recently pointed out that the emotions generated by misinformation or false news can impair one’s ability to process information rationally (PNAS, doi: 10.1073/pnas.2002484118). They also note, however, that the effect of emotion on the detection and acceptance of misinformation is not straightforward. They believe that research on emotion and, by extension, humour in science communication reveals how both can be used as strategies to address the problem.
They point out that humour is ubiquitous in daily life and human communication, and science is no exception. Science jokes abound, they note, and in this era of misinformation humour has the potential to serve as a defence against fake news. But there needs to be a better understanding of humour’s role in influencing attitudes towards science.
‘Funny science can draw attention to issues that might not be on the public’s agenda and may even help direct attention to valuable and accurate information embedded within a joke,’ the researchers say. ‘Humour also impacts how we process information about science to form attitudes and behavioural intentions.’
Furthermore, they point out that humour shapes people’s evaluation of an information source: it can humanise a source and make it more likeable. Their recent research is said to show that scientists who use humour are perceived as more likeable while retaining their credibility as experts.
Yeo and McKasy say there is no single or simple solution to the problem of scientific misinformation; instead, they believe the best approach is to use multiple strategies together.
‘Understanding how emotion and humour shape the public’s understanding of science is one more resource that can aid communicators’ efforts to combat misinformation. Of course, strategies must be used ethically, and how best practices are translated from research depends on the communication goals. It is essential that we engage in dialogue about the ethical considerations that face science communication in the digital media era,’ they conclude.