Ideas spread like disease: Let’s treat them with the same caution

By Nick Enfield

When you “like” a story online, you’re not just telling your social media followers that you like it; you’re also exposing them to that story. And they, in turn, can expose others, and so on. We are interconnected in ways we can hardly imagine, and our little online actions can have big consequences. That can be a good thing, if the stories we share contain valuable information or ideas. But what if the information is false? Falsehoods are dangerous, and when they spread they can cause real harm. Yet we seem blindly willing to share stories whose truth we are not sure of.

Why do we transmit false information? One reason is that we too easily believe that it’s actually true. We all suffer from confirmation bias, the readiness to accept evidence that confirms our views, and to reject evidence that contradicts them. Another reason we transmit falsehoods is that we often don’t care if a story is true or not. If we treat information as more entertainment than news, then we share what pleases us, without considering what might happen if others believe it is true.

Both habits reveal a fundamental lack of care in how we handle information. This is particularly troubling in the social media age. We have suddenly acquired unprecedented power to transmit information in an instant to millions of others. But we have yet to learn how to handle that power mindfully and ethically.

Compare this to the history of infectious disease. Changes in living conditions through population explosions and urbanisation introduced drastic new risks of infection, from cholera to the bubonic plague. With no knowledge of how diseases actually spread, people actively contributed to the crisis with dangerous behaviour such as poor sanitation. But in time, scientific breakthroughs in our understanding of how diseases spread not only revolutionised our day-to-day practices – from washing our hands to vaccinating our children – but also set new standards of accountability in our individual behaviour. As it became commonly known that germs carry disease, we learned that sneezing on to people could be harmful to their health. And as it became known that vaccination works, we learned that it protects our kids and ourselves as well as the broader population.

As this example shows, when a radical change in our social conditions produces a serious threat, we are ultimately able to handle it by acting both rationally and ethically. People learn to exercise individual responsibility, in the common interest.

We need to apply the same ethical reasoning to our handling of information in this age of truth decay.

While the internet era has supercharged the problem, the fundamental issue is not new. In his 1877 article “The Ethics of Belief”, philosopher William K. Clifford argued that “it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence”. Among his reasons for saying this was that such belief compromises our shared culture of respect for evidence and reason: “The danger to society is not merely that it should believe wrong things, though that is great enough; but that it should become credulous, and lose the habit of testing things and inquiring into them”.

We will one day look back and be amazed at the reckless way in which people treated information in these early days of social media, passing stories on without knowing or caring whether they were true. What we urgently need now are advances in information literacy. This must start with a true appreciation of our susceptibility to falsehood and its dangers, and it must lead to an individual sense of duty to pause, think, and check before passing on information.

Bioethicist Eric Kodish wrote that “parents who choose not to immunize their children are ethically negligent”, pushing the group below the threshold needed for herd immunity to infection. In the same way, when we share false information we are ethically negligent, polluting the infosphere and pushing the group below the threshold for collective immunity to bullshit.

Nick Enfield is professor of linguistics at the University of Sydney, and head of the Sydney Initiative for Truth (SIFT). He will appear in the panel Insight: Truth in an Uncertain Age on 30 April.

This article was originally published in The Guardian.

Image Credit: Gerd Altmann from Pixabay.