Alexandra Tselios

Carl Sagan’s nine steps to combat fake news in your living room


Despite information being freely available, many of us still choose to believe nonsense over objective fact. Carl Sagan had a solid methodology for calling out BS. Maybe we should use it.


One would think that, in an age where information is only a Siri question or a click away, we would be dedicated to ensuring that what we believe is based in fact. Surely we aren’t like those in the embarrassing halls of history who believed California was an island separate from North America? We aren’t silly enough, are we, to still believe that jamming a pin into one’s eye socket would release one from depression, that Paul McCartney really died in 1966, or that women’s brains are incapable of retaining education the way a male brain can (thanks for that one, Edward Clarke of 1873 Harvard Medical School)?

No, we are far more advanced now, capable of accessing information easily and efficiently, and of making critical choices about the beliefs we then discuss at the dinner table or share online. Why is it, then, that despite knowledge being so accessible, we still have family members who come out as flat-earth theorists, anti-vaxxers or Alex Jones supporters? If you value critical thinking, it is easy to mock conspiracy theorists and assume you are intellectually far superior; but that move is too simplistic, especially given the research into why people believe in conspiracy theories. When the conspiracy theorist is someone you trust and respect, a family member perhaps, or someone you had previously assumed to be of sound mind, it is harder to brand him or her as simple, uneducated or ignorant, and it may be unfair to assume intellectual inferiority, particularly when people are led astray by beloved celebrities and athletes who advocate ridiculous theories. Very often, acceptance of conspiracy theories is driven by a need to cultivate a world in which some sense of safety and security exists; other psychological factors have also been suggested:

“There may be a set of cognitive tendencies that combine with or augment the association between broader or more motivation- and emotion-based personality traits on conspiracy beliefs. In other words, conspiracy mentality may in part reflect particular information-processing dispositions. For example, people who are prone to detecting agency – intention – behind events and actions should be more likely to entertain the possibility of a conspiracy, and research supports this hypothesis.” (Douglas, Sutton, Callan, Dawtry and Harvey, 2016; van der Tempel and Alcock, 2015).


Strange beliefs held in pre-Internet days include:

Drinking gladiators’ blood or consuming their livers could cure medical ailments, such as epilepsy.

California was an island, and not part of North America.

Sexy thoughts were a symptom of hysteria amongst women caused by a wandering womb.

The world is flat.


Yet strange beliefs persist in the digital age, such as:

Melania Trump has a doppelgänger that appears with Donald Trump on official business.

Chemicals in the water are turning frogs gay.

Harvey Weinstein was publicly outed as a predator because he was creating an anti-vaxx documentary and big pharma wanted to shut him down.

The world is flat.

When it comes to dinner conversation with someone whose theories seem off, whatever the subject, it is worth revisiting a chapter from astrophysicist Carl Sagan’s 1995 book, The Demon-Haunted World: Science as a Candle in the Dark. In it, Sagan examined beliefs that do not stack up as evidence-based, and suggested nine steps to assist in determining how rubbish or otherwise someone’s beliefs are (one’s own included).

Sagan’s nine steps can assist you in determining the validity of an argument, spotting fake news, and in countering a ridiculous argument without completely annihilating the person and ruining Christmas dinner.

The steps are simple:

  1. Wherever possible, there must be independent confirmation of the “facts”.
  2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  3. Arguments from authority carry little weight. “Authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it might be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses”, has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course, there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
  7. If there’s a chain of argument, every link in the chain must work – not just most of them to suit a narrative.
  8. Consider Occam’s Razor. This is the principle that we must keep explanation as simple as possible and avoid superfluous assumptions or reasoning. When faced with two hypotheses that explain data equally well, choose the simpler explanation.
  9. Always ask whether the hypothesis can be falsified, at least in principle, and if so, how.
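One loose way to see how the kit hangs together is as a checklist that a claim must pass in full, echoing step 7’s insistence that every link in the chain hold. The sketch below is purely illustrative: the Claim fields, thresholds and criteria names are my own inventions, not anything Sagan specified.

```python
# Toy "baloney detection" checklist loosely inspired by Sagan's nine steps.
# All field names and thresholds below are illustrative inventions.
from dataclasses import dataclass, field

@dataclass
class Claim:
    statement: str
    independent_confirmations: int = 0              # step 1
    alternative_hypotheses: int = 0                 # step 4
    falsifiable: bool = False                       # step 9
    chain_links_ok: list = field(default_factory=list)  # step 7

def detect_baloney(claim: Claim) -> list:
    """Return the criteria the claim fails; an empty list means it passes."""
    failures = []
    if claim.independent_confirmations < 2:
        failures.append("needs independent confirmation (step 1)")
    if claim.alternative_hypotheses < 1:
        failures.append("no competing hypotheses considered (step 4)")
    if not claim.falsifiable:
        failures.append("not falsifiable even in principle (step 9)")
    # Step 7: every link in the chain of argument must hold, not just most.
    if claim.chain_links_ok and not all(claim.chain_links_ok):
        failures.append("broken link in the chain of argument (step 7)")
    return failures

flat_earth = Claim("The world is flat.",
                   independent_confirmations=0,
                   alternative_hypotheses=0,
                   falsifiable=True,
                   chain_links_ok=[True, False, True])
print(detect_baloney(flat_earth))
```

Note that the checklist reports every failed criterion rather than stopping at the first, mirroring the spirit of step 7: a single broken link is enough to sink the argument, however many other links hold.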


In the last interview Carl Sagan ever gave, with Charlie Rose, he said: “Science is more than a body of knowledge. It is a way of thinking; a way of sceptically interrogating the universe with a fine understanding of human fallibility… If we are not able to ask sceptical questions, to interrogate those who tell us that something is true, to be sceptical of those in authority, then, we are up for grabs for the next charlatan (political or religious) who comes rambling along.”

Sagan’s steps are not easy to weave casually into a conversation, nor is it easy to ask sceptical questions of those you respect and love. That becomes harder still with particularly emotive subjects such as children and vaccinations, personal health issues or political bias. It is also misguided to speak to the person as though they are intellectually inferior, as even some of the most brilliant minds have got theories wrong. Einstein, for example, abandoned the cosmological constant he had introduced to keep the universe static once observations showed the universe to be expanding.

But we owe it to ourselves to try to remain as rational as possible and, if nothing else, to shine some light of logic for those too afraid to do so for themselves. At least they then have a choice.


Alexandra Tselios

Founder and CEO of The Big Smoke, Alexandra oversees the leading digital content platform in both Australia and the USA. As a social and technology commentator, she is interviewed most days of the week on radio and appears on ABC's The Drum and ABC News24. Alexandra is also a Director of NFP think tank, Plus61J, which explores the political and social ties between Australia and Israel; and sits on the board of Estate-Planning FinTech start-up NowSorted.
