Falsely yelling ‘Fire’ in a theater: How misinformation, confirmation bias make informing the public about COVID-19 so difficult

A good friend of mine, Ivory Hecker, down in Houston recently had one of her videos pulled from Facebook. Like most of us when Facebook fails us, she went to Twitter.

Now, because I’m a scientist, I need to tell you about my biases right out of the gate: I like Ivory.

I consider Ivory a friend. She is a great reporter, a good person, and she can sing! We were at Syracuse together, worked on projects in the same computer room multiple times, and even critiqued each other’s reports from time to time.

So, I am biased.

In this case, though, despite our friendship, I disagree with her stance that misinformation should be allowed to be freely shared across social media when that misinformation can have dire consequences. And hopefully some of the evidence I share here will help folks – maybe even Ivory! – understand why the removal of misinformation is important, and how science-based evidence supports that removal.



You can’t falsely yell “Fire” in a movie theater

This isn’t as much science as it is history and constitutional law. Way back in 1919, Justice Oliver Wendell Holmes used this analogy when referring to “free speech” in the case Schenck v. United States.

Carlton F.W. Larson wrote, in his paper “Shouting ‘Fire’ in a theater: The life and times of constitutional law’s most enduring analogy,” that, “The analogy, lifted by Holmes from a federal prosecutor in Cleveland, was rooted in this larger world of popular culture, which would have understood the analogy as shorthand for stupid, harmful speech.”

The paper was published in the William & Mary Bill of Rights Journal.

Larson continues…

“Holmes’s theater analogy is a perfect retort to the frivolous argument that all speech, regardless of context or consequences, is immunized from governmental regulation. But, in the context of Schenck, it was entirely beside the point. A false shout of “fire” in a theater is a false statement of fact; the flyers in Schenck made statements of political opinion. The audience in a theater is captive to a speaker in a way that the readers of the flyers in Schenck were not. The panic in the theater is immediate and not easily countered by more speech; the flyers in Schenck created no similar risk of imminent harm.”

This is where things get fuzzy. In the case of a theater, the only way to ‘fact-check’ the person screaming “Fire!” is to search for the fire, putting the searcher in potential danger. On Facebook, by contrast, leaving potentially false claims about COVID-19 up does not immediately endanger a person, who can simply go find other information to confirm or disprove the claims.

But it turns out, according to psychology, it may not be that simple.



Uninformed vs. Misinformed vs. Informed

Sophia McClennen, a professor of International Affairs and Comparative Literature at Penn State, made a great point during an interview with Neil deGrasse Tyson in 2018.

“What we are seeing today, isn’t so much a distinction between ignorance and knowledge,” McClennen said during her time on StarTalk. “It is misinformed versus uninformed. Versus informed. So if you are misinformed and I try to inform you, it is not likely to go well for me. Because I will ask you to give up a thing you think is true.”

She went on to explain that misinformation today is worse than at any other period in human history, despite the ease of access to information. And that interview was conducted well before COVID-19 was a thing.

But why is misinformation such a problem? For every actual fact, there are many – like, almost infinitely many – non-facts.

As an aside, imagine you go to a magic show and are challenged to a game of “pick a card” by a popular magician. You know the one: you randomly select one card from a 52-card deck. Alright!

Now, once you pick a card and look at it, you find that it is the Jack of Clubs. In our example here, that is a fact. You have evidence of that fact. You are holding the card in your hand.

Now you put it back in the pile. And the magician – without looking – pulls out the Three of Hearts and says to the other people around, “I have found your card!”

The people watching applaud. The magician thanks you. The band plays that “ta-da” tune.

You contest, “No, that isn’t my card.”

Perplexed, the magician says, “You’re right. That was the card of the contestant from last night! Well then…” He pulls out a Queen of Diamonds: “Here is your card!”

The people applaud again. The band plays again. You are ushered off stage. And you contest again. This can happen up to 49 more times before the magician actually finds your card, because there is only one actual fact and there are 51 other not-facts. And the audience does not know what the actual fact is.
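If you want to see how lopsided those odds are, here is a minimal Python sketch – my own illustration, just using a standard 52-card deck – that simulates a magician guessing at random without repeating cards:

```python
import random

SUITS = ["Hearts", "Diamonds", "Clubs", "Spades"]
RANKS = ["Ace", "2", "3", "4", "5", "6", "7", "8", "9", "10",
         "Jack", "Queen", "King"]
DECK = [f"{rank} of {suit}" for suit in SUITS for rank in RANKS]

def guesses_needed(your_card):
    """How many non-repeating random guesses until the one fact turns up?"""
    guessing_order = random.sample(DECK, len(DECK))  # a random guessing order
    return guessing_order.index(your_card) + 1       # position of the right card

# Run the "show" many times and average the result.
trials = 100_000
total = sum(guesses_needed("Jack of Clubs") for _ in range(trials))
print(f"Chance of a correct first guess: {1 / len(DECK):.1%}")    # ~1.9%
print(f"Average guesses to find the fact: {total / trials:.1f}")  # ~26.5
```

One fact, 51 not-facts: a random guesser averages about 26.5 tries, and the audience has no way to tell a wrong reveal from the right one.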

On top of that, it leaves the people in the audience with a choice. Who to believe? A magician is magic! He’s got to be right. You, on the other hand, are just some regular person. Maybe you forgot what card you had? Or maybe you want the magician to be wrong.

This is what we are asking people to do on social media when misinformation is shared. Decide who is more trustworthy without access to the actual facts.

There is a lot of misinformation out there about COVID-19, how COVID-19 spreads, what the illness is like, and who is susceptible to infection.

So it becomes very important that when a person learns something new about this virus, it is – in fact – accurate.

This is where the claim of “censorship” comes in, and why – a lot like falsely claiming there is a fire in a theater – curtailing the spread of the false message is important.

Part of the responsibility is on the platform. The other part is on the user.

As a user of Facebook, Twitter, or other social media, it is important to stop and think before sharing a post. Before you pass that new information on to others, make double-certain that it has been fully vetted by sources responsible for accurately vetting information. Or else you may – unknowingly – be responsible for misinforming others. And those people may be unable to change their minds in the future.

In fact, in a paper titled “Science audiences, misinformation, and fake news,” published in April of 2019, the authors found that corrective information about science is sometimes completely dismissed.

[Image: screenshot from the paper. Courtesy: PNAS]

The authors wrote:

“Unfortunately, simply providing individuals with corrective, factual information is not guaranteed to fix misperceptions, as “strongly entrenched beliefs” are often likely to “survive the addition of non-supportive evidence”. In fact, there is some evidence to suggest that attempts to correct misinformation about both political and scientific topics among individuals with the most strongly held beliefs can instead backfire, entrenching them further in their false views.”

And that is troubling enough when talking about the flat earth, heat lightning, or frosts in April. But it becomes a real conundrum for scientists during a pandemic. Because, based on the findings of the study, for the most entrenched believers there may be no amount of accurate, factual, real data anyone can present that will change that person’s mind. Nothing.

So, getting back to the potential misinformation being spread by the doctor in Ivory’s post: that can cause major problems down the line. If someone “learns” something from that video that is inaccurate, that person may endanger their own life – or the lives of others – based on it. And that is something that, during a pandemic, needs to be curtailed.

There is one bit of good news from the above-mentioned paper, though. Those same authors did find that “there is some evidence that correcting misinformation via an algorithmically driven ‘related stories’ function on social media platforms can reduce misperceptions of science-related information.”

So, for those people who say, “no one has ever changed anyone’s mind on social media!” – they may want to think again (yup, I’m already testing some people with corrective information)!



Confirmation Bias

I’ve written about confirmation bias in the past. But what is confirmation bias? It is the tendency to seek out, interpret, and remember information in ways that confirm what you already believe – or want to be true – regardless of the evidence. Perhaps it is something you learned as a kid, or that you were told once by an authority figure. It is a piece of information that – without checking its validity – is believed. And defended.

Usually I am talking about confirmation bias with regard to weather-related topics – things like heat lightning, flat earth, frosts in April. These are things people learned at one point and then “prove” with circumstantial evidence, even though real evidence is either not there or not supportive.

With weather, it can be a source of head-scratching. But confirmation bias can have some pretty disappointing consequences down the line, especially for matters of life and death. Like, for example, COVID-19.

If people are told, for example, that some medication can easily cure COVID-19, they may really want that to be true. So instead of looking for real evidence, they defend the medication with circumstantial evidence.

But let’s unpack this first: how do we know what is circumstantial evidence and what is real evidence? Especially with medicine?

That is a really, really – REALLY! – good question. Repeatable outcomes across multiple people are a good start for delineating between circumstantial and real. This is why we have to rely on medical professionals for answers to a lot of the questions about COVID-19 symptoms, medications, and treatments.

Circumstantial evidence would be something like, for example, “I know a person who got better by taking medication XX. So everyone should take medication XX to get better.”

While it is great that people want to be helpful, this isn’t actually a helpful sentiment, because what works for one person may not work for everyone – especially with medicine. And putting chemicals in your body can have side effects.

I know many people who take Advil for aches and pains. But I also know some people who take it, and nothing happens. For others, they take it and it gives them a nasty stomach ache. I’m sure you know people like this, too.

It is no different with medicine for COVID-19. Some things may work for some people, but not for others. And until researchers can run tests to see who it works for, who it doesn’t, who ends up with side effects, and what those side effects may be… doctors can’t widely distribute a medicine for people to take.

At least, not ethically.

It isn’t political. It isn’t out of malice. It isn’t because aliens are controlling doctors’ minds all across the globe in a plot to overthrow the American government.

It isn’t ethical. And good doctors are as ethical as possible.

So, what is real evidence?

Medical trials! Specifically, double-blind trials: ones where neither the people giving the medicine nor the people taking it know who is getting the real medicine and who is getting a placebo. This corrects for the placebo effect, where people feel better because they are told they should feel better.
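To make the blinding concrete, here is a minimal Python sketch – my own illustration, not an actual trial protocol, and the participant labels and kit codes are invented for the example – of how randomized, coded assignments keep both the givers and the takers in the dark:

```python
import random

def assign_double_blind(participants, seed=42):
    """Randomly split participants into treatment and placebo groups.
    Everyone on the floor sees only opaque kit codes, so neither the
    doctors handing out the pills nor the patients know who got what."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    # The unblinding key is held ONLY by the trial statistician until the end.
    key = {p: ("treatment" if i < half else "placebo")
           for i, p in enumerate(shuffled)}
    # Unique random kit numbers carry no hint about group membership.
    numbers = rng.sample(range(1000, 10000), len(shuffled))
    kit_codes = {p: f"KIT-{n}" for p, n in zip(shuffled, numbers)}
    return kit_codes, key

participants = [f"P{i:03d}" for i in range(1, 9)]
codes, key = assign_double_blind(participants)
print(codes)  # doctors and patients see only codes like 'KIT-4821'
# `key` stays sealed until all the outcome data have been collected.
```

The sealed key is the whole point: no one who interacts with patients can consciously or unconsciously treat the two groups differently, because no one who interacts with patients knows which group is which.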

Studies with a large sample size are also useful. A sample size in the thousands is great! That will give researchers an idea about how a lot of people may react to an illness or a medication. A lot of people may say “the law of averages” helps us here. But the “law of averages” isn’t really a thing. It is actually the “law of large numbers,” according to UCLA.

There is something called the Law of Averages (or the Law of Large Numbers) which states that if you repeat a random experiment, such as tossing a coin or rolling a die, a very large number of times (as if you were trying to construct a population), your individual outcomes (statistics), when averaged, should be equal to (or very close to) the theoretical average (a parameter).

There is a quote: “The roulette wheel has neither conscience nor memory.” Think about this quote and then consider this situation:

If you have ever visited a casino in Las Vegas and watched people play roulette, you have probably seen that when gamblers notice a streak of “Reds” come up, some will start to bet money on “Black” because they think the law of averages means that “Black” has a better chance of coming up now that they have seen so many “Reds” show up. While it is true that in the LONG RUN the proportion of Blacks and Reds will even out, in the short run, anything is possible.

So it is wrong to believe that the next few spins will “make up” for the imbalance in Blacks and Reds. The roulette wheel has no memory (and no conscience…), so it has no idea that the last, say, 10 spins resulted in “Red.”

The chance is the same that it will land on “Red” on the 11th spin. Eventually in the long run (over thousands upon thousands of spins) it will even out – but remember, we live in the short run.

Basically then, think of each spin (or flip, or attempt, or draw, or whatever unit you are studying) as a “trial.” The larger the number of trials, the more likely it is that the overall fraction of “successes” (i.e., the relative frequency of successes) will be close to the theoretical probability of a success in a single trial. Also, with more trials, you are likely to miss the expected number of outcomes by a larger amount as measured by raw numbers, but by a smaller amount in terms of percentages.
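To see that last point in action, here is a quick Python sketch – my own illustration of the passage above, not something from UCLA’s materials – that flips a fair coin and tracks both kinds of “miss” as the number of trials grows:

```python
import random

random.seed(1)

# Flip a fair coin n times; compare the raw miss and the percentage miss.
for n in [100, 10_000, 1_000_000]:
    heads = sum(random.random() < 0.5 for _ in range(n))
    expected = n / 2
    raw_miss = abs(heads - expected)        # tends to GROW as n grows
    pct_miss = abs(heads / n - 0.5) * 100   # tends to SHRINK toward zero
    print(f"n={n:>9,}  heads={heads:>7,}  "
          f"raw miss={raw_miss:>6,.0f}  pct miss={pct_miss:.3f}%")
```

Run it a few times: the raw miss typically gets bigger with more flips, while the percentage miss drifts toward zero. The short run is noisy; the long run evens out.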

And until things like large studies or double-blind tests can be conducted, it is unethical to make claims about the guaranteed efficacy of a treatment. And dangerous. Perhaps even life-threatening.

A bit like falsely yelling “Fire” in a theater.

Can people check the claims for falsehood in this case? No, because no studies have been completed. In fact, it would be like yelling fire in a theater when no evidence of a fire was available – and no evidence that there was not a fire.

That would put the people in the theater in quite the predicament. Some people may panic and run, others may wait to see what happens, and others may go looking for the answer. It would still create the panic, but with an unknown danger. Furthermore, with confirmation bias, you may have people who claim “fires don’t happen in theaters” or “fires burn quickly through theaters because they are made of very flammable material.”

Instead of allowing the panic, the eventual debate, and ‘free speech’ to occur, it is best to simply quell the person falsely yelling fire.



Allowing science to work

Just a quick note that science can take time. I know we all live in an “instant gratification” world now, but we need to try our best to have patience. I’m just as guilty as anyone else. I have Netflix, watch YouTube, and listen to Pandora Radio. I download papers, get emails daily, and have a cell phone to google things when I can’t remember them.

But science takes time. We have to do the studies, have the evidence-based debates, survey people, experiment, experiment again, and then for good measure experiment one more time. Then have another debate. Then experiment some more.

Because science can’t get this wrong. The stakes are too high. It has to be right. But in order for science to get things right, it has to make sure that it goes through the necessary steps to clear out all of the failures.



tl;dr Nick… what’s the point?

There are cognitive processes in the brain that can alter how people collect and remember information. Those same processes can interfere with people’s abilities to learn new, corrective, information. So when people get misinformed with bad data first, and then need to be informed with real data second, it can become a big problem.

That is why curtailing the spread of misinformation about a virus that is killing people by the thousands – daily – is considered ‘very important’ right now by many. And why allowing the spread of misinformation under ‘free speech’ may not be the most advantageous ideal at the present time, according to science and research.



Author of the article:


Nick Lilja

Nick is a former television meteorologist with stints in Amarillo and Hattiesburg. During his time in Hattiesburg, he was also an adjunct professor at the University of Southern Mississippi. He is a graduate of both Oregon State and Syracuse University and now calls Houston home. Now that he is retired from TV, he maintains this blog in his spare time.
