I recently listened to a podcast episode in which the guest was discussing his newly released book. The host is a self-proclaimed skeptic. The topic related to the COVID-19 pandemic, and the episode was framed as a postmortem analysis of whether a particular policy helped or harmed. We all lived the collective experience, and most of us would likely agree that not everything went as well as we could have hoped. I was anticipating a good discussion.
What I heard was disappointing and, at times, difficult to listen to. Why? In short, because it became clear that the guest had a particular point of view based on his personal experience during COVID-19, and the host did not question the guest as one might expect a skeptic to do. It seemed the host had read the book and liked it, so their approximately hour-long discussion did not explore the range of issues related to the topic. In fact, as the guest increasingly realized he and the host agreed, his voice rose, he became more animated, and he began talking faster. He was on a soapbox. Rather than a discussion or exploration of the topic, the guest had a stage and a megaphone, preaching to his audience of one and any listeners who happened to be along for the ride. Listening to the episode became stressful for me, not because of the content (although some of what the guest said should have been questioned) but because of the guest’s tone and the anger that became increasingly apparent as the episode progressed.
Separating what we know from how we feel
It’s OK to feel anger about decisions made during COVID-19, or about other policy-related issues for that matter. Anger is a motivating emotion. Indeed, it seems anger motivated the podcast guest to write a book. But it’s important to distinguish between what we know about a topic and how we feel about it because understanding that distinction in our own position can help us ask better questions, more clearly assess the totality of evidence, and inform any personal decisions we need to make related to the information. Further, thinking back to the podcast episode, making this distinction is especially important for anyone presenting themselves as having greater knowledge of or authority on a subject. Was this author able to research and write an accurate assessment of the COVID-19 situation, or was everything filtered through his lingering anger? As listeners, whether members of a podcast audience or the public at large, we should hold figures of authority to account: if they can’t separate their emotions from the facts, can we be sure they’ve considered different points of view on the matter at hand? Are they presenting us with an accurate assessment of the topic or situation?
Further complicating these assessments is a common buzzword in today’s information environment: misinformation. We know that we are regularly exposed to inaccurate information, including misinformation about knowledge gained through science. We also know that misinformation is sometimes shared purposefully, in ways aimed at manipulating how we feel. As such, we need to assess not only the type of information we’re receiving (what we know or how we feel) but also whether the message accurately reflects what is known about the topic and whether it has been designed to manipulate how we feel.
Dissecting misinformation
Recently, the National Academies of Sciences, Engineering, and Medicine published a report, “Understanding and Addressing Misinformation About Science.” The publication resulted from the work of a committee charged with characterizing misinformation about science, proposing solutions to address it, and assessing what we still need to understand to reduce the harm it causes.
In defining misinformation about science, the authors noted some important points:
- Misinformation is an umbrella term used in different ways to define different types of inaccurate, false or misleading information.
- Because a cornerstone of science is the evolution of knowledge over time, what is considered accurate at a given time may change. This does not mean that earlier statements should retroactively be classified as misinformation. Rather, any definition of misinformation about science needs to consider the scientific consensus at the time.
- Related to our earlier discussion, the authors also described boundaries that should be considered when defining misinformation, including:
  - Distinguishing between knowledge and values (what we know versus how we feel).
  - Considering the intent and context in which scientific information is being offered. For example, if information is simplified to educate or improve understanding, labeling it misinformation might be inappropriate; however, if information is cherry-picked to tell only the part of a story that agrees with a particular point of view, labeling it misinformation might be appropriate. The authors acknowledged that these boundaries can at times be difficult to define.
Taking these and other considerations into account, the authors defined misinformation about science as: “information that asserts or implies claims that are inconsistent with the weight of accepted scientific evidence at the time (reflecting both quality and quantity of evidence).” (p. 42)
What we know, how we feel, and misinformation: An example
Science offers us a way of knowing about the world in which we live, but policies need to take more than science into account. Said another way, we don’t all have to agree on what we should do with or about particular scientific knowledge. But increasingly, science is dismissed as “just another opinion” or “one interpretation of the facts” when, in reality, science gives us a common starting point. We can use vaccines as an example.
Science has established that specific germs cause specific diseases (i.e., germ theory), and it has taught us that vaccines based on those germs can train our immune systems to decrease our chance of getting those diseases. These statements are what we know. Whether a person wants to get a vaccine is based on how they feel. How a person feels takes other considerations into account, and how a person feels may change over time. But what doesn’t change is that getting vaccinated decreases the recipient’s chance of getting the disease the vaccine protects against.
Continuing with our example, let’s consider misinformation. Maybe someone suggests that eating healthy and taking care of yourself can prevent specific infectious diseases. This would appear to be a “what we know” statement. And we probably all agree that, as a society, we know eating healthy foods and taking care of ourselves is good for us. But … the statement was that these efforts could prevent specific infectious diseases, and that is not accurate. The only way to prevent infectious diseases is to have germ-specific immunity, and we get that kind of immunity in one of two ways: vaccination or infection. So, suggesting healthy living as an alternative to vaccination is not an accurate portrayal of what we know. Further, it may be an attempt to manipulate how we feel. A statement like this has a kernel of truth, so people may assume the rest is correct too, essentially grandfathering in the misleading information because the first part aligns with something they already know. Likewise, the statement may appeal to someone who already wants to avoid getting vaccinated, or it may seem like evidence that vaccines are unnecessary to someone who doesn’t have much of an opinion about them. Regardless of how it lands, the statement misinforms because choosing healthy living instead of vaccination means remaining at greater risk for disease than one needs to be. (As an aside, this misleading healthy living appeal is often accompanied by another, equally misleading appeal to the benefits of natural infection, but I digress.)
Try it!
So now that we have considered the differences between knowledge (what we know), feelings (how we feel) and misinformation, see if you can spot them in the statements below. Note: Some statements include more than one example.
Q1. They said lay babies on their stomachs, and now they say lay babies on their backs. That just shows you they are making it up as they go.
Answer: Knowledge and feelings. The first sentence is knowledge based on science. As the science evolved, the recommendations changed. The second sentence is based on feelings. Importantly, if you hear someone making these two statements together, you should question whether their emotions are interfering with their ability to logically evaluate the facts they are sharing.
Q2. Taking vitamin A can dramatically reduce deaths from measles.
Answer: Contextual misinformation. This is an example of the boundaries of context when it comes to misinformation. Studies in low-income countries have shown that when malnourished children are given vitamin A as part of the treatment for measles, their chance of dying decreases. But in the U.S., the data are not as robust, likely because most children are not malnourished. U.S. recommendations include giving vitamin A only to children hospitalized with measles, and the dosing is quite specific, particularly because high quantities of vitamin A can be toxic.
Q3. Hepatitis B vaccine shouldn’t be given to newborns because people get that disease from sex or drugs.
Answer: Feelings. While it is true that most hepatitis B infections result from exposure to infected blood or body fluids during sex or from sharing needles, people can be infected through other exposures to blood as well, including newborns exposed during birth. This statement is based on the individual’s feelings because it’s rooted in their disagreement with a policy based on science.
Q4. Rates of hospitalization for respiratory syncytial virus (RSV) in 0- to 2-month-old infants decreased by about 50% in the 2024-2025 season compared with rates prior to availability of maternal vaccination and a long-lasting monoclonal antibody for infants to protect against RSV.
Answer: Knowledge. This statement is based on a study conducted by the Centers for Disease Control and Prevention (CDC) to evaluate the effects of recommendations for maternal vaccination or infant receipt of nirsevimab, a long-lasting monoclonal antibody, to protect against RSV.
Q5. I’m not getting the COVID-19 vaccine because it doesn’t work. My mom got vaccinated, and she still got COVID.
Answer: Feelings and misinformation. The first sentence includes the individual’s personal feelings about getting vaccinated and an inaccurate statement offered as the reason. The second sentence is presented as proof of that reason, and it is likely a true statement. While people vaccinated against COVID-19 can still be infected, that doesn’t mean the vaccine doesn’t work, because whether the vaccine works is not an “all or nothing” scenario. This is an example of a logical fallacy called false dichotomy, which suggests that there are only two options: in this case, that the vaccine either works or it doesn’t. But that is not the case. Studies have demonstrated that vaccinated people are less likely to get severely ill, less likely to experience lingering symptoms (“long COVID”), and less likely to be hospitalized or die if they are infected with the virus that causes COVID-19.
Hopefully, you enjoyed trying to sort out the basis for these examples, and the next time you are chatting with a friend, scrolling through your social media feed — or maybe, listening to a podcast — you’ll find yourself checking for statements of knowledge, feelings or misinformation.
Resources
- Understanding and Addressing Misinformation About Science
- “Evaluating Information” website section
- Logical Fallacies: What You Should Know