Are Americans Misperceiving or Just Making Guesses?

In the APSA Public Scholarship Program, graduate students in political science produce summaries of new research in the American Political Science Review. This piece, written by Leann Mclaren, covers the new article by Matthew H. Graham, Temple University, “Measuring Misperceptions?”.

With politicians like Kari Lake refusing to accept Arizona’s midterm election results even though the Associated Press has called the race for her opponent (with 99% reporting), misperception, the willingness to hold onto claims that have been proven false, has become a subject of increased interest.

In his latest work in the APSR, Matthew Graham finds that capturing the extent of misperceptions in the electorate, as well as what those people actually believe, may be trickier than we think. In the article, Graham investigates methods for distinguishing between people who are simply ignorant of political facts (or just randomly guessing) and people who willfully hold onto beliefs that are factually incorrect. He concludes that most people in his studies were simply ignorant and/or lacked confidence in their answers to questions about popular controversies. This points to a need to reconsider both how many people actually hold onto factually incorrect claims and how researchers can capture and analyze these beliefs in the American public.

Graham’s motivation for this study is a disconnect between canonical research on survey responses and prevailing methods for measuring misperceptions. Researchers often use online questionnaires or surveys that ask people their opinions on factual matters, ranging from climate change to the number of COVID-19 deaths reported on the news. If the researcher knows of false claims about the topic, they assume that respondents who answer incorrectly have heard and accepted those claims. Graham takes a different approach, arguing that data can distinguish people who lack knowledge from those who have been misinformed. In his article he designs and executes four studies that ask respondents their opinions and how confident they are in these beliefs, then observes the persistence of those beliefs over time.

His first study concerned foreign aid. He asked a pool of respondents in an online survey, in August and again in September 2018, whether they thought the government spends more on foreign aid than on three other policy areas (Medicare, National Defense, and Social Security). In both waves he followed up by asking respondents how sure or certain they were that their answer was correct. His second study consisted of two separate surveys, fielded in June 2019 and June 2020 and in March and August 2020, respectively, covering six political controversies (the Clinton email scandal, the Robert Mueller probe, Obama’s comments on DACA, Obama’s birth certificate, Trump’s “grab them” comment, and Trump’s Article II comments). His third study applied a similar framework to COVID-19 and other science-based conspiracy theories. All three of these studies asked respondents how sure they were of their answer at two different points in time. His fourth study tested a training exercise as an additional way to spot respondents who consistently report incorrect answers.
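The intuition behind this repeated-measurement design, that random guessers change their wrong answers between waves while genuinely misinformed respondents repeat them, can be illustrated with a toy simulation. The sketch below is illustrative only, not Graham’s actual design or estimator; the population shares, the four-option question format, and all function names are assumptions made for the example.

```python
import random

random.seed(42)

N_OPTIONS = 4      # e.g., a four-option question like the foreign-aid item
N_PEOPLE = 10_000
CORRECT = 0        # index of the factually correct option

def wrong_answer_stability(p_misinformed):
    """Simulate two survey waves and return the share of wave-1 wrong
    answers that are repeated in wave 2. 'Misinformed' respondents hold
    a stable wrong answer; everyone else guesses uniformly at random
    in each wave, independently."""
    consistent_wrong = 0
    wrong_wave1 = 0
    for _ in range(N_PEOPLE):
        if random.random() < p_misinformed:
            # Stable false belief: the same wrong option in both waves.
            ans1 = ans2 = 1
        else:
            # Pure guessing: a fresh random pick in each wave.
            ans1 = random.randrange(N_OPTIONS)
            ans2 = random.randrange(N_OPTIONS)
        if ans1 != CORRECT:
            wrong_wave1 += 1
            if ans2 == ans1:
                consistent_wrong += 1
    return consistent_wrong / wrong_wave1

# With pure guessing, a wrong answer repeats only about 1-in-4 times.
print(f"all guessing:    {wrong_answer_stability(0.0):.2f}")
# Genuine misperception shows up as much higher cross-wave stability.
print(f"30% misinformed: {wrong_answer_stability(0.3):.2f}")
```

In this toy world, low cross-wave stability of incorrect answers signals guessing rather than belief, which is the pattern Graham looks for in his data.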

“Graham’s results suggest that the research community is substantially overstating the proportion of Americans who hold false beliefs.”

When Graham ran his studies and gathered his results, he found that people who answer incorrectly are generally very inconsistent in their answers. For issues such as climate change and continental drift, respondents’ incorrect answers were the most inconsistent, not much different from random guessing. False beliefs on other issues are more stable over time (e.g., that vaccines cause autism), but no more stable than false beliefs about uncontroversial topics that have not been the subject of false claims in the public sphere (e.g., that lasers work by focusing sound waves). People who provide the correct answer give more consistent responses and report more confidence in their answers. This suggests that whereas people who say they are certain of the correct answer really know it, people who say they are certain of the wrong answer hold much weaker beliefs.

Graham’s results suggest that the research community is substantially overstating the proportion of Americans who hold false beliefs. Partisan belief differences are more likely to result from different degrees of ignorance than from acceptance of false claims.


  • Leann Mclaren is a Ph.D. candidate at Duke University, where she studies American Politics with a focus on Race, Ethnicity, and Politics. She is a National Science Foundation Graduate Research Fellowship (NSF-GRFP) recipient and an APSA Minority Fellowship Program recipient. Leann’s dissertation explores how Black immigrant candidates navigate identity in political campaigns. Her other projects map Black political behavior more broadly, particularly in the realms of social movements and political participation. Leann holds a B.A. from the University of Connecticut and was an APSA Ralph Bunche Summer Institute Scholar.
  • Article details: Graham, Matthew H. 2022. “Measuring Misperceptions?” American Political Science Review, 1–23.
  • About the APSA Public Scholarship Program.