Do Facts Matter?
Sure. Especially when they support our beliefs and lead to conclusions we like.
In 2017, Kellyanne Conway introduced America to "alternative facts" during a Meet the Press interview in defense of Sean Spicer's false statement about attendance at President Trump's inauguration. People who had seen the crowds on live TV or social media were perplexed by the White House’s obviously different account of what took place.
It was a moment that popularized the slogan “Facts Matter” for the left, which I would often see printed on buttons, t-shirts and tote bags at scientific conferences and events.
But the thing is, science isn’t a collection of “facts.” It’s not synonymous with “truth.” Science is a dynamic process that allows us to continue testing hypotheses and learning more about our world.
In the policy realm, facts don’t necessarily make decisions easier because they can be arranged in all sorts of different ways to support completely different versions of a situation (Sarewitz, 2004). Rather than provide clarity, they often inflame controversies and widen the partisan divide.
On Capitol Hill, consensus doesn’t require that information be correct or even understood. Members of Congress and their staff may find that turning to experts for guidance doesn’t serve their political needs or interests (Morgan, 2014). And even when they actively seek out accurate data on a given topic, they’re likely to disagree with colleagues about what constitutes expertise and credible evidence. After all, every DC think tank claims to deliver “the facts” in white papers and briefing materials, yet their conclusions rarely align.
So, with that in mind, let’s touch on a few concepts related to how people (legislators and the rest of us) make decisions.
Directional motivation describes our tendency to seek out evidence that supports our existing beliefs and identities. Even when we’re not consciously aware of it, the way we collect information and judge its value is influenced by the outcome we hope for. This isn’t necessarily intentional; it occurs simply because we are subject to our own biases.
Motivated reasoning informs our judgments by influencing our emotions, assessments and behaviors related to a desired result. We can be motivated to be accurate or to arrive at desired conclusions (Kunda, 1990). But which is more likely?
Behavioral scientists find that people most often arrive at the conclusions they want by constructing justifications that seem reasonable and valid. Ironically, the smartest among us are often the best at rationalizing. For example, when QAnon predictions repeatedly fail to materialize, adherents don’t usually abandon the movement; instead, they accept ever more convoluted conspiracy theories to rationalize an increasingly bizarre set of beliefs, a pattern social psychologists documented decades earlier in When Prophecy Fails (Festinger et al., 1956).
Confirmation bias leads us to judge new information that aligns with our prior beliefs as more important than arguments that counter our worldview. It’s one reason a vaccine-hesitant individual might dwell on a single news story about a rare side effect of the MMR vaccine rather than the enormous number of lives the vaccine has saved. A related disconfirmation bias can also emerge, in which we actively push back against uncomfortable information that conflicts with our beliefs.
For these reasons and more, our decisions can appear to be rational even when they’re not. Regardless of whether we’re aware of it, we pay attention to the facts that support our existing beliefs and often arrive at conclusions we prefer.
In the end, yes, facts do matter. But only to a point. And unfortunately, facts alone aren’t enough to produce informed decisions and evidence-based policies.
Druckman, James N., and Mary C. McGrath. 2019. “The Evidence for Motivated Reasoning in Climate Change Preference Formation.” Nature Climate Change 9 (2): 111–19.
Kunda, Z. 1990. “The Case for Motivated Reasoning.” Psychological Bulletin 108 (3): 480–98.
Morgan, M. Granger. 2014. “Use (and Abuse) of Expert Elicitation in Support of Decision Making for Public Policy.” Proceedings of the National Academy of Sciences of the United States of America 111 (20): 7176–84.
Sarewitz, Daniel. 2004. “How Science Makes Environmental Controversies Worse.” Environmental Science & Policy 7 (5): 385–403.
Festinger, Leon, Henry W. Riecken, and Stanley Schachter. 1956. When Prophecy Fails. Minneapolis: University of Minnesota Press.