Most of us like to believe that our views are correct and opposite views are wrong. Of course, sometimes our views are indeed valid, but especially in political contexts, nonpartisan fact-checks and discoveries of disinformation campaigns reveal more inaccuracies in our views than we’d like to admit.
Believing that we’re right is not just soothing to our egos (or protective of egos that are very fragile). It’s also about consistency and what seems like common sense. Why would you believe something if it were untrue, right? How easy the world would be to navigate if everything you believed were how it really is. This safe feeling is part of what’s called naïve realism (Gilovich and Ross, 2015).
It’s inconsistent, and so cognitively uncomfortable, to believe in, march for, vote for, or donate money to a cause that is undermined or contradicted by factual information. It may also be uncomfortable if what you believe or support carries negative consequences for innocent people, especially when they include you and your own family. The discomfort as you begin to sense your mistaken beliefs or their negative consequences falls under cognitive dissonance.
How Do We Deal with Dissonance?
Social psychologists have documented numerous ways to reduce cognitive dissonance (Stalder, 2020). Logically, you can change your view moving forward or take action to address your mistake (Stone and Fernandez, 2008). For example, many who were against masks or vaccines changed their minds after they or a loved one caught COVID-19, including former governor Chris Christie (Miller, 2020).
Voting against gun-control legislation doesn’t jibe well with the increase in mass shootings, leading at least one Republican representative in the House to take a “new stance” at the cost of his career (Richards, 2022)—not that there’s only one way to reduce mass shootings. Former New York Governor Andrew Cuomo acknowledged that he “made mistakes” when he eventually resigned last year (Villeneuve, 2021).
Unfortunately, it’s more common to reduce dissonance by taking less rational approaches. If news agencies, scientists, or court rulings provide new information that invalidates or questions your view, you can attack those messengers as “liars” or “partisans” or call them other names; you can rationalize other reasons to doubt the new information; you can misconstrue or misremember the new information as less damning than it actually is (even calling a mass shooting a false-flag operation); you can avoid exposure to the new information by not opening that newspaper or tuning to that channel; or you can retreat to your echo chamber in other news sources or social media sites that support your view. These steps make it easier to believe you’re still right, although they can also be about saving face despite knowing you’re wrong.
There are many biases behind this natural tendency of thinking that you or your group is right and acting to quiet any doubts. Besides naïve realism, there’s self-serving bias, group-serving bias, group-centrism, groupthink, bias blind spot, belief perseverance, political tribalism, and especially confirmation bias.
A New Level of Confirmation Bias?
Confirmation bias can be conscious or less-than-conscious. You avoid people or information that contradicts your view while you seek out or embrace anything that supports your view (Gilovich and Ross, 2015). It’s as easy as a button press on your remote control. It’s understandable, though irrational.
But when those who control the cable-show flow of information promote helpful information to their loyal viewers and exclude any contradictory information, however newsworthy, it goes beyond your decision. The decision is being made for you, for reasons that may not be your own.
The recent decision by Fox News not to air the January 6 hearings is arguably an obvious case of this higher level of confirmation bias. Although some Fox News viewers may certainly appreciate not being exposed to the hearings at all, they cannot fully realize what they’re missing, and some viewers may not be averse to watching parts of the hearings.
Fox News went even further to include counterprogramming more consistent with its previous narratives about January 6, at least some parts of which are questionable if not false (McCarthy, 2022). Yes, of course, the Fox News viewer can switch channels during a commercial to see what’s happening at the hearings—but one Fox News host even went commercial-free during that first night, a decision possibly intended to keep viewers from switching (Back, 2022).
These strategies can have multiple purposes or effects, possibly including some face-saving for Fox News hosts and leadership and helping to keep certain politicians in office. It’s true that cable shows cannot cover every story, and any show’s regular decisions on what to air may reflect some form of a selection bias. I also acknowledge that not all “new information” is equally valid, so some decisions not to air are clearly justified. But the January 6 hearings seem too big in scope and significance and, even if not 100 percent accurate, too evidence-based to be compared to the relatively less important daily coverage decisions.
Ultimately, of course, you can choose what channel to watch. And you know the basis for your views better than I or other authors. At a minimum, if you don’t already do this, I encourage everyone to slow down their thinking when new information surfaces (Gilovich and Ross, 2015; Stalder, 2018). Try to be open to multiple sources. Watch a half-hour or a few minutes of another source. What’s the real harm besides the time you’d spend? You don’t have to change your views just because you’re tuning in, and you can leave whenever you want if it becomes emotionally difficult.
But if you do slightly change a view upon learning new information, especially if it comes from multiple sources, there’s nothing weak or hypocritical about that—the information is, after all, “new.” You didn’t have it before, and now you do.