When I turned that Critical Voter curriculum mentioned last time into a book [which PEL readers can get free from Amazon every Tuesday between now and July 19], my goal was to use election politics as a case study for teaching critical thinking skills without playing favorites or picking winners. Even so, it’s hard to make logical arguments from real-world political examples without their coming off as predictions.
For instance, when writing about the phenomenon that is Donald Trump, I made the prediction that The Donald’s lack of interest in creating an ethos bond with voters would ultimately doom his candidacy. Similarly, while covering the primaries, I sneered at political polling (or, more specifically, dinged the public for worshiping polls, given our tendency to treat quantitative information with more respect than it deserves).
The first results from Iowa seemed to provide evidence for these anti-Trump and anti-pollster arguments. But the rest of the primary season, in which Trump won in line with what the polls were saying, provided contradictory evidence.
So, how should one deal with arguments, predictions, or beliefs that might be wrong?
One helpful approach can be distilled from work by The Critical Thinking Foundation, an organization that has supported critical-thinking education for over 30 years. Their framework was designed for those unfamiliar with the philosophical tradition and includes, among other things, a set of “Valuable Intellectual Traits” describing the qualities (both cognitive and non-cognitive) all critical thinkers should possess.
The first is intellectual humility, an understanding that even if what you believe is backed up by well-understood and vetted facts and held together with sound, logical reasoning, you can still be wrong. The world is a complicated place, after all. So even if the facts upon which your argument was based were true yesterday, that doesn’t mean they won’t be proven false today or tomorrow. Similarly, even the most artfully constructed lines of reasoning can (and often do) go wrong—or at least need to be adjusted based on changing circumstances.
In the case of my “Trump can never win” argument, I can always retreat to the claim that the argument only said Trump could never ultimately become President, not that he would lose every race. But even if this safer argument stretches out by several months the time it takes to be proven right or wrong, it is still worth reflecting on the flaws in one’s own thinking rather than continuing to hold onto unexamined beliefs that might be past their sell-by date.
At the same time, intellectual courage is another characteristic on the Critical Thinking Foundation’s list of valuable traits. Their description of this virtue talks about the courage to challenge popular beliefs, which requires overcoming negative emotions such as the fear associated with nonconformity. But there are also negative emotions that come with maintaining your beliefs when they are challenged by anyone or anything (including facts that conflict with your theses), fears that can only be overcome by “sticking to your guns” and continuing to argue your case despite setbacks.
As mentioned when discussing intellectual humility, the world is a complicated place, and the things worth arguing over (like politics) are complex and generate a constant stream of new data. So rather than turning this way or that with each change of circumstance, one needs the strength to maintain one’s beliefs, especially if they were reached through sound research and careful reasoning.
So how is one to balance intellectual humility (which asks you to resist the urge to insist you are right, even when you might be wrong) and intellectual courage (which asks you to stick to your guns, even if your argument receives a setback)?
Fortunately, the philosophical tradition offers many potential answers to this question, some of my favorites contained in “The Fixation of Belief,” an essay by Charles Sanders Peirce, the father of Pragmatism (subjects covered by PEL in podcasts and Not School discussions available on the citizen site).
In that work, Peirce proposes that doubt motivates all of our thinking and that all of us constantly generate beliefs large and small in order to dispel the discomfort of doubt. With this premise in place, the author describes four ways those beliefs can become fixed in our minds.
One is the a priori method, which simply involves believing (or continuing to believe) things that make you comfortable. This fits well with the human tendency toward confirmation bias: accepting only information that supports current beliefs while ignoring facts that conflict with preferred storylines.
Alternatively, one’s beliefs can be established by authority, such as a priesthood (secular or religious) that establishes what is permitted versus forbidden to think within a society. Such authority is often challenged by free spirits, many of whom come to their beliefs through tenacity, which involves settling on a belief system and boldly holding to it at all costs (regardless of whether it is right or wrong).
While all three of these methods for fixing belief (a priori, authority, and tenacity) have something to recommend them, none is a great bet as an exclusive method for getting to the truth. If truth is your purpose, Peirce proposes science as a model: one that treats beliefs as conditional, even as more experiments are performed and more evidence is amassed, asymptotically bringing us closer to ideas likely to be true.
It is often pointed out (sometimes correctly) that the success of science has turned its pronouncements (particularly about highly complex topics) into arguments from authority. But putting that dilemma aside, scientific norms that allow new information to challenge, but not automatically overturn, your own (ideally strong) arguments are not a terrible model to embrace.
–Jonathan Haber is an educational researcher whose Degree of Freedom website describes his attempt to replicate (l)earning a BA in philosophy in one year. He is the author of MOOCs: The Essential Guide from MIT Press and Critical Voter: How to Use the Next Election to Make Yourself and Your Kids Smarter. He is currently helping to build a new graduate school of education.
I would agree with “Peirce proposes that doubt motivates all of our thinking” if it were changed to: doubt motivates all of our critical thinking. I don’t think doubt is necessarily the motivation for design, which can involve lots of thinking.