[This is Part 4 of a 4-part series. Part 1. Part 2. Part 3.]
In my first three pieces I’ve spent an awful lot of time talking about epistemic humility. Now I’m going to talk about what I consider to be the antithesis of epistemic humility: extremism.
My definition of the term is non-standard, but I believe it both fits as the antithesis of epistemic humility and matches our intuition that there’s something to extremism that is more than merely being far removed from the mainstream. After all, if you live in a society where child sacrifice is the norm and you consider it an abomination you are an extremist in a technical sense, but that lacks the pejorative connotations that we typically bundle in with the term.
People believe things for a lot of reasons, but the very last reason is that they have gone through some kind of rational evaluation of the evidence and logic and concluded that the belief in question is likely to be true. It is my experience that, almost without exception, our explanations of our positions (political, religious, aesthetic, etc.) are not explanations at all, but post hoc rationalizations.
I initially came to this conclusion as an undergraduate philosophy major (and promptly switched my major to math), but the really exciting work is coming out of behavioral economics where, as Bryan Caplan argues in The Myth of the Rational Voter, people are understood to have preferences over beliefs. Simply put, this means that we would like for some things to be true rather than others. For example, we would generally like to believe nice things about ourselves. I think, however, that the most important, pervasive, and powerful source of preferences over beliefs is our desire to maintain social bonds. We use beliefs as a way to signal our in-group status to our peers.
Along with this desire to believe certain things, however, there is also a desire to be right. In “Paying for Confidence: An Experimental Study of Preferences over Beliefs”, researchers demonstrated that people are willing to pay for non-instrumental information that will increase their probability of being right without actually influencing their decision. They explain the setup this way:
Would you prefer to make a decision that has a 90% chance of being correct rather than a 60% chance? Certainly most people would say yes. If this is so, would you be willing to pay for information that might raise your prior belief from 0.60 to 0.90? While this information is non-instrumental to your decision – you would be making the same decision under both posteriors – it might make you feel better about your decision.
Experiments verified that people will indeed pay for this information, and that “a person would pay for such information because he values having a high posterior at the time of making a decision, or put differently, ‘he wants to feel confident in his decision’.” Taken in conjunction, the combination of preferences over beliefs and a willingness to pay for non-instrumental information tell a simple story: we seek to feel certain rather than to be correct.
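To make the experimental setup concrete, here is a minimal sketch (my own construction, not the paper's; the decision rule and threshold are illustrative) of what "non-instrumental" means: any posterior above one half yields the same decision, so raising it from 0.60 to 0.90 cannot change what you do.

```python
def decide(posterior):
    """Choose the option you believe is more likely to be correct.
    (Illustrative rule: act whenever the posterior clears one half.)"""
    return "act" if posterior > 0.5 else "abstain"

# The decision is identical under both posteriors, so the extra
# information has no instrumental value; people pay for it anyway,
# because the added confidence feels better.
same_decision = decide(0.60) == decide(0.90)  # True
```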
I believe that is the essence of epistemic pride, the antithesis of epistemic humility, and the true root of extremism. It is also, in a very real sense, a counterfeit salvation. After all, don’t we believe that the kind of rest we find in the next world is not a lack of work, but a freedom from the hard slog of fighting temptation? Well, this false certainty offers a reasonable facsimile of freedom from temptation but leaves us plenty of opportunity to be anxiously engaged in “good” works.
To give in to extremism, by which I mean to adopt epistemic closure around our irrationally chosen beliefs, is to “call evil good, and good evil” and to “put darkness for light, and light for darkness” (as Isaiah would say) or to “exchange a walk on part in the war for a lead role in a cage” (as Roger Waters would say).
So how might this play out in the real world? First, you adopt a position, probably not just one belief but rather an entire bundle of related political and aesthetic viewpoints. Congrats: you have found your imitation Zion with its counterfeit unity. The next step, having found your solace, is to protect it by creating defenses around your certainty in your beliefs. How might that be accomplished? Well, you want to find evidence that supports your belief and avoid evidence that might question it. With social networking, partisan news outlets, and some judicious application of RSS subscriptions it is easy enough to get a good start on that, but the primary risk comes from those outside your chosen sphere. Especially the smart, reasonable ones. If you engage with them in dialogue (genuine two-way communication), your beliefs will be horrifically exposed to new information and critique, and so the most effective way to insulate your beliefs is to refuse to listen. This is usually accomplished by taking it as axiomatic that they are all stupid or evil, although more subtle renditions generally include the idea that anyone who appears to be smart and reasonable is an illegitimate anomaly who can be filed away as an irrelevant exception.
So far so good, but a belief that millions of your fellow citizens are some combination of horribly ignorant and malignant requires maintenance. Luckily, you find that your opponents can actually serve your needs here. The trick is to find the most easily rebutted among them, take their views, and then deride them with your friends. It’s very important that this be a group bonding activity since, after all, that was the starting point for the whole exercise. By cherry-picking bad arguments you not only maintain your assumption that the other guys are all dumb jerks, but you also inoculate yourself against their alternative viewpoints in precisely the same way a vaccination works in real life. And of course it is easy to justify this by claiming that the most vile examples of your opponents are those which cry out most desperately for some kind of redress. After all: someone on the internet is wrong.
I trust the pattern I’ve outlined is familiar. If it isn’t: check your Facebook news feed.
I’ve already highlighted the fact that this entire pattern of behavior is a cheap knockoff, patterned quite closely on the genuine quest for Zion and salvation. Here is something else interesting to note: the extremists on either side of an issue (take your pick) give the appearance of fighting (based on their words) but are in fact cooperating (based on their actions). “Cooperate” might be too dignified a word. Perhaps terms like “enabling” and “co-dependent” are more appropriate, but the key fact is that the extremists are feeding off of and supporting each other in a perpetual feedback loop of outrage.
So this is where I will tie back into the idea of the two churches from my second post and cite one of the insights from my third post: the priority once again of deed over word. I will do this by first bringing in Hugh Nibley again, however, once more from “The Prophetic Book of Mormon”.
At the center of ancient American studies today lies that overriding question, “Why did the major civilizations collapse so suddenly, so completely, and so mysteriously?” The answer… is that society as a whole suffered a process of polarization into two separate and opposing ways of life… The polarizing syndrome is a habit of thought and action that operates at all levels, from family feuds like Lehi’s to the battle of galaxies. It is the pervasive polarization described in the Book of Mormon and sources from other cultures which I wish now to discuss briefly, ever bearing in mind that the Book of Mormon account is addressed to future generations, not to “harrow up their souls,” but to tell them how to get out of the type of dire impasse which it describes.
What Nibley observed about the polarity described in the final days of the conflict between the Nephites and the Lamanites was that it was a false polarity. With their words they drew passionate distinctions, but in their actions they closely mimicked each other. At every point in their downward spiral they mirrored each other and strengthened each other’s resolve, each taking turns outraging and offending the other. It is true that a single aggressor is enough to create the necessity of defensive action, but that is not what is depicted: an endless cycle of revenge required ever more massive commitment from either side, leading to a culmination at Cumorah, a culmination that was arranged by the two parties. Mormon sends a letter to the Lamanite king, the king agrees, and so they march off to the appointed place at the appointed time to finish together the terrible, awful dance they had begun so long ago. The war had been lost long, long before Mormon arrayed his surviving people for their final battle.
So what does the real polarity look like? I suggest the example of the two churches. The extremists are struggling with rather than against each other. They are unified in their dedication to sowing contention, which comes from the devil. Thus: the Church of the Devil.
The other group does not struggle against the extremists, for to combat them is to become them. Nibley calls out this “broken symmetry” explicitly when he cites Moroni 7:11-13:
A man being a servant of the devil cannot follow Christ; and if he follow Christ he cannot be a servant of the devil. Wherefore, all things which are good cometh of God; and that which is evil cometh of the devil; for the devil is an enemy to God, and fighteth against him continually, and inviteth and enticeth to sin, and to do that which is evil continually. But behold, that which is of God inviteth to do good continually; wherefore every good thing which inviteth and enticeth to do good, and to love God, and to serve him, is inspired of God.
The devil fights against God, but God, although in opposition, does not join with the devil even in combat. Just as in Nephi’s vision, the Church of the Devil fights against the Church of the Lamb of God, but nowhere does the Church of the Lamb of God retaliate or seek revenge. I’m going to wrap this piece up now because there are too many questions to address without making this post overly long. Instead, I’ll outline some of the questions that I’ll cover in the next piece (which may or may not be the last of my guest stint; I’m not sure if I’ve got one or two left in me).
1. What are real-world examples of this extremism?
2. If you think you’re not an extremist, doesn’t that mean that you probably are, via epistemic pride?
3. How does one oppose extremism without opposing extremists?
4. Given that the non-extremists can be drawn from across the spectrum of belief while the extremists at least have some unity of belief, how can non-extremists work towards unity?
Thanks again for all the great comments, and I’m looking forward to finishing off this series of posts.
In a way, Fox News vs Huffington Post is a good example. They both devote a good portion of their stories to exclaiming how stupefyingly wrong the other is.
I loved reading your first three articles but, to be frank, I easily got lost with the level of intelligence required to wrap my head around it all. THIS article has a beautiful clarity and applicability that really does a great job blending in the concepts laid out previously. I wish I had more insight or some mind-twisting question, but honestly I’m just looking forward to your next piece :)
Fantastic job. I’m grateful for the gentle chastisement I received through reading this post. I look forward to the next one.
Great point about how the Devil fights Christ, but Christ does not engage with the Devil. A couple of scriptures come to mind:
“He that takes the sword shall perish by the sword….”
“He that digs a pit for his neighbor shall fall into his own pit…”
“My kingdom is not of this world, if my kingdom were of this world, then would my servants fight…”
The great work of the extremist is the crusade: to carry a banner, march under a flag, to carry their cause onto glory and triumph. Onward Christian soldiers!
But this was decidedly not the way of Christ. Thanks for the great post.
Great stuff — this post as well as your others. Just a couple of thoughts.
First, on your question 2, I vote “no”. I do think that, probably, if you’re an extremist you’ll think you’re not. But I suspect that if you’re not an extremist you’ll also think you’re not, simply because you recognize your own lack of confidence on the relevant issues. However, if you think you’re immune to extremism, you probably are an extremist, because (thanks to how we’re wired) to stay humble we have to watch out for the seductive ways of the false Zion.
Second, I think you’ve made a strong case for epistemic humility as a virtue. I agree, but I wonder what you think about a counterargument. It goes like this: epistemic humility leads to paralysis. Like the Pyrrhonian skeptic, the epistemically humble person lacks the confidence to make any major difference in the world. If the reformers and protesters of the past had lacked conviction, if they had been epistemically open to their causes being misguided, only landed white men would be allowed to vote. Maybe that’s hyperbolic, but you get the point: the epistemically humble end up as fence-sitters, unable to ally themselves with a cause and so unable to help bring about morally important large-scale results. So epistemic humility isn’t a virtue after all; it is a vice allied with moral cowardice.
Maybe you’re planning to address this issue in a later post; if so, apologies and I’ll happily wait. If not, I’d love to hear what you think about it.
Well, if this post hasn’t dragged both my real-life ego and my Facebook ego out of their dark corners with a spotlight, I don’t know what would…
This is fantastic. I’ve read this like three times so I can internalize it and reverberate it back to others. Obviously I’ll claim it as my own ;)
Nathaniel, it seems to me that your post sinanizes (what’s the moral equivalent of criminalize?) apologetics, at least as they have been so often practiced by FAIR and similar. Euthyphronics also points out the cause of so much underlying intellectual malaise I see in spiritually serious people. If you’re always cognizant that you might be wrong, short of pure revelation, you are required to act only in proportion to your confidence lest you do unjustifiable harm.
I think this counter-argument is over-stated. First of all, fence-sitting is not a genuinely viable alternative. I haven’t managed to work this one in yet, but I like both the quote from Sartre (“Man is condemned to be free”) and the line from the drummer and lyricist of the progressive rock band Rush: “If you choose not to decide you still have made a choice.” Fence-sitting is not an abstention from choice. It is a choice.
So I strenuously disagree that epistemic humility–properly understood–leads to paralysis.
It’s also important to keep in mind that epistemic humility doesn’t imply a lack of conviction. There’s a qualitative difference between believing with a high degree of certainty (but also a genuine possibility of doubt) and believing with absolute certainty.
If you look at the behaviors I discuss, the problem isn’t even so much about degree of certainty as about the source of conviction. Someone who is epistemically humble attempts to derive their conviction from the evidence. Someone who is an extremist employs strategies that create a perception of certainty without actually generating new information upon which to base that certainty.
In short: I don’t think there’s anything about epistemic humility that implies either paralysis or lack of conviction. It’s merely a different kind of conviction, and a more authentic variety because it’s constantly self-critiquing.
I’ll have some more thoughts as I respond to Brian.
I think your response has the same crucial defects as Euthyphronics’s, but I wanted to address it directly. Epistemic humility does not preclude authentic conviction. It attacks manufactured conviction. Manufactured conviction is what you get when you engage in the cognitive biases I have outlined: selection bias, confirmation bias, demonizing your opponents, etc.
Unless you think that apologetics is chiefly comprised of these tricks, there’s no reason to believe that it will be significantly impacted by an attempt to move towards epistemic humility.
I think that in practice a person who practices epistemic humility will be far more selective about their areas of true conviction, but I also think that’s a very good thing. As a general rule, we suffer from a super-abundance of conviction that is unjustified rather than from a lack of authentic conviction.
For example, rather than picking up an entire suite of beliefs affiliated with a political party (not all of which can plausibly be understood well enough to generate authentic conviction), someone who is epistemically humble will attempt to carefully differentiate between issues they have truly studied and those they have not.
Here is one final thought, which probably could be made into its own post: so far we’ve been talking only about individuals in isolation, but the social view can also shed some important light on this issue.
If every person merely remains silent until they can be sure, no one talks, no one learns, and nothing happens. If everyone espouses the truth as they see it, but in an atmosphere of tolerance and genuine communication, then the individual flaws of various arguments and points of view can be subjected to community review.
In other words, even if you think that a lot of your views might be wrong, it makes all the sense in the world to get out there and defend them to the best of your abilities and let the chips fall where they may. Rather than a bunch of mute, paralyzed, unsure, and timid folks, I envision that a society of the epistemically humble would have far more productive, useful, and passionate discourse than we see today.
The key thing I’m trying to convey is that it’s not about what you believe or even necessarily how strongly you believe it. It’s about your reasons for belief.
Cognitive bias or reason? That’s the choice we have to make.
Nathaniel, thanks for the response. However, all conviction is manufactured, constructed in a Piagetian sense just as any other sensation. In the OP, you argue that people provide post-hoc rationalizations for beliefs. This can be true, but it does not mean there is no internal reasoning behind the belief (unless one wants to argue that beliefs are randomly generated), only that people are constrained in their response by social norms, lack of vocabulary, lack of incentive to reveal openly, etc.
T. F. Green created a very useful model of beliefs wherein beliefs are arranged like mathematical propositions in a pseudo-logical order, each supported by other beliefs except for a few axiomatic beliefs for which no justification could be given. I bring this up because your description of epistemic humility reminds me of his argument that an ideal belief system would have, among other things, a minimal number of axiomatic beliefs.
In any case, I argue that a person with a genuine capacity for doubt will be constrained to act less intensely than a true believer because of the potential for harm that wrong actions bring, especially when those actions carry long-term impact (good or ill) and commitment. It’s essentially an expected value problem taking into account the perceived probabilities of being right.
Example: Let’s say I want to join Teach for America because I believe they do good as an organization. However, I recognize that systematized education is a complex entity where our collective intuition has often proven wrong before, so it’s possible that TFA does more harm than good (via lack of pedagogical training, the privileging of white ivy-league culture, or many other possible mechanisms). The fact that TFA requires only a 2-year commitment means I’m much more comfortable joining than if they required a 10-year commitment, calculating that 2 years is a small enough wager given my understanding of the whole.
i.e., you don’t gamble the whole pot on three of a kind, but you would with a royal flush.
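The poker analogy can be made concrete. One standard formalization (my addition, not the commenter’s; the hand probabilities are made up for illustration) is the Kelly criterion for an even-money bet, under which the fraction of your bankroll worth wagering grows with your confidence, and only certainty justifies betting the whole pot:

```python
def kelly_fraction(p_right):
    """Fraction of the bankroll worth staking on an even-money bet,
    per the Kelly criterion: f = 2p - 1, floored at zero (don't bet
    at all when you're more likely wrong than right)."""
    return max(0.0, 2 * p_right - 1)

# Three of a kind: strong but beatable, so stake only part of the pot.
three_of_a_kind = kelly_fraction(0.70)  # ~0.40 of the bankroll
# A royal flush: certainty justifies wagering everything.
royal_flush = kelly_fraction(1.00)      # 1.0
# Genuine doubt (worse than a coin flip): abstain entirely.
long_shot = kelly_fraction(0.45)        # 0.0
```

The mapping to the comment’s point: a doubter’s stake (length of commitment) scales with perceived probability of being right, while the true believer, at subjective certainty, wagers everything.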
I think I should have started this exchange by asking questions first to better understand your position before reiterating my own. It’s a bad habit, and I’ll try to rectify that now with two key questions.
1. Is it your understanding that people are fundamentally rational in their beliefs (e.g. they believe propositions to be true or false by objectively evaluating the relevant information), and that on top of this fundamentally rational mechanism of belief there is then a distorting layer of “social norms, lack of vocabulary, lack of incentive to reveal openly” which may obscure but cannot replace the underlying mechanism?
2. In regards to “a person with genuine capacity for doubt will be constrained to act less intensely than a true believer because of the potential for harm wrong actions bring”, how does this change when we consider that the consequence of inaction is, itself, harmful? And how do we weigh the pros and cons of conviction against the pros and cons of flexibility of belief and capacity to learn?
1) Yes, but with the very large caveat that each individual’s capacity for rationality is variable. Their mental development, experience, education, genetics, diet, and many, many more variables (most outside our control or recognition) can affect their ability to reason. A clearer, if less informative, statement is that I believe all people act in the way that makes the most sense to them. Some of us are poorer at making sense than others, but that’s what drives our decisions.
2) Inaction is, in essence, the default state (maximal entropy, equilibrium, Newton’s laws, whatever model you want to use), so it needs less defense. We are frequently motivated to act, and one can choose to act in the form of inaction, but that’s a different response than inaction due to conflicting motivations. As to your last question, are you proposing conviction as independent from flexibility of belief? Decision-making is a multi-dimensional process, but I don’t think I’m understanding what you are attempting to communicate by that question. Can you elaborate, please?
Re inaction and paralysis: I’m with Brian @ 11 here. My original worry wasn’t about inaction in general, but in certain high-risk situations where there doesn’t seem to be decisive evidence that the epistemically humble can appeal to. A relatively timely case: If it turns out that homosexual activity is sinful, then it would be a Very Bad Thing for society to implicitly condone it by recognizing homosexual marriages. If it is not, then it would be a Very Bad Thing to cause homosexual individuals pain and sorrow by denying such marriages. It seems that whether or not the activity is sinful isn’t straightforwardly settle-able by appeal to some neutral body of evidence. (What even counts as evidence for this claim?) Doing nothing doesn’t actively contribute to either (possibly) harmful outcome; stumping one way or the other on the issue brings a significant risk of contributing to a Very Bad Thing.
Also: I’m starting to lose the plot around 10. You end with
Cognitive bias or reason? That’s the choice we have to make.
But this strikes me as an odd way to frame it. It’s not as though anyone chooses cognitive bias; the most epistemically proud are certain they’re simply rationally responding to the evidence. The only way (as far as I can tell) to opt out of cognitive bias is to vigilantly subject all your beliefs to skeptical probing. But this method runs into conflict with e.g. the claim:
In other words, even if you think that a lot of your views might be wrong, it makes all the sense in the world to get out there and defend them to the best of your abilities and let the chips fall where they may.
The process of “defending your views to the best of your ability” is precisely the sort of thing that leads to cognitive bias: to defend my views I have to look for reasons why they’re right, which aids confirmation bias. (Even if I try to defend my views by looking for objections to defuse, I’m looking for objections to see why they’re wrong, which still tends towards bias.)
I agree with you on a lot of points: epistemic humility need not lead to silence, would greatly improve our public discourse, and is a virtue. (I’m worried about my objection at 5, but not convinced by it.) But you seem more optimistic than I do that the epistemically humble can be very confident in a lot of things. And I’m not sure where that confidence is coming from.