Monday, April 26, 2010

Measuring Epistemic Closure

It’s no accident that recent talk about “epistemic closure” takes the form of an accusation. It’s a way of saying that other people are closed-minded. “Epistemic closure” and “closed-mindedness” aren’t terms that lend themselves to self-reference. That it occurs to somebody to ask whether she’s suffering from that condition herself is prima facie evidence that she isn’t. Being self-consciously closed-minded is a logical impossibility. Wishful thinking about our own open-mindedness, however, is an all-too-human reality.

That makes it a particularly adroit rhetorical move on the part of young conservatives to affect a high-minded concern with epistemic closure in the conservative movement. Ostentatiously worrying about your ideological comrades’ closed-mindedness makes a self-aggrandizing spectacle of your own open-mindedness. It’s all for show, of course, unless you’re applying the same metric for measuring epistemic closure to yourself that you’re applying to other people. I haven’t yet seen any evidence that either the conservatives or liberals deploring conservative epistemic closure are doing anything of the sort.

Identifying a single standard of open-mindedness that can be applied across opposing ideological communities isn’t easy. Consider how Ezra Klein stumbles when he makes a typically conscientious stab at it:
“[W]e'd all agree that it's certainly theoretically possible for partisans of one party to embed themselves inside an echo chamber and become systematically more hostile to outside evidence than partisans of the other party. And given that this country has only two serious political parties, that would clearly be a troubling state of affairs. So the relevance of this discussion and the potential need to have it are not, I imagine, in doubt. The question is how do you measure epistemic closure?

"The easy answer is you test for its product: Misinformation. What you'd want to do, I guess, is continuously poll a standard set of questions based on empirical facts. "Has GDP grown since President X's inauguration?" "Have global temperatures been rising or falling in recent decades?" "Does the United States have longer life expectancy than other developed nations?" "Do a majority of Americans approve of the president's job performance?" That sort of thing. Have representatives of both parties decide the questions and then see whether respondents from one party or the other get more questions right.”
This sounds like a fair test of comparative epistemic closure--until you start thinking about what questions would accurately measure the extent to which people are comparatively (mis)informed. Even when we’re on our best deliberative behavior, our perception of facts interacts with our ideological commitments. Take Klein’s question: “[d]oes the United States have a longer life expectancy than other developed nations?” That’s a fact close to his heart because it’s liberals’ favorite measure of the output of a health system relative to the imputed dollars of medical spending. By that measure, under our (pre-ObamaCare) health system, we spend a lot more money than other developed countries to get marginally worse health outcomes.

If you’re a well-informed conservative, however, you’re not nearly as likely to keep track of life-expectancy data because you've already decided that it's a poor measure of healthcare outcomes. You know that we Americans have lower life expectancies than people from other comparably developed political economies because, inter alia, we eat more fast food, murder each other more prolifically, and kill ourselves in car accidents more frequently than people in other developed countries. But survival rates from various forms of cancer are likely to stick in your mind because they suggest a respect in which our healthcare system generates substantially better outcomes than the competition.

It’s reasonable to say that liberals’ retention of the former fact and conservatives’ retention of the latter are both examples of confirmation bias, i.e., our tendency to be more attentive to facts that corroborate our pet theories than to those that falsify them. That, however, is a species of irrationality characteristic of well-informed people who’ve taken the trouble to have theories to confirm.

When conservatives talk about the liberal echo chamber, they’re thinking about people who get most of their news from sources like NPR; liberals’ favorite example of the conservative echo chamber is the influence exerted by Rush Limbaugh. Yet, according to the Pew Research Center, both of those audiences are about equally informed about politics and significantly better-informed than the general population. I suppose you could test for liberals’ and conservatives’ retention of political facts that aren’t pertinent to the assessment of either side’s ideological theories. But why would anybody bother to remember them?

Klein’s mistake, I submit, is that he’s thinking of closed-mindedness as an attribute of individuals rather than of ideological communities. I’ve mentioned one measure of collective epistemic closure here, viz., "ideological victories." I’ll try to describe some others in due course.