Getting back to group polarization and the problems of deliberation. Following on from the original essay in my New York Times column, I posted again on the topic a few weeks ago, exploring recent experiments by Cass Sunstein of the University of Chicago and colleagues on the phenomenon of group polarization. They found -- and a wealth of other evidence also supports this view -- that when people are brought together in discussion, the group often ends up coming to a consensus that is more extreme than the original views of the people making it up. You can read the paper here; it's due out soon in the California Law Review.
In an earlier paper in the Yale Law Journal, Deliberative Trouble? Why Groups Go to Extremes, Sunstein mentions, for example, a classic study from the 1960s that found what is called "the risky shift." The experiments had graduate students in management respond to questions involving attitudes toward risk. After answering, the students then engaged in open deliberation on the questions. The experiment found that in their deliberations the students almost always moved toward more aggressive attitudes toward risk-taking than they had expressed privately.
By the way, if you have some time, read the Sunstein paper, which discusses a number of similar examples. It's both illuminating and a great deal of fun to read.
But there is obviously a lot more to what happens in deliberation than a tendency toward polarization. The Sunstein et al. experiments show that groups of people who are relatively similar in their views to begin with -- say, a group of Democrats, or a group of Republicans -- come, after deliberation, to a greater consensus (as well as becoming more extreme and, one might say, more convinced of the rightness of their views). But what if people aren't similar? What if Democrats and Republicans try to discuss the issues together?
In a comment to that post, John Savage wrote that...
I’m unsure why you cite the Sunstein study as a test of the hypothesis that “deliberation days” would help to break down differences, when the experiment actually involved intentionally segregating liberals and conservatives, with the unremarkable result that both groups became more extreme.
At the end, you come to an optimistic conclusion from a centrist point of view: that somehow bringing liberals and conservatives together in a common forum would quickly break down their differences. As a political blogger, I just find this intuitively wrong.... two groups of “extremists” on opposite sides would probably not moderate each other’s views by talking to each other, no matter how many “links” you tried to provide between them.
Indeed, I didn't mean to imply that the Sunstein et al. study supports the idea that deliberation days would break down differences. What it offers is more of a cautionary lesson -- that if you're going to have deliberation days, you'd better try to get lots of diversity in your groups, because if you don't, you may well make the polarization even bigger. Logically speaking, the study doesn't say *anything* about what you're likely to see in groups where you *do* have lots of diversity and polarization. It just doesn't address that situation at all.
Since then I've been wondering about this. Cass, in an email, offered some views on what tends to happen in groups that do have great diversity, say both staunch Republicans and staunch Democrats. As we all know from wandering in the political regions of the blogosphere, you tend to find persistent if not heightened polarization. Cass suggests that this tends to be the outcome whenever people clearly recognize and identify themselves with some group, such as a political party. The "identity differences" make it difficult for either group really to listen to and take seriously the points of the other side.
I've come across some other fascinating work that also touches on this point. For example, in a nice paper entitled Modeling Cultural Cognition, Dan M. Kahan, Donald Braman, and James Grimmelmann (of Yale, George Washington and New York Law Schools respectively) argue that people who strongly hold different opinions -- on an issue such as gun control, for example -- often do so largely for powerful cultural reasons that make them relatively impervious to persuasion. As they put it, "culture is prior to facts in individual cognition."
The idea is that, owing to a number of fairly basic psychological mechanisms, people tend to hold the same beliefs as most people in their own cultural group (Republicans, economists, academic physicists, etc.). And these views are not easily swayed in the light of new evidence. Rather, "the beliefs so formed operate as an evidentiary filter, inducing individuals to dismiss any contrary evidence as unreliable, particularly when that evidence is proffered by individuals of an opposing cultural affiliation."
Now I think we can all relate to that observation. I can present all the new climate studies I want, published in Nature, Science, wherever, to my determined climate-skeptic friends, and even if they listen politely (rolling their eyes occasionally), I know they just don't accept it. It doesn't sink in as new, legitimate information. And from my side, no new "study" on climate change written up by anyone from the American Enterprise Institute is likely to change my mind (see, I couldn't even help but put quotes around the word "study"!).
A more important question, of course, is how it might be possible to design strategies to get past these entrenched cultural differences, which make it virtually impossible for people to come to agreement even in the context of full information. You need to devise strategies for deliberation that let people take on new information without having their cultural certainty threatened (at least not too rapidly). Seems like a tall order.
But I did hear recently of some researchers in Belgium, coming out of ecology, I think, who have devised just this sort of strategy and are using it, apparently effectively, in practice: bringing together business people with environmental activists to work on land-use issues, for example. I'll post on this soon if I can find out more about it. What I remember of the idea is that they bring the polarized parties together and first have them discuss all those things on which they AGREE, which usually are quite a lot. They establish common ground, and at least a little bit of trust, and then gradually take small steps away from that.