Is Reasoning Built for Winning Arguments, Rather Than Finding Truth?

The Intersection
By Chris Mooney
Apr 25, 2011 5:38 PM


How is this for timing? Just as my Mother Jones piece on motivated reasoning came out, the journal Behavioral and Brain Sciences devoted an entire issue to the case for an "argumentative theory" of reason, advanced by Hugo Mercier of the University of Pennsylvania and Dan Sperber of the Jean Nicod Institute in Paris. You can't get the article over there without a subscription, but it's also available at SSRN, and here is the abstract:

Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions. This suggests that the function of reasoning should be rethought. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. Reasoning so conceived is adaptive given the exceptional dependence of humans on communication and their vulnerability to misinformation. A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis. Poor performance in standard reasoning tasks is explained by the lack of argumentative context. When the same problems are placed in a proper argumentative setting, people turn out to be skilled arguers. Skilled arguers, however, are not after the truth but after arguments supporting their views. This explains the notorious confirmation bias. This bias is apparent not only when people are actually arguing, but also when they are reasoning proactively from the perspective of having to defend their opinions. Reasoning so motivated can distort evaluations and attitudes and allow erroneous beliefs to persist. Proactively used reasoning also favors decisions that are easy to justify but not necessarily better. In all these instances traditionally described as failures or flaws, reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.

Behavioral and Brain Sciences contains not only the paper by Mercier and Sperber, but also a flurry of expert responses and a reply from the authors. SSRN does too, and there is a site devoted to this idea as well. Mercier sent me a more user-friendly summary, and is allowing me to repost parts of it:

Current philosophy and psychology are dominated by what can be called a classical, or ‘Cartesian’ view of reasoning. Even though this view goes back at least to some classical Greek philosophers, its most famous exposition is probably in Descartes. Put plainly, it’s the idea that the role of reasoning is to critically examine our beliefs so as to discard wrong-headed ones and thus create more reliable beliefs—knowledge. This knowledge is in turn supposed to help us make better decisions. This view is—we surmise—hard to reconcile with a wealth of evidence amassed by modern psychology. Tversky and Kahneman (and many others) have shown how fallible reasoning can be. Epstein (again, and many others) has shown that sometimes reasoning is unable to correct even the most blatantly incorrect intuitions. Others have shown that sometimes reasoning too much can make us worse off: it can unduly increase self-confidence, allow us to maintain erroneous beliefs, create distorted, polarized beliefs, and enable us to violate our own moral intuitions by finding excuses for ourselves. We claim that the full import of these results has not been properly gauged, since most people still seem to accept, or at least fail to question, the classical, Cartesian assumptions. Our theory—the argumentative theory of reasoning—suggests that instead of having a purely individual function, reasoning has a social and, more specifically, argumentative function. The function of reasoning would be to find and evaluate reasons in dialogic contexts—more plainly, to argue with others. Here’s a very quick summary of the evolutionary rationale behind this theory. Communication is hugely important for humans, and there is good reason to believe that this has been the case throughout our evolution, as different types of collaborative—and therefore communicative—activities already played a big role in our ancestors’ lives (hunting, collecting, raising children, etc.). 
However, for communication to be possible, listeners have to have ways to discriminate reliable, trustworthy information from potentially dangerous information—otherwise speakers could abuse them through lies and deception. One way listeners and speakers can improve the reliability of communication is through arguments. The speaker gives a reason to accept a given conclusion. The listener can then evaluate this reason to decide whether she should accept the conclusion. In both cases, they will have used reasoning—to find and to evaluate a reason, respectively. If reasoning does its job properly, communication has been improved: a true conclusion is more likely to be supported by good arguments, and therefore accepted, thereby making both the speaker—who managed to convince the listener—and the listener—who acquired a potentially valuable piece of information—better off.

That's the positive side of things. But there's a huge negative side:

If reasoning evolved so we can argue with others, then we should be biased in our search for arguments. In a discussion, I have little use for arguments that support your point of view or that rebut mine. Accordingly, reasoning should display a confirmation bias: it should be more likely to find arguments that support our point of view or rebut those that we oppose. Short (but emphatic) answer: it does, and very much so. The confirmation bias is one of the most robust and prevalent biases in reasoning. This is a very puzzling trait of reasoning if reasoning had a classical, Cartesian function of bettering our beliefs—especially as the confirmation bias is responsible for all sorts of mischief.... Interestingly, the confirmation bias need not be a drag on a group’s ability to argue. To the extent that it is mostly the production, and not the evaluation, of arguments that is biased—and that seems to be the case—then a group of people arguing should still be able to settle on the best answer, despite the confirmation bias.... As a matter of fact, the confirmation bias can then even be considered a form of division of cognitive labor: instead of all group members having to laboriously go through the pros and cons of each option, if each member is biased towards one option, she will find the pros of that option, and the cons of the others—which is much easier—and the others will do their own bit.

And worse still:

When people reason alone, there will often be nothing to hold their confirmation bias in check. This might lead to distortions of their beliefs. As mentioned above, this is very much the case. When people reason alone, they are prone to all sorts of biases. For instance, because they only find arguments supporting what they already believe in, they will tend to become even more persuaded that they are right or will develop stronger, more polarized attitudes.

I think this evolutionary perspective may explain one hell of a lot. Picture us around the campfire, arguing in a group about whether we need to move the camp before winter comes on, or stay in this location a little longer. Mercier and Sperber say we're very good at that, and that the group will do better than a lone individual at making such a decision, thanks to the process of group reasoning, where everybody's view gets interrogated by those with differing perspectives. But individuals--or groups that are very like-minded--may go off the rails when using reasoning. The confirmation bias, which makes us so good at seeing evidence to support our views, also leads us to ignore contrary evidence. Motivated reasoning, which lets us quickly pull together the arguments and views that support what we already believe, makes us impervious to changing our minds. And groups where everyone agrees are known to become more extreme in their views after "deliberating"--this is the problem with much of the blogosphere. When it comes to reasoning, then, what's good for the group could be very bad for the individual--or for the echo chamber.


