This is a good article. I found it because I'm writing about the interactionist theory of reason too. Reading "The Enigma of Reason" completely changed how I think about thinking!
In my experience, most people's mistakes of reasoning are unintentional, as you say. And it is also true that philosophers and others who dedicate time to reasoning are good at spotting fallacies and other mistakes. But it's worth noting that philosophers are certainly not immune: Kant dedicated a large part of the Critique of Pure Reason (the Transcendental Dialectic) to identifying and diagnosing some incredibly persistent errors in reasoning he found in the history of metaphysics. So I think we can extend your conclusion to professional thinkers as well: we might assume that philosophers who use bad reasoning to support their pet theories are being rhetorical or disingenuous, but there's a good chance that they too are unaware of some of the mistakes they make.
Yes, I agree. As I said in reply to another comment, there's no particular reason to expect or demand internal consistency and logical clarity. For the most part, people are just throwing out reasons as best they can to navigate their social ecology. The miracle, and the thing to be explained, is how we created institutions that strongly incentivise good reasoning (and which produce insights and truths in consequence). That's what science is.
I wonder whether an additional factor in explaining confirmation bias might be that it is a byproduct of abductive cognition, which favors inferences that cohere with one's prior experience, presuppositions, and beliefs, and in which the inferential process stops once the expectation of coherence is satisfied (cf. "good-enough language processing" and, of course, Relevance Theory).
There may indeed be something of that going on. But I wouldn't overstate it, because I don't see why we should especially expect coherence among one's beliefs. I think people commonly and ordinarily hold beliefs that are, if not directly contradictory, at least in tension with one another. Many religious and supernatural beliefs are examples. (I mentioned this in Part II of my essay 'On nature, science & culture'.)
Mutually incoherent beliefs can, however, be discouraged if they lead to a loss of standing in one's social ecology. You could even argue that the primary achievement of science, as a collective enterprise, is to have created such ecologies. Truths emerge in consequence.
Very incisive. But what if the other people who are wrong and irrational are actually better at persuading others? Can you still ignore them?
Hi Gilbert!
If (what I think are) bad reasons are persuasive for others, that just means the cognitive ecology of the audience (i.e. their prior beliefs) is quite different to mine. Whether that allows me to ignore the speaker's arguments depends on many things.
But the key point of my post isn't about ignoring people with bad arguments: it's about not getting angry with them. Because only rarely are they being duplicitous. More often than not, they are just sincerely wrong. That can matter, but it isn't as annoying.