In an article cheekily titled “We are all confident idiots,” David Dunning discusses a series of studies showing that education can lead people to believe they know more than they actually do. This is not limited to those who merely pretend to know more than they do, though I think the distinction between the deceptively confident and the genuinely confident is an important one. Disturbingly, Dunning seems to suggest that we might all, in some area and to some degree, be deluded about the extent of our own knowledge. The relationship of this phenomenon, named the “Dunning-Kruger effect,” to what has been described as our Post Truth climate intrigues me. As Stevie Seibert Desjarles points out in her post on Lee McIntyre’s book Post Truth, a central problem is not any single false belief but rather the overall “corruption of the process by which facts are credibly gathered and reliably used to shape one’s belief about reality” (McIntyre 11, qtd. in Desjarles). What role does the Dunning-Kruger effect play in the devaluing of truth, what additional insights can we gain from the discoveries that have been made about it, and how can theory support our inquiry into this subject?
As part of a study co-run by Dunning, a group of high school students was tested both before and after taking introductory biology. The tests consisted of statements, some true and some false, with a choice of responses ranging from “Strongly disagree” to “Strongly agree” and including “I don’t know.” In the second testing segment, the participants agreed and strongly agreed with a higher number of the true statements than in the first, which is the desired outcome of a class in introductory biology. However, they also expressed greater confidence in the answers they got wrong: when faced with a false statement, they were more likely to select “Strongly agree” in the second testing segment, despite the fact that “Agree” and “I don’t know” remained equally available alternatives.
“The trouble with ignorance,” Dunning writes, “is that it feels so much like expertise.” Ideally, education would lead to a clearer realization of how much more there is to learn about a subject. But the ability to recognize what we don't know or where we can improve would sometimes, paradoxically, require the very expertise that we lack. Perhaps there should be a name for the threshold at which you begin to realize that taking one class in a subject might not call for a sense of total mastery over it. Incidentally, both the most competent and the least competent are likely to telegraph the same high levels of confidence, positioning both to more easily win their peers’ admiration and to attain positions of leadership. Unsurprisingly, this theory has been used to explain the ill-informed comments made by some of our political leaders.
An influx of facts does not, in itself, convince people for whom a false belief is strongly linked to worldview. As McIntyre also suggests, facts can be made to support whatever we already believe. This reminds me of Jean-François Lyotard’s illustration of why efforts to convince Holocaust deniers can fail. Deniers might demand, as evidence, eye-witness testimony about the gas chambers from someone who was actually inside one, testimony that is impossible to produce, since no one inside a gas chamber survived. As a consequence, neither the presence nor the absence of eye-witness testimony would sway them. This is part of Lyotard’s reasoning for rejecting past narratives about how we verify truth and thereby gain knowledge. Historically, philosophers attempted to pinpoint universal methods of explaining how we come to knowledge, a set of universal conditions for verifying truth, which we might conceptualize as an epistemological meta-narrative. In response, Lyotard argues for a case-by-case approach.
As a more recent and less extreme example, some people still mistakenly believe that Barack Obama was born outside the United States, despite the fact that his birth certificate was made public. The birthers’ reasoning seems to be that if Obama were born outside the United States, then a grand conspiracy would’ve been necessary to conceal it. If such a conspiracy existed, then those responsible might also have the means to forge a birth certificate. Mistrust of mainstream media would further taint any information the media provided as potentially corrupt. Therefore, in the minds of some birthers, both the absence and the presence of a birth certificate could be interpreted as validation for their misbelief.
There is some hope for preventing the propagation of false beliefs, as both McIntyre and Dunning argue. Desjarles lays out the strategies that McIntyre proposes in her post, and Dunning offers similar tips. According to Dunning, going a step further than simply discounting a false belief (say, by also providing an accurate set of facts to fill the void it leaves) does increase the likelihood that the truth will be accepted. Intriguingly, the approach that Dunning identifies as having had the most success is the Socratic method. In the classroom, teachers can prompt students to pose questions about how an incorrect conclusion was reached. Going step by step through the reasoning to suss out any weak points in the logic can more effectively clear up misconceptions, make the truth more memorable, and strengthen analytical skills.
Moreover, everyone can make an effort to root out any false assumptions we might hold and to recognize fields about which we might be stating stronger opinions than we truly have the expertise to give, areas where we could benefit from doing more research. With this objective, Dunning recommends asking someone to play devil’s advocate if you’re working in a group, or even playing your own devil’s advocate by posing counter-arguments to your own suppositions.
To that end, how can I pose a counter-argument to myself, here? Perhaps, in another sense, we aren’t confident enough about our potential “idiocy.” After all, to begin a new endeavor, sometimes you have to feel confident enough to take the risk of looking like an idiot. You might be especially in need of an injection of confidence if you’re a woman. In their article “The Confidence Gap,” Katty Kay and Claire Shipman argue that women are more likely to experience Imposter Syndrome and less likely to convey the confidence needed to succeed in their professions. For instance, one study found that women were unlikely to apply for a job unless they met 100 percent of the qualifications listed, whereas men would apply if they met only 60 percent. A gender-focused study found that women are less likely to exhibit the Dunning-Kruger effect and, perhaps as a result, less likely to apply for opportunities in general.
Moreover, the fear of making mistakes or looking like an “idiot” can prevent any of us from taking potentially rewarding intellectual or creative risks. I posit that if you never look like an idiot, even in some small way, never try something that fails, and never admit to being wrong about something, then you might not be leaving your comfort zone and, ultimately, not trying hard enough. To paraphrase Spinoza, we do not yet know what a body can do. To this end, design theory urges those in pursuit of better-functioning prototypes to fail early and often.
This design adage seems especially appropriate if we shift our perspective to a Deleuzean epistemology. We can fall into the habit of viewing knowledge as a simple stockpile of verified facts, as much as our memory can contain, and sometimes that’s what it is. But it seems that the more we can foreground the process by which meaning is made, and become more aware of how we come to the conclusions we do, the better. Deleuze’s epistemological theory makes this process the focus. In What is Philosophy?, Gilles Deleuze and Félix Guattari write that making meaning in philosophy entails more than simply committing verified facts to memory (and more, even, than making a prolonged effort to grasp a particularly dense passage of philosophical discourse). Instead, it’s about creating concepts and exploring the scope and usefulness of those concepts in the context of our interrogation of reality. But I think that’s an idea for another blog post.