If our reasoning were biased, then we’d notice it, right? Not quite. We are conscious of very few (if any) of the processes that influence our reasoning. So, some processes bias our reasoning in ways that we do not always endorse. This is sometimes referred to as implicit bias. In this post, I’ll talk about the theory behind our implicit biases and mention a couple of surprising findings.
The literature on implicit bias is vast (and steadily growing). So there’s no way I can review it all here. To find even more research on implicit bias, see the next two posts, the links in this series, and the links in the comments.†
1. Implicit Associations
There are a few ways to discover and gauge implicit biases. One way is the implicit association test (IAT) [Wikipedia]. The IAT is a categorization task. The task reveals how we associate certain categories — e.g., race, gender, religion, etc. — with certain feelings — e.g., positive feelings, negative feelings, etc. These associations are… well, discouraging. For instance, the race IAT reveals that people are faster to (a) associate a positive word with a white face than with a black face and (b) associate a negative word with a black face than with a white face (Greenwald, McGhee, and Schwartz 1998). Here’s the kicker: even people who are committed to not being racist seem to be implicitly racist — e.g., black people often show an implicit preference for white faces on the race IAT.
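To make the logic of the IAT concrete, here is a toy sketch of how a score could be computed from reaction times. The reaction times below are made up for illustration, and the scoring here is a simplified analogue of the real IAT scoring procedure (the actual test uses the more involved D-score algorithm of Greenwald, Nosek, and Banaji 2003), so treat this as a sketch of the idea, not the real instrument:

```python
# Toy sketch: faster responses on "congruent" pairings than on
# "incongruent" pairings are taken as evidence of an implicit association.
# Reaction times (ms) are hypothetical; the scoring is a simplified
# analogue of the IAT D measure, not the actual algorithm.
from statistics import mean, stdev

congruent = [620, 655, 640, 700, 610, 675]    # pairings the participant finds "easy"
incongruent = [780, 820, 760, 850, 790, 805]  # pairings the participant finds "hard"

def iat_effect(congruent, incongruent):
    """Mean latency difference divided by the pooled standard deviation
    of all trials (larger positive values = stronger association)."""
    pooled_sd = stdev(congruent + incongruent)
    return (mean(incongruent) - mean(congruent)) / pooled_sd

score = iat_effect(congruent, incongruent)
print(round(score, 2))  # positive: faster on congruent pairings
```

The point of dividing by the pooled standard deviation is that a raw millisecond difference is hard to compare across people who respond at different overall speeds; standardizing makes scores roughly comparable.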
Consider the implications. First, there is evidence that we are biased. Second, there is evidence that we’re not even aware of our bias. And third, there is evidence that we might be biased despite our best intentions to not be biased.
Disclaimer: the IAT is not well-suited to predict individuals’ propensity for biased behavior. The IAT is more useful for predicting group or system-level behavior (Greenwald, Banaji, and Nosek 2015). So don’t read too much into your own IAT results. Rather, consider the bigger picture (ibid.).
2. The Automaticity of Associations
One explanation of this evidence is that racial features automatically trigger positive or negative attitudes (Fazio, Sanbonmatsu, Powell, and Kardes 1986). The idea is this: if a thought is associated with a positive or negative feeling, then thinking that thought quickly and automatically triggers a positive or negative attitude, and that attitude influences our reasoning and judgments — often without our conscious awareness.
It’s like those times when someone does something that you really dislike. Your friend tells you that they’re voting for the other political party. The referee penalizes your favorite team. Or my pet-peeve: someone makes claims that do not follow from the evidence(!). You feel upset as soon as it happens. Your heart rate increases and a sense of internal pressure builds. You’re frustrated, threatened, and/or maybe even stressed. You didn’t reason your way to these feelings. They just happened.
3. Reasoning About Our Implicit Biases
Notice that this theory about implicit bias makes testable predictions! After all, if implicit bias can be the result of processes that we do not consciously endorse, then there is more to bias than our conscious reasoning.
And that is what experimental psychologists find. Researchers find people who behave in implicitly biased ways. Then researchers present these people with good reasons to not be biased. Sadly, these reasons — if they have any effect at all — have only a small effect on people’s implicitly biased behavior (Briñol, Petty, and McCaslin 2009: 291-302; Briñol, Petty, and Horcajo n.d.).
Reasoning About Racism
Consider an example. Gretchen Sechrist and Charles Stangor tested 50 undergraduate psychology students for racial bias, using the Pro-Black Scale (Katz and Hass 1988; see Sechrist and Stangor 2001). The students whose behavior was most biased were told one of two things: either (a) most of their peers accepted racial bias or (b) only a small minority of their peers accepted racial bias. After that, the students were asked to wait in a chair in the hallway. In the hallway was a line of seven chairs, and a black student was sitting at one end of the chairs (see Figure 1).
So, the students had to make a choice about where to sit in proximity to a peer who is black. And they had to make this choice right after being told that their peers accepted or rejected racial bias.
Researchers found that students would sit closer to or further from a black peer depending on whether their peers accepted or rejected racial bias. When students were told that 81% of their peers accepted racial bias, they sat further away. When they were told that only 19% of their peers accepted racial bias, they sat closer. The suggestion is that participants were encouraged to inhibit their own racial bias in response to their peers’ views about racial bias.
Reasoning Isn’t Enough
Unfortunately, this had only a small effect on participants’ bias. In fact, the number of chairs by which the two groups differed was just one. One!
In my experience with the literature, the effect sizes of reasoning on implicit attitudes are usually small. (And sometimes the effect sizes are not even reported!)
This suggests that our conscious reasoning might be limited when it comes to reducing implicitly biased behavior. That is the fourth major point of this post: Our desire or reasons to be less biased might not be enough to fully overcome our biases.
So now that you’ve been introduced to the literature, remember:
- We can be biased.
- We can be unaware of our bias.
- We can be biased despite our best efforts.
- We can be biased despite being aware of reasons not to be biased.
Part 1: I talk about the anxiety I had about being implicitly biased. “Am I discriminating without even knowing it?! How do I stop?!”
Part 2: [Jump back to the top]
Part 3: What are the real-world consequences of implicit bias? Research suggests implicit bias can impact hiring, pay, promotions, and more.
Part 4: What can we do about implicit bias? The research on de-biasing suggests a few strategies for limiting biased judgments and behavior. Spoiler: many people and institutions aren’t using these strategies.
Part 5: How can we address each other’s biases? It’s time for Feedback 101, featuring tips for developing a culture of helpful feedback and some examples of how to give feedback to one another.
† I am not being careful to distinguish between implicit bias and implicit attitude. There might be important differences, but they are not crucial for the purposes of this post.