Implicit Bias | Part 5: Bias Feedback

Apparently, when I impersonate conservatives, I do it with a southern US accent (e.g., “‘Murica!”, “Don’t mess with Texas!”, etc.). I don’t intentionally adopt the accent. In fact, I never even knew I was doing it until my partner pointed it out to me! Without my partner’s third-person perspective, I might never have noticed. I might have just continued mocking people with southern accents. In fact, that wouldn’t be surprising given what we learned in this series [Part 1 – Part 5]. So if we want to do something about our biases, then we would do well to seek this kind of third-person feedback. Let’s call it bias feedback.

1.  Feedback

The bias feedback I received from my partner can be characterized as bottom-up and informal. Bottom-up because it came from a peer rather than from a position of authority. And informal because it happened freely in ordinary conversation rather than as part of some kind of compulsory process. Many people are uncomfortable with informal, bottom-up feedback. So if informal, bottom-up feedback is to be accepted in a given context, then it might have to be integrated into that context’s culture. There might be a few ways to do this. Continue reading Implicit Bias | Part 5: Bias Feedback

Implicit Bias | Part 4: Ten Debiasing Strategies

At this point it’s pretty clear why someone would be worried about bias. We’re biased (Part 1). Consciously suppressing our biases might not work (Part 2).  And our bias seems to tamper with significant, real-world decisions (Part 3). So now that we’re good and scared, let’s think about what we can do. Below are more than 10 debiasing strategies that fall into 3 categories: debiasing our stereotypes, debiasing our environment, and debiasing our decision procedures.

1.  Debiasing our stereotypes

In the last post, we learned that implicit attitudes and stereotypes can badly affect our judgments. Here is how stereotypes can be formed and reformed.

Conditioning

In conditioning, we repeatedly present ourselves with a pair of stimuli until we begin to associate one thing with the other (De Houwer, Thomas, and Baeyens 2001; Hofmann, De Houwer, Perugini, Baeyens, and Crombez 2010). So, for example, if someone consumes media that repeatedly presents certain ideas or groups in a negative light, then they can cultivate a negative implicit attitude about those ideas or groups (Arendt 2013; Arendt & Northup 2015; Matthes & Schmuck 2015). Or if a certain profession is dominated by white men, then people will associate membership in this profession with being white and male — and this might have a self-reinforcing effect on the profession’s makeup.

Counterconditioning

We can also use conditioning against our existing biases. Some call this counterconditioning (Stewart, Latu, Kawakami, and Myers 2010; Vansteenwegen, Baeyens, and Hermans 2015). One fairly easy and effective way to countercondition a negative stereotype is just to imagine a positive version of a negatively valenced concept (Blair, Ma, and Lenton 2001; Markland et al. 2015).

Here are examples of how we might use counterconditioning. When nominating, recommending, or choosing someone for an opportunity, we might allot some time to think of non-stereotypical candidates from underrepresented groups. If we’re teaching, then we might assign material from underrepresented groups. If we’re an office manager, then we might choose imagery for the office that represents a more diverse group of people.

2.  Debiasing our environment

It turns out that altering the decision environment can also support debiasing. So if you have a say in the decision environment, then think about the following.

Ask for justification

Put decision-makers in a position that forces them to justify their decisions to a third party (Lerner & Tetlock 1999; Simonson & Nye 1992). The point of this is to prevent overconfidence in our judgment, e.g., optimism bias (Ben-David, Graham, and Harvey 2013; Moore & Healy 2008).

Consider alternatives

Try to diversify the decision-makers (Bell, Villado, Lukasik, Belau, and Briggs 2011; Shor, Rijt, Miltsov, Kulkarni, and Skiena 2015). The hope is that your intuitions will be challenged. After all, “Training in normative rules often fails when people have strong intuitions and do not pause to think more deeply (McKenzie and Liersch 2011)” (Soll, Milkman, and Payne 2014). So find people who will question your intuitions (Koriat, Lichtenstein, & Fischhoff 1980). If you can’t find someone to disagree with you, disagree with yourself: simply imagine arguments for different conclusions (Herzog and Hertwig 2009, 2014; Keeney 2012).

Make information easier to consume

Remove irrelevant data, e.g., a job candidate’s identity, the status quo, unnecessary complexity, etc. (Célérier & Vallée 2014; Thaler & Sunstein 2008). Present the remaining data in a way that allows for easy comparison (Heath & Heath 2010; Hsee 1996; Russo 1977). And present the data in the most relevant scale (Camilleri & Larrick 2014; Burson, Larrick, & Lynch 2009; Larrick and Soll 2008). Talk about probabilities in terms of relative frequencies or, better yet, represent probabilities with visualizations [examples] (Fagerlin, Wang, and Ubel 2005; Galesic, Garcia-Retamero, & Gigerenzer 2009; Hoffrage, Lindsey, Hertwig, and Gigerenzer 2000).

3.  Debiasing our procedures

Certain reasoning strategies can help you with debiasing (e.g., Larrick 2004; Fong and Nisbett 1991; Larrick, Morgan, and Nisbett 1990). Here are some ways that we might debias our decision procedures.

First, decision-procedure checklists, criteria, and rubrics. Use them, especially if you are repeatedly making one kind of judgment, e.g., grading, reviewing applications, etc. (Gawande 2010; Heath, Larrick, and Klayman 1998; Hales & Pronovost 2006).

Second, quantitative models. Predictive linear models often do better than our own reasoning (Dawes 1979; Dawes, Faust, and Meehl 1989; Bishop & Trout 2008). So if we have a good model by which to make a decision, then we should be wary of deviating from the model unless we have good reason to do so.

Finally, the status quo. We tend towards the status quo. So if it is already well-known that a particular choice optimizes our desired outcome(s), then default to that choice (Chapman, Li, Colby, and Yoon 2010; Johnson, Bellman, and Lohse 2002; Johnson & Goldstein 2003; Madrian & Shea 2001; Sunstein 2015). And — going back to section 2 — if people want to opt out of the optimized status quo, then ask them for justification. 😉

Conclusion

As you can see, there are many debiasing tools in our toolkit. We can debias our stereotypes, our environment, and our procedures. While these tools have been shown to help, they might not entirely solve the problem. And there may be at least one more option at our disposal: feedback. That will be the topic of Part 5.

Series Outline

Part 1: I talk about the anxiety I had about being implicitly biased. “Am I discriminating without even knowing it?! How do I stop?!”

Part 2: What is implicit bias? Check out this post to learn about the theory and some of the findings.

Part 3: What are the real-world consequences of implicit bias? Research suggests implicit bias can impact hiring, pay, promotions, and more.

Part 4: [Jump to the top]

Part 5: Can feedback help? What if people are not used to giving each other feedback about their biases? Check out how to give and encourage feedback about our biases.

Implicit Bias | Part 3: Workplace Bias

Think about decisions that people make every day. A committee decides who to hire. A supervisor rates an employee’s performance. A teacher grades a student’s assignment. A jury arrives at a verdict. A Supreme Court judge casts their vote. An emergency medical technician decides which victim to approach first. A police officer decides whether to shoot. These are instances in which workplace bias can have significant consequences.

I won’t be able to highlight every area of research on workplace bias. So I cannot delve into the findings that police officers sometimes show racial bias in decisions to shoot (Sim, Correll, and Sadler 2013, Experiment 2; see Correll et al. 2007 and Ma and Correll 2011, Study 2, for findings that indicate no racial bias). And I cannot go into detail about how all-white juries are significantly more likely than other juries to convict black defendants (Anwar, Bayer, and Hjalmarsson 2012).

1.  Gender bias at work

Instead, I’ll focus on the instances of workplace bias to which most people can relate. If you’re like most people, then you need to work to live, right? So let’s talk about how bias can affect our chances of being hired. Continue reading Implicit Bias | Part 3: Workplace Bias

Implicit Bias | Part 2: What is implicit bias?

If our reasoning were biased, then we’d notice it, right? Not quite. We are conscious of very few (if any) of the processes that influence our reasoning. Some unconscious processes bias our reasoning in measurable ways. This is sometimes referred to as implicit bias. In this post, I’ll talk about the theory behind our implicit biases and mention a couple of surprising findings.

The literature on implicit bias is vast (and steadily growing). So there’s no way I can review it all here. To find even more research on implicit bias, see the next two posts, the links in this series, and the links in the comments.† Continue reading Implicit Bias | Part 2: What is implicit bias?

Implicit Bias | Part 1: Bias Anxiety

The research on bias is kind of scary. It not only suggests that we are biased; it suggests that we are unaware of many of our biases. Further, it suggests that trying to suppress our biases can easily backfire. So, despite our best efforts, we could be doing harm. And yeah: that might provoke a bit of anxiety. That’ll be the topic of this post.

In future posts, I’ll talk about the theory behind our biases [Part 2], how bias impacts the workplace [Part 3], a dozen debiasing strategies from the research [Part 4], and a few tips for giving (and receiving) feedback about our biases [Part 5].

Related post: The Bias Fallacy (what it is and how to avoid it).

Continue reading Implicit Bias | Part 1: Bias Anxiety