Podcast: Play in new window | Download (Duration: 31:23 — 25.2MB)
How can we become more aware of our own biased thinking? Mahzarin Banaji, co-author with Anthony Greenwald of Blindspot: Hidden Biases of Good People, helps us uncover biases and stereotypes ingrained since childhood. Discover why racism and antisemitism aren’t dead, how we can dilute stereotypes, and the effectiveness of rational arguments against tightly held beliefs.
Tweets
- Everyone is biased – how can we spot the blind spot? Mahzarin Banaji weighs in on the #podcast
- Are rational arguments really useful against #stereotypes? #podcast #blindspot
Social Media
Book: Blindspot
Website: SpottheBlindspot.com
Take the test: Implicit.harvard.edu
Bio: MAHZARIN R. BANAJI received her PhD from Ohio State University and was a postdoctoral fellow at the University of Washington. She taught at Yale University for 15 years, receiving the Lex Hixon Prize for Teaching Excellence. She is currently the Richard Clarke Cabot Professor of Social Ethics in the Department of Psychology at Harvard. She served as the first Carol K. Pforzheimer Professor at the Radcliffe Institute for Advanced Study. At present, Banaji also serves as Cowan Chair in Human Social Dynamics at the Santa Fe Institute. She is the recipient of a Guggenheim Fellowship and the Diener Award for Outstanding Contributions to Social Psychology, and is Herbert A. Simon Fellow of the American Academy of Political and Social Science.
Video
Transcript
Peter: Welcome to the Bregman Leadership Podcast. I'm Peter Bregman, your host and CEO of Bregman Partners. This podcast is part of my mission to help you get massive traction on the things that matter most. With us today is Mahzarin Banaji. She has written the book Blindspot: Hidden Biases of Good People. She's a psychology professor at Harvard. The book is fascinating. It's fun because there are all sorts of tests in the book that reveal your own biases and uncover your own blind spots. That's what we're here to talk about with Mahzarin. Mahzarin, welcome to the Bregman Leadership Podcast.
Mahzarin: Thank you very much for having me, Peter.
Peter: What is a blind spot?
Mahzarin: All right. It's actually quite simple. The reason we selected the term blind spot is because it's a word that's well-known to people, and it has a very specific meaning. We all know that our retina contains a blind spot. When information falls on it, we don't see it. We use this word as a metaphor to speak about many aspects of our minds where information may reside unbeknownst to us, information that may affect our decisions but that we do not know about. That's the only reason we use it. Increasingly these days, I extend this out to speak about the blind spots that most of us encounter every day in our cars.
The reason I like the automobile example even better is because that one, we are fixing. We are figuring out ways of giving people information that otherwise would be lost because it’s in the car’s blind spot. By doing so, we know that we will make driving safer. That involved a certain kind of outsmarting. We know that the car will have a blind spot. Our eyes will have a blind spot. Likewise, we will have biases. They’re a part, an integral part of our daily thinking and decision-making. We can’t just wish them away or even want to have them all disappear because in many cases, they may serve a very useful function.
Peter: You're reducing the defensiveness around the blind spot, which is such a critical piece. Everybody has them, like a car has them. You don't look at a car and say, "That car is not a good car because it has a blind spot." You just expect it's going to have a blind spot that you have to work around. Can you describe a couple of examples of some blind spots? Then, let's talk about some ways of getting at them that may short-circuit the challenge of defensiveness and also help ourselves and others see what's in that blind spot.
Mahzarin: The best example I can give you is a riddle that we have used, one that you, I'm sure, are familiar with and your listeners are familiar with. It's a very simple riddle, and it goes something like this. A father and his son are in a car accident. The father dies. The son is rushed to the hospital. The attending surgeon says, "I can't operate on this boy. He's my son." You ask people, "How can this be?" We discovered that even recently, 80% of people cannot answer this riddle. I couldn't answer it in 1984. When I was asked this riddle, I said, "It's possible that the father who died in the car was the adoptive father, and then the surgeon who was called in to do the surgery could have been the biological father." Increasingly these days, people give a very interesting and correct answer. That is two dads. Certainly two dads is logically possible, but you and I know that the much more statistically correct answer, and by correct I just mean more probable, is that the surgeon was the boy's mother.
Why does this escape us? Why does it escape people whose mothers are surgeons? I just met a woman who didn’t get the right answer and was absolutely annoyed because her mother is a surgeon. She could not think of this answer.
Peter: How can that be a blind spot of a person whose mother is a surgeon?
Mahzarin: Because the culture leaves its thumbprint on our brain. That thumbprint is so powerful that it can even obliterate your own personal experiences because you can say … Your mind probably says, “But that’s my mother. She’s very special.” You put her in a different little place in your brain and she doesn’t count. What’s interesting is that as the number of women surgeons is increasing, you know, it’s something like 80:20 or 70:30, why does our brain act as if it’s 100:0? It does. That’s the way it works. Knowing that is important. Fixing it so that you don’t rely on the stereotype when it comes to thinking about possible people is very important.
One of the examples we have of this bias in action is the story of a woman called Tamika Cross, who was on a flight. The flight attendant asked for a doctor who could help a patient, and Tamika Cross offered help and was told to please sit down and not bother her, because the flight attendant was looking for a real doctor. Several times, you know, Dr. Cross offered her help and was denied, until somebody, another doctor, I think, figured out that she was one of the tribe. I think of this as a very interesting extension of the riddle problem: we capture the riddle problem in some abstract way, but these are the ways in which blind spots actually materialize, in ways that could cost somebody their life.
Peter: I was teaching my daughter about moral licensing, this idea that if I do something good, it almost gives me permission to do something bad, that that's how our mind works, and that it happens with bias, right? If I have a friend who is Jewish. I'm Jewish, so that's an easy one for me to use. I have a friend who's Jewish. Then, I could say, "Look, I'm not anti-Semitic. I have a friend who's Jewish." That allows me to psychologically be discriminatory towards other Jews because I'm not one of those. You know, I'm on the right side.
Mahzarin: Some folks just call this subtyping. The idea is, the word is subtyping, you take somebody who doesn't fit the stereotype, and you could do two things with it. You could accurately update your stereotype so that your stereotype now becomes weaker. Instead, we put the unusual one in a separate little bucket so that we can keep the original stereotype intact. That's certainly one theory of how we do this. Yes, my mother is a surgeon, but that's my mother. Some of my friends are Jewish, but they're not like other Jews. They don't keep kosher or whatever it might be. I think it's actually good for you to bring up anti-Semitism in this context because I think it's on the rise. I think we should talk about it because so many Americans believe that it's gone, that at least between Boston and Washington, you really would not see any evidence of it.
Then of course, we have to explain how, the day after the elections, so many New England towns seemed to sprout swastikas in public places, showing us that these stereotypes were always there and that different events in our culture allow the stereotype to pop out. That's a very powerful function that senior leaders have in their hands, that presidents and CEOs and leaders have, because what they're doing by their words and by their actions is saying, "You can release the control that you otherwise exert on your thinking."
Peter: Of course, it’s the same for Islam or Muslims. You know, I have a friend who’s Muslim, but they’re a different kind of Muslim than these other Muslims. You could use the evidence of your relationship with someone to dilute your stereotype or to reinforce your stereotype by creating a distinction between the person you know and everybody else in their tribe.
Mahzarin: That’s exactly right.
Peter: I love this idea of diluting the stereotype or diluting the blind spot and recognizing that when there's evidence contrary to the thinking that you have, it's not just an either/or. You said 100:0. It could be that, you know, most women are shorter than most men, I don't know what the statistic is around that, but when you meet women who are taller than men, you realize that the stereotype actually doesn't hold water in terms of making decisions about a woman or a man or size in general.
Mahzarin: That's right. The brain is a categorizing machine. It loves categories. We feel very good when we can see green as different from red. We like to put people into those categories. You remember the Saturday Night Live skit that used to involve the androgynous person called Pat. Pat would come very close to revealing his or her gender but never would. Every viewer wanted to know. We all want to know. Is Pat male or female? That's so important to us. We ask that as the very first question when a parent says they've had a child. We say, "Girl or boy?" It's impossible not to ask; even from day one, we want to know where to place them.
Peter: How do you help reveal your own blind spots or the blind spots of others without heightening their defensiveness or while getting around the defensiveness so that people can actually see them?
Mahzarin: You raise a very good question. Look, why should people not be defensive when you tell them about their blind spots, especially the kind we point to? It's not even something like "You don't take criticism well," which I think would be a lot easier to deal with than the harder things that we say. We say, "You're not the good person you think you might be," or rather, you're good at the conscious level, but somewhere in your brain is information, and that information, without your knowledge, is a part of who you are. You can deny it all you want, but we can show you it's there. Of course people are going to kick and scream. Why shouldn't they? If somebody said that to me... In fact, I showed real resistance to the result of my first test. When I took the test, the first IAT, in 1994-
Peter: Describe to our listeners what an IAT is.
Mahzarin: Yes, the Implicit Association Test is something that I co-developed with Tony Greenwald, who actually invented the test, sent it to me by e-mail, and asked me to just take it. Both of us had been collaborating for many years on building these kinds of methods that would allow us to get at what lies outside of our conscious awareness. I took the test, having taken many others, but none was like the IAT, the Implicit Association Test. This was a test that required me to sort faces into two categories. Hit the left button if you see a black face. Hit the right button if you see a white face. Easy enough. Then, I was told to put words into two categories, good things and bad things. Words like love and peace go to the right, and words like devil and bomb go to the left. Easy enough to do. Then, I was asked to do them together. White and good goes to the right. Black and bad goes to the left. Easy peasy.
Then came the switch, where I was asked to associate white with bad and black with good. Easy peasy, I thought, until I actually tried to do it. My fingers couldn't find the right key. I couldn't remember. I took much longer to complete the test. I made many more mistakes, and I was in a sweat. My first thought was, "Something's screwed up with the test." It took me a good five seconds, and this is after doing this kind of research for 20 years, to realize that maybe it wasn't the test that was screwed up. Maybe it was my head that was screwed up. And yet I took the test in many different forms before I was willing to actually believe that it was telling me something real about myself, but something that I was not going to be able to see.
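To make the mechanics concrete, here is a minimal sketch of how a score can be derived from the pattern Mahzarin describes: responses are slower and more error-prone when the pairing runs against the stereotype. The function and latency values below are hypothetical illustrations only; the published IAT scoring procedure (Greenwald and colleagues' "improved D-score") adds error penalties, practice blocks, and trial-level filtering that this sketch omits.

```python
# Minimal, simplified sketch of an IAT-style score computed from
# response latencies. All data here are hypothetical.
from statistics import mean, stdev

def d_score(compatible_ms, incompatible_ms):
    """Latency difference between the two pairings, scaled by the pooled SD.

    compatible_ms   -- reaction times (ms) when the key pairings match the
                       common stereotype (e.g., white+good / black+bad)
    incompatible_ms -- reaction times (ms) with the pairings reversed
    """
    pooled_sd = stdev(compatible_ms + incompatible_ms)  # SD over all trials
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

# Hypothetical trials: the reversed pairing is slower, the pattern
# Mahzarin describes experiencing on her own first test.
compatible = [620, 580, 640, 600, 610, 590]
incompatible = [850, 910, 780, 990, 870, 930]

print(f"D-score: {d_score(compatible, incompatible):.2f}")
# A larger positive value means the stereotype-consistent pairing was easier.
```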
I want to go back to this question you asked about resistance. I think we should try to place this resistance in the context of many, many, many other resistances that our species has shown in the face of difficult information. For my students, I often begin with Galileo, and I say, "Okay, here's a great man. He finds a telescope, and he takes it away from children who are playing in the streets. He turns it to the heavens, and he sees the moons of Jupiter dancing around in ways that the Vatican is not going to like." Talk about resistance. It took the Vatican 350 years to write an article saying that they agreed with Galileo. Who am I to complain when people don't believe what it is that we're saying? Look, in psychology, just as with the moral credentialing that you spoke about, there are thousands of little results that I think all add up to a very similar picture of human nature.
Take the Milgram experiments, which showed that human beings will shock other people and put them through intense pain just because somebody is asking them to do that. Now, isn't that a shocking result? When you tell people that they're likely to do that, or that 66% of them are likely to do that, of course they're going to resist it. I'm not that one. I would be the good person. Take the Darwinian revolution. We still haven't come to terms with Darwin. You know, I still go to places where they'll ask me to say that evolution is just a theory. Look, anything important says that we are not special: our planet is not at the center of the universe, our species was not placed here by God himself as the best one. All of these ideas have met with resistance, and what we're saying as psychologists today is yet one more step in the same direction. We are not the good people we think we are.
Peter: It makes sense that we would have that resistance. The question then is how do we help ourselves get over that resistance? How do we help others get over that resistance so that we can show up as the people we want to be, because if we stay in that resistance, then we’ll be in what we call unconscious incompetence, right? We will, in effect, have maintained a really happy view of ourselves while doing damage in the world. The question is how do we get over the defensiveness, the resistance to seeing that so we could actually become the people that we see ourselves to be?
Mahzarin: Right. The district attorney of New York City, Cyrus Vance, said it very well. He said, "There is no shame in having implicit bias. There is only shame in not doing something about it." I very much agree with that, because what's the value of new knowledge if we don't do something with it? Look, how do we change minds in general? Sometimes, technology helps. Galileo had a telescope. It's very hard to stick people's faces in front of a piece of glass, have them look at Jupiter and see something, and still deny it. I always envy scientists who have a technology like that, because it's so much easier for them than it might be for people like me who are dealing with things that are a bit hidden. In a sense, the IAT is like a baby telescope. It is incredibly crude. It is not in any way direct. The interpretation of Galileo's results went on for centuries, right? It wasn't the data. The data were there, but people were trying to explain them away in all these different ways. We're seeing exactly that with the IAT.
New York Magazine wrote an article. Somebody who's not an expert spoke to a bunch of non-experts who just decided that this can't be true because it doesn't agree with their politics. We're going to face exactly the same issues as anybody saying something difficult. Your question is, "So how do we move forward?" I do believe that the IAT is an arrow in our quiver that is priceless, because 30 million tests have been taken, and we are the sole recipients of the e-mail that comes to us from the people who have taken them. I can give you just a few examples.
Yes, we have naysayers. Yes, we have Neo-Nazi groups that target the lab, but we also have thousands and thousands of people who have written to say that it changed how they thought about themselves and that they have taken action to do certain things. Now, I'm not going to put too much credence in that, because we don't have studies to show whether they really changed, but I believe that there is a coming-to-terms with it. I'll give you an example. 25 years ago, I would ask my introductory psychology class a question on the first day: "Are you a biased person?" Something like 1% of students would answer yes. Today, over 65% say yes. I think that's not because they are more biased today than they were 25 years ago. If anything, I think it's the opposite, but I think today, there is a deep recognition that we are biased. It comes from people like Danny Kahneman, Herb Simon, the huge amount of work that's been done in the decision sciences, the moral credentialing work you mentioned. All of that together, I think, is teaching us that we are boundedly rational.
Peter: It's the behavioral economics that says that we don't act in rational ways. Although what you're saying, which is also very interesting, is that in order to help people recognize the irrational ways in which they may respond or act, to expose their hidden biases, we have to use very rational means. We have to show evidence that this is what's happening, so that in the face of that evidence, their rational minds can interact with their biases in such a way as to say, "Wow, I see what's happening here. I do. I get it." As long as we can appeal to the rational in revealing the irrational, then we believe it. That's part of what I'm hearing from you.
Mahzarin: I agree, but you and I know so well that rational arguments are limited. When I speak, I have to decide how much of just the evidence and the evidence and the evidence I should present. I know what that would do. Certain people will believe. Even if everybody believed the data, they would not say, "That's me." They would say, "Yes, that's true of other doctors. Other doctors don't give black men the same level of painkillers as they give white men, but that simply isn't true of me." My frustration with organizations and institutions is that they're not collecting data on individuals and showing it back to them.
Think about this. We know that there is enormous evidence now that pain medication is prescribed differently to African-Americans and white Americans, even though they report the same level of pain. This is true in many parts of the country, so it's not just particular regions. It's true up and down the disease spectrum, in every disease you can see where [inaudible 00:23:59] is involved. To me, the solution to this problem is extremely obvious. First of all, hospital systems have the data. They know it, okay? They will speak about it in hushed tones. My feeling is that if we really want to be rational, and with doctors there is no reason not to be, because doctors are absolutely consciously unbiased. They take an oath. Unlike any other profession, they take an oath that they will treat everybody the same and will treat them in order to relieve their misery.
What can they do? I think it's very simple. I think a hospital system needs to have two little graphs, two little bell curves, that show the average painkiller given to black people and white people in that hospital system from their own data. Whenever a doctor is prescribing a pain medication, that little graph should pop up. We should say to the doctor, "You get to decide what you want to do as an expert. Just be aware that in our hospital, this is the average milligrams we give to white people. This is the average milligrams we give to black people, and you decide." I'm very interested in trying out what I'm calling interventions in the moment, because teaching people in a two-hour seminar is not going to get them to change their behavior. What we teach them is too far removed from their daily actions. I do believe that if we can insert these reminders in the moment, we could see dramatic changes in behavior, because conscious attitudes are pure.
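As a rough illustration of this in-the-moment reminder, here is a minimal sketch. The drug name, doses, and patient data are entirely hypothetical; a real system would pull these distributions from the hospital's own records, as Mahzarin proposes.

```python
# Minimal sketch of a point-of-prescription reminder built from a
# hospital system's own historical data. All numbers are hypothetical.
from statistics import mean

# Hypothetical past doses (mg) of one painkiller, by patient group.
history = {
    "black patients": [18, 22, 20, 15, 19, 21],
    "white patients": [30, 28, 32, 27, 31, 29],
}

def prescribing_reminder(drug, dose_mg):
    """Pop up the hospital's group averages when a dose is entered;
    the clinician, as the expert, still makes the final call."""
    lines = [f"You are prescribing {dose_mg} mg of {drug}. In this hospital:"]
    for group, doses in history.items():
        lines.append(f"  average {drug} dose for {group}: {mean(doses):.1f} mg")
    return "\n".join(lines)

print(prescribing_reminder("Percocet", 25))
```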
Peter: It would be really cool to have a graph that says, "Here is the average of what we're prescribing. Here's where you fall in that average."
Mahzarin: I would love to do that. The technology allows it. You type in the letters, Percocet, and up can pop the graph for all painkillers or just Percocet. This is not hard to do. One of the things that I will agree with you on, on change, is that I find many organizations screaming about how much they would love to be able to fix this. "Tell us, what can we do?" But when we get close to telling people what they might do, I'm not seeing them do it. I think that we are in a moment when either we will break out of it and do it, or we won't, because it is now a question of will. As I always say to them, "You don't need yet another study." There are now thousands of studies using resumes, for example, that show that with exactly the same resume, people get very different callbacks.
My colleague, Devah Pager, did a study where she sent black and white men to apply for jobs in person. She showed that the likelihood that a black man in New York City was going to get a job was exactly the same as that of a white man of the same age and education level who was also a felon. Now, in the mind of the hiring manager, these two become equal. When I say hiring manager, I mean you and me. I mean us. I mean that in my mind, the two now look roughly the same. I ask myself, "Which one would be the better deal?"
Given those data, and given that there are hundreds of these studies, called audit studies, where we see differences in hiring and promotion, differences in whether you get a loan at a bank or not, whether you're told you can stay on a bus when you don't have enough money or have to get off, and where doctors look at the same CAT scans and decide differently about whether you need surgery, I think the data are there. It's now a matter of deciding that we would like to do something about it.
Peter: Interestingly, it seems like closing that gap has to do as much with our own personal development, drive for growth, humility, as anything else. The willingness to look at our own behavior and say, “I don’t know. I have things I need to learn. I can get better at this. I will believe that I have these biases or I have these blind spots.” The blind spots could be cultural as you’re describing most of these or they could be personal. Somebody says I yell at meetings. We have to approach it with a sense of humility that says, “I’ve got something to learn here, and it’s worth learning.”
Mahzarin: Yes, but why should you believe them over what you know about yourself? Here is my claim. In the individual case, you are just as right as they are. We don’t know. It’s your word against theirs which is why I think you should want to collect data. I think you should tape your meetings. You should have a blind coder count how many times you yell. I have had people sitting in the back of my classroom counting who I call on. We show them the data, yes.
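In the same spirit, here is a minimal sketch of that "your word against theirs" comparison: a blind coder's tally of a behavior from a taped meeting set against the person's own estimate. The behaviors, counts, and estimates below are all hypothetical.

```python
# Minimal sketch of the comparison Mahzarin describes: a blind coder's
# counts versus self-perception. All data here are hypothetical.
from collections import Counter

# Events a blind coder logged while watching a taped meeting
# (the coder does not know what hypothesis is being tested).
coder_log = ["calm", "yell", "calm", "interrupt", "yell", "calm", "yell"]

# What the person believes about their own behavior in that meeting.
self_estimate = {"yell": 0, "interrupt": 1}

observed = Counter(coder_log)
for behavior, believed in self_estimate.items():
    print(f"{behavior}: self-estimate {believed}, blind coder counted {observed[behavior]}")
```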
Peter: Bringing data out of the lab and into the actual work environment.
Mahzarin: I call this the small data movement.
Peter: I know. I love that. I think it's important. I can think of clients for whom recording their meetings and then playing the tape back would be surprising. And using a blind coder is like the example you give of having women in orchestras audition behind a screen: you don't know if it's a man or a woman, and suddenly more women are getting hired into orchestras than they would be otherwise.
Mahzarin: Yes. Somebody other than the harp player, yes.
Peter: My guest today is Mahzarin Banaji. Her book is Blindspot: Hidden Biases of Good People. Mahzarin, it’s such a pleasure having you on the Bregman Leadership Podcast. Thank you. You have a lot to teach all of us, including me. I’m so happy that you’ve been here as a guest.
Mahzarin: I learned a lot. Thank you very much, Peter.
Peter: If you enjoyed this episode of the Bregman Leadership Podcast, please subscribe and leave a review on iTunes. For more information about the Bregman Leadership Intensive, as well as access to my articles, videos and podcasts, visit peterbregman.com. Thank you to Clare Marshall for producing this episode and to Brian Wood who created our music. Thanks for listening, and stay tuned for the next great conversation.
Comments
I was glad to see that the doctor admitted that bias and personal perception of bias take a long time (25 years) to change. The people who expect changes to occur instantly are fooling themselves.
This podcast completely changed how I view things. It showed me that having biases is normal, and it taught me how to recognize the biases we inherit from the environment we grow up in. The examples the doctor provided were so effective and real. As the podcast progressed, I could feel the epiphany in my bones. It was an eye-opener for me.