Sunday 30 August 2020

Want to tame your biases and prejudices? Get out of your comfort zone


Tiffanie Wen

Police forces and many other institutions are turning to implicit bias training to help their staff recognise when they are relying upon racist assumptions and stereotypes. But does it really work?

The killing of George Floyd by police officers in Minneapolis three months ago and the shooting of Jacob Blake by police in Wisconsin have led the US to a period of reckoning. As thousands have marched in the streets to protest against racial inequality, many others have also been forced to ask some difficult questions about their levels of prejudice.

While some people mistake racism as being only overt prejudice, there is another crucial component that affects our decisions and actions towards others: implicit bias. An implicit bias is any prejudice that has formed unintentionally and without our direct knowledge – and it can often contradict our explicit beliefs and behaviours. Usually, it reflects a mixture of personal experience, attitudes around us as we have grown up, and our wider exposure to society and culture – including the books we read, television we watch and news we follow.

Many police departments in the US have pointed to schemes aimed at tackling implicit bias as evidence of their attempts to root out racism from their ranks. It is an appealing approach – police forces face many challenges when it comes to tackling racism among their officers. Powerful unions and state laws can protect police officers from investigations into misconduct, while officers who have been fired or have resigned in the past are often rehired by other forces that may be unaware of their career history. Dealing with these systemic issues often requires major structural and institutional change, while training individuals to recognise their own unconscious biases can seem relatively easy to implement by comparison.

While kneeling on a man’s neck until he stops breathing is an extreme act, implicit bias can lead to many forms of discrimination, and can often go unnoticed by those perpetrating them. It can affect how everyone in a society – not just police officers – behaves towards one another.

But the evidence for whether implicit bias training can work is mixed. Is it really possible to reduce biases we are barely aware we have? And can we really hope to rid ourselves of them completely?

The first step is recognising you have biases in the first place. One indication of implicit bias is something known as the implicit association test (IAT). I first encountered the IAT in 2004 as an undergraduate research assistant in a social cognition lab at the University of California, Davis. Developed in the mid-1990s, it is essentially a categorisation game: a series of words and/or pictures flash up on a screen, and the participant has to sort them as quickly as possible. For a racial bias IAT, you might see a jumbled series of words that have positive or negative connotations mixed with a jumbled series of black or white faces. Mistakes and response times are tracked.
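The scoring logic behind the test can be sketched in a few lines. The following is a simplified illustration of a D-score-style calculation (the reaction times and block labels are hypothetical, not data from any study): the gap in mean response time between the two pairing conditions, scaled by the spread of all trials.

```python
# A minimal sketch of how an IAT-style score might be computed.
# All reaction times (in ms) are hypothetical illustrative data.
import statistics

def iat_d_score(congruent_rts, incongruent_rts):
    """Simplified IAT D-score: difference in mean reaction time between
    the two pairing conditions, divided by the pooled standard deviation
    of all trials. A positive value means slower responses in the
    'incongruent' block, suggesting an implicit association."""
    mean_diff = statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)
    pooled_sd = statistics.stdev(congruent_rts + incongruent_rts)
    return mean_diff / pooled_sd

# Hypothetical trials: sorting tends to be faster when the pairings
# match an implicit association, slower when they are reversed.
congruent = [620, 650, 600, 640, 610, 630]    # e.g. white+good / black+bad
incongruent = [720, 760, 700, 740, 710, 750]  # pairings reversed

score = iat_d_score(congruent, incongruent)
print(f"D-score: {score:.2f}")  # larger positive score = stronger association
```

The real scoring algorithm used by Project Implicit includes extra steps (dropping outlier trials, separate practice and test blocks), but the core idea is the same: milliseconds of hesitation, aggregated over many trials, reveal associations we would not report explicitly.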

Even if we don’t consider ourselves to be biased, our behaviour, tracked down to milliseconds, usually says otherwise

If you are like a lot of Americans (of all races), you will probably find it easier to categorise white faces alongside good words on one side and black faces alongside bad words on the other than when the pairings are reversed.

In 1998, Harvard University made the test available online as part of the research they conduct for Project Implicit. So far, more than 25 million people have tried the IAT there. Calvin Lai, director of research at Project Implicit, says participation has “increased dramatically with the resurgence of the Black Lives Matter movement”.

Taking the test can be an eye-opening experience. Even if we don’t consider ourselves to be biased, our behaviour, tracked down to milliseconds, usually says otherwise. Lai says that their data from 2007 to 2015 shows that 73% of white people, 34% of black people, and 64% of people of other races have a pro-white, anti-black bias. This bias is so pervasive that even children as young as four, of all races, show a bias in favour of white people.

In our evolutionary past, implicit biases may have helped us make quick assessments about other people, animals or situations so we could decide to fight or run away. Today, however, they can lead us to discrimination – or worse.


Police departments are not alone in hoping that tackling unconscious bias can bring about change. Multinational corporations such as Starbucks have mandated implicit bias training in response to racist incidents involving their employees, and the BBC has made it mandatory for staff to go on an unconscious bias course.

The deep-rooted nature of implicit biases means they can be especially hard to overcome.

“Research shows that it’s easy to change attitudes for a short period of time, maybe a few hours, but it’s hard to change for more than a day,” says Jeffrey Sherman, a psychologist who ran the lab where I worked at the University of California, Davis.

Still, studies point to some strategies that might work.

In a 2012 study, participants were made aware of their bias by taking the IAT before viewing a presentation that discussed their own level of bias and how implicit bias can lead to discrimination and negative consequences for minorities. They then learned about cognitive strategies for reducing bias. Two months later, the participants had lower IAT scores, and indicated they had more concern about bias and greater awareness of bias in their own behaviour.

Among the strategies used for reducing bias were exercises aimed at individuation – judging a person on their individual characteristics, which can be used to test assumptions people commonly make about someone based on their group.

Patricia Devine, a psychologist at the University of Wisconsin-Madison who led the study, describes a situation in which a tall, young black man is walking on a college campus. “A student might assume he’s on the basketball team,” she says. In this situation, Devine suggests that if people check the assumption, they will likely realise there is no evidence for it other than the stereotype.

For every one person who confirms a bias, it takes three to disconfirm a stereotype to balance it out – William Cox

Devine says another approach can be to counter or replace stereotypes. But combating stereotypes is not easy. Research due to be published later this year by Devine and her colleagues Xizhou Xie and William Cox shows that encounters which contradict strongly held stereotypes are weighed less heavily in our minds. “We found that for every one person who confirms a bias, it takes three to disconfirm a stereotype to balance it out,” says Cox.

Returning to our example, that would mean the student would need to come across three black non-basketball players for every black basketball player they encounter just to keep their bias from strengthening.
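The asymmetry Cox describes can be illustrated with toy arithmetic (the weights below are assumptions chosen purely to match the 3:1 ratio quoted above, not figures from the study): if a disconfirming encounter carries one third the weight of a confirming one, it takes three of them to cancel each confirmation out.

```python
# Toy illustration of the 3:1 asymmetry between stereotype-confirming
# and stereotype-disconfirming encounters. The weights are assumptions
# chosen to match the ratio quoted in the text, not measured values.
CONFIRM_WEIGHT = 1.0        # assumed weight of a confirming encounter
DISCONFIRM_WEIGHT = 1 / 3   # assumed weight of a disconfirming encounter

def net_stereotype_strength(confirming, disconfirming):
    """Net stereotype reinforcement after a mix of encounters.
    Zero means the encounters balance out; positive means the
    stereotype is still being strengthened on net."""
    return confirming * CONFIRM_WEIGHT - disconfirming * DISCONFIRM_WEIGHT

print(net_stereotype_strength(1, 3))   # three disconfirmations balance one confirmation (net ~0)
print(net_stereotype_strength(5, 10))  # positive: stereotype still net-reinforced
```

The point of the arithmetic is that equal exposure is not enough: counter-stereotypical encounters have to substantially outnumber stereotypical ones before the association even begins to weaken.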

Devine isn’t sure we can ever fully get rid of our biases. She and other experts think that a better approach may be to use the strategies to better manage our behaviour.


In a 2017 follow-up to their 2012 study, she and her colleagues found that students who completed the bias habit-breaking intervention they had developed did not differ in their level of bias on the IAT from controls who did not complete the intervention – both groups decreased their bias. But they did find that, two years later, students who completed the training were more likely to publicly disagree on social media with what they thought was an online newspaper column in favour of racial stereotyping.

So while the students’ scores on the IAT did not appear to change, their long-term behaviour did, perhaps because they were more motivated to recognise and challenge bias, argue Devine and her team.

“I don’t know if we could ever get rid of those underlying associations, but we can make them less powerful in our thinking,” says Devine. “We can learn to recognise the ways in which [biases] lead us to conclusions that we understand are not appropriate or unwarranted, and then we can do something else that reflects your intentions and your values much more.”

Devine and her colleagues have also shown that interventions emphasising gender bias led to increased hiring of female faculty in science, technology, engineering, mathematics and medicine departments at the University of Wisconsin-Madison. While the proportion of women hired in departments that did not receive the anti-bias training stayed at around 32%, in departments that received the training it jumped to 47% over the subsequent two years.

Black faces looked more criminal to police officers – the more black, the more criminal

But can this sort of training be effective in the highly charged and confrontational environment of law enforcement and criminal justice?

A series of experiments at Stanford University found that priming 61 police officers from an unidentified urban department in the US with words related to crime – such as “violent”, “crime”, “stop”, “investigate”, “arrest” and “shoot” – increased their visual attention toward black faces over white faces.

The police officers were also shown pictures of black and white faces which, they were told, might belong to criminals (they were really pictures of Stanford University employees). They were asked to judge which ones “looked criminal”. Not only did the officers judge more black faces as criminal, but faces that had been rated by another group of study participants as more “stereotypically black” – which included having darker skin – were considered more criminal than those deemed to be “less black”. “Black faces looked more criminal to police officers – the more black, the more criminal,” the authors write.

The findings are only those of a relatively small sample of police officers from a single police department, and so care must be taken with generalising the results to all police forces.


But other research by Jennifer Eberhardt, a psychologist at Stanford University, has also found that the more “stereotypically black” a defendant is perceived to be, the more likely they are to be found guilty and sentenced to capital punishment. Prompting police officers to think of capturing, shooting or arresting also leads their eyes to settle on black faces.

How implicit racial bias factors into the decisions taken by officers when they choose to use deadly force is still being researched. Some simulations have found that police officers are more likely to mistakenly shoot unarmed black people, while others have shown that officers are more hesitant to shoot armed black suspects than armed white suspects.

Simulations, however, are different from the real world. And analyses of real-world data have found bias against minorities. One study of all 991 incidents of people being killed by police in the US in 2015, for example, found that black people were more than twice as likely as white people to be unarmed when they were killed by police. “It seems that… officers subconsciously perceived minority civilians to have been a greater threat than they were,” the authors write.

But researchers say that it would be wrong to conclude that police officers have more bias than the wider population.

There’s lots of research with police officers that shows they don’t have more bias than non-police officers do at the implicit level – Jeffrey Sherman

“There are clearly some cops that are just racist, and they won’t be motivated to reduce their implicit bias,” says Sherman. “But more generally they don’t seem more prone than the rest of us to demonstrating bias in a short window of time. There’s lots of research with police officers that shows they don’t have more bias than non-police officers do at the implicit level.”

What is different with police officers is the situations they find themselves in, which require lightning-fast decisions and actions – and the force, both legal and physical, that they are equipped with.

“When you have that kind of time and response restriction, bias is going to show,” says Sherman. “If you’re working in a bookstore, the consequences [of having implicit bias] are far less consequential than if you’re a cop.”

What we know is that officers can be trained to improve their decision making in such situations through computer simulations, and some experts are hopeful that means they can improve in real life.


Lorie Fridell, a professor of criminology at the University of South Florida, also works as an implicit bias trainer for police departments across the United States, including the NYPD. In her training, she presents some of Eberhardt’s work to officers to illustrate how the majority of people have an implicit bias that associates certain races with crime. She then teaches them strategies for managing their biases.

“Something just really quick and easy, for instance, is what we call the ‘litmus test’,” she says. “Ask yourself, would I be stopping this person but for the fact that they are – fill in the blank: transgender, male, Hispanic, homeless, and so forth.”

Another strategy she emphasises is for officers to slow down whenever possible and focus on collecting more data instead of relying on stereotypes.

Given the scarcity of research, this is not the kind of bias training or anti-racist training I would reach for in this moment – Betsy Paluck

“We also emphasise to beware of the biases of others,” she says. “We talk about ‘complicit bias’ – meaning if you detect bias in your colleague and you don't do anything about it, you're guilty of complicit bias. But we also talk about bias on the part of community members, or ‘profiling by proxy’. This is when community members call the police based on their own biases.

“But what we want to say to the officers is have your antenna up for the possibility that you're being called to this scene based on a community member’s biases and you shouldn’t just pick up their biases in your policing.”

But not everyone is convinced of the effectiveness of implicit bias training, particularly when some police departments are paying contractors millions of dollars for such training.

“Some believe that if implicit bias training includes actions and ways to channel motivation to control implicit bias, that it can be successful,” says Betsy Paluck, a psychology professor at Princeton University, who co-authored a forthcoming meta-analysis on the effectiveness of prejudice reduction strategies.


And there are other factors that can also influence how prone to unconscious bias and stereotyping we are. Fatigue, for example, can increase the implicit association between race and crime. It can also inhibit an officer’s ability or willingness to deescalate a situation.

“We found that officers who were sleep deprived had higher scores for implicit association between black Americans and weapons,” says Lois James, a researcher at Washington State University who studies fatigue in policing. “There are two potential possibilities. One is that sleep affects base implicit bias. Or, the more likely, fatigue affects their ability to monitor and block their implicit bias. We have also found that officers who are tired are either less able or less willing to de-escalate and tend to be quicker to shoot.”

It suggests that efforts to reduce levels of fatigue among police officers might also indirectly reduce an officer’s reliance on implicit bias.

Long-term social contact – in person or online – with other groups is the best way to change your attitude about those groups

Some interventions designed to manage biased behaviour have already proven helpful. Eberhardt and her colleagues worked with the Oakland Police Department to reduce the number of racially motivated traffic stops by adding a simple yes-or-no question to the top of the officers’ form: “Is this stop intelligence-led?”

They found that in the year following the change to the form, stops of black people dropped by 43% – and far from causing the city to become more dangerous, the crime rate actually continued to drop.

But if we are determined to reduce our biases so that they are more aligned with our explicit beliefs, what can we do beyond regulating our behaviour?

Experts suggest changing our experiences and environment will change our biases in the long run.

Fridell volunteered at a homeless shelter for seven years, which she says changed her attitudes and implicit bias toward homeless people. She shares the anecdote in her courses to emphasise the long-term commitment it takes to change biases. It is also the logic behind community outreach initiatives like Coffee with a Cop, which allow community members to connect with police officers in a non-law-enforcement context, and vice versa.

Lai agrees that long-term social contact – in person or online – with other groups is the best way to change your attitude about those groups. For police forces trying to reduce biases in their ranks, it is a valuable lesson.

“We find it is one of the most robust ways to reduce prejudice and discrimination, be it implicit or explicit,” Lai says. “The ideal form of contact is actual interpersonal contact in daily life.”
