Human resources professionals have been schooled in the social and legal issues surrounding discrimination since the passage of the Civil Rights Act of 1964. Prohibited employment practices are typically based on employee behaviors that violate an employer's personnel policies or state or federal law. These behaviors are assumed to be conscious, meaning that the employee consciously endorses his or her behavior.
In this context, an explicit bias reflects the prejudices, stereotypes, or beliefs that an employee consciously endorses. For example, an employee may explicitly believe that women should not be promoted or that people of color are not good risks for a mortgage loan, yet not act on such an explicit bias because of the employer's personnel rules and state and federal laws.
But what if an employee is not consciously aware of his or her attitudes, prejudices, and stereotypes? For example, imagine a manager who explicitly believes that people of color are equally suited for careers in the professions. Despite this egalitarian belief, the manager might nevertheless unconsciously associate people of color with prejudicial attitudes and stereotypes. This implicit association might lead him or her to behave reactively, automatically, and unconsciously in any number of biased ways, from not trusting feedback from minority co-workers to hiring white men instead of equally qualified people of color. The science of implicit cognition suggests that we do not always have conscious, intentional control over the social perceptions, impression-formation processes, and decision-making judgments that motivate our behaviors (Brownstein, 2017).
Uncovering the Systems at Play in Implicit Biases
Cognitive psychologists Kahneman (2011), Ariely (2008), and many others agree that the human mind consists of two systems: System 1 and System 2. System 1 is reactive; it operates automatically and in a matter of milliseconds, with little effort and no sense of voluntary control. System 2 is deliberate and allocates resources for effortful mental activities, including complex computations. The operations of System 2 are accompanied by agency, choice, and concentration. We deliberately choose to engage System 2, whereas System 1 operates automatically and outside our conscious control.
System 1 serves us well in most daily activities, including driving to work, using a fork to eat a salad, and performing routine work tasks. It continually guides our thoughts, attitudes, and beliefs without our conscious awareness, as if we were on "autopilot." System 1 helps us manage our limited cognitive capacity by rapidly associating and categorizing overwhelming stimuli, and it is profoundly useful, especially in times of surprise and danger. From this perspective, biases are necessary, useful, and practical for the human race.
Occasionally, System 1 fails us. We may jump to a conclusion that we later regret, such as hitting send on an email that should never have been sent. Over time, the System 1 brain links experiences by forming neural pathways that grow stronger every time an association is recognized and affirmed. When we think of a computer engineer, we may picture a white male because our brain has learned to associate computer engineers with white men.
It is "pleasing" to the brain when these associations are confirmed. An Asian female computer engineer who becomes part of the mix may be unsettling to the System 1 brain. This comfortable fit that associates males with computer engineers may result in an unconscious and automatic bias (Chugh, 2004).
Measuring Implicit Biases
The Implicit Association Test (IAT) is the most frequently used measure of unconscious and automatic attitudes, stereotypes, and prejudice. The Harvard University Project Implicit website lists fourteen forms of the IAT, covering topics that include race, gender, age, disability, sexuality, and weight. The IAT measures the strength of associations between concepts (e.g., black people, gay people) and evaluations (e.g., good, bad) or stereotypes (e.g., athletic, clumsy).
For example, the Race IAT presents respondents with four categories of stimuli: white faces, black faces, pleasant or "good" words, and unpleasant or "bad" words. Respondents first complete a stereotype-congruent trial, in which they sort white/good pairings from black/bad pairings. Respondents then complete a stereotype-incongruent trial, in which they sort white/bad pairings from black/good pairings. Implicit racial bias is demonstrated by measuring the respondent's response latency; in other words, the difference in the time it takes to complete the stereotype-congruent and the stereotype-incongruent trials.
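To make the latency comparison concrete, here is a minimal Python sketch of how such a score might be computed from trial response times. The function name and data are hypothetical, and actual IAT scoring (e.g., Greenwald's improved D-score algorithm) involves additional steps such as error penalties and trial filtering; this is only the core idea of dividing the latency gap by the pooled variability.

```python
from statistics import mean, stdev

def iat_d_score(congruent_ms, incongruent_ms):
    """Rough IAT-style effect size: how much slower the respondent is on
    the stereotype-incongruent trial than on the congruent trial, scaled
    by the pooled standard deviation of all latencies."""
    latency_gap = mean(incongruent_ms) - mean(congruent_ms)  # gap in ms
    pooled_sd = stdev(congruent_ms + incongruent_ms)         # pooled variability
    return latency_gap / pooled_sd

# Hypothetical response latencies (milliseconds) for one respondent.
congruent = [640, 710, 680, 655, 700, 670]    # sorting white/good vs. black/bad
incongruent = [820, 905, 860, 890, 845, 875]  # sorting white/bad vs. black/good

print(f"D-score: {iat_d_score(congruent, incongruent):.2f}")
# A positive score means faster responses on the stereotype-congruent
# trial, i.e., a stronger implicit association in that direction.
```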
Greenwald and Krieger (2006) reported a meta-analysis examining the predictive validity of the IAT. Their analysis of sixty-one studies concluded that measures of both implicit and explicit bias were significantly correlated with measures of behavior. Although predictive validity was greater for explicit measures than for IAT measures overall, "within the critical group of studies that focused on prejudicial attitudes and stereotypes . . . the predictive validity was significantly greater for the IAT measures." Further evidence of the utility of the IAT comes from Jost and associates (2009), who reviewed data from more than 700,000 respondents on the Race IAT and concluded that over 70% of white participants more easily associated black faces with negative words. In contrast, 40% of black participants show a pro-black implicit preference, whereas 35% show a pro-white implicit preference and 25% show no overall preference (Project Implicit).
The Pervasiveness of Implicit Biases
We know from reported research that implicit biases are pervasive. Everyone possesses them, even people with avowed commitments to impartiality such as judges, physicians, and schoolteachers. The implicit associations we hold do not necessarily align with our declared beliefs or even reflect positions we would explicitly endorse. It is possible to say (explicitly) "I do not see color at work," yet (implicitly) hire white applicants over equally qualified minorities. Current IAT research on the effects of implicit bias concludes that 70% to 85% of Americans have an automatic preference for whites over blacks, younger over older, straight over gay, light skin over dark skin, and thin over overweight (Brownstein, 2017; Project Implicit; Kirwan Institute).
Several of our most trusted institutions have been the subject of research on the effects of implicit bias. These include the civil and criminal justice system (judges, prosecutors, public defenders, and police), the health care system (physicians, nurses, and other specialties), human service agencies (intake personnel and program services for the mentally ill, children, families, the elderly, and the disabled), and public and higher education (public school teachers, university faculty, administrators, and students).
What We Can Do Regarding Implicit Biases
The evidence that our employees hold unconscious biases is overwhelming, but we also know that these biases are malleable and subject to change (Rudman, 2001). By simply realizing and accepting that we all have biases, employers can be vigilant and help others recognize the impact on employee selection and talent management. Training is the most often cited response for reducing or eliminating these biases, yet traditional approaches to training are not sufficient. Given that there is no easy formula for resolving the challenges of unconscious bias, what might HR professionals do to reduce its impact?
Here are some practical recommendations:
1. Readiness to make decisions. Recognize the conditions under which unconscious biases are most likely to impair employees' readiness to make effective decisions. Workplaces are rife with pressure to work harder and faster, resulting in unhealthy stress. Changes in the economics of the business environment and in technology, along with increased participation in social media, increase the likelihood of distraction. The cumulative effect of these factors is physical, mental, and emotional fatigue, all of which deplete an employee's capacity to make bias-free decisions. According to Soll and colleagues (2016), we can improve our decision making simply by avoiding decisions when we feel depleted or fatigued. So, perhaps schedule that applicant interview first thing in the morning (after your cup of coffee).
2. Interventions that focus on cognitive strategies. One of the simplest strategies is to "think of the opposite" and to "look at it differently," especially when selecting and promoting personnel. Ask whether the preferred applicant mirrors the hiring manager in terms of race, gender, age, or disability. A "similar to me" response may indicate an unconscious bias toward applicants who resemble the manager rather than those who represent a more diverse applicant pool, even when the diverse applicants have equal or superior qualifications, experience, and skills. Another strategy is to temper optimism (the tendency to overestimate the chances of success) by focusing on evidence. Managers can improve "judgmental accuracy" by harnessing the "wisdom of crowds" to improve decision quality. Put simply, include others in the decision. If others are unavailable to provide input, a manager may invoke his or her "wisdom within" by inserting a time delay between decision alternatives, especially if upset, tired, hungry, or under externally imposed time pressure. Another tactic is to assume the first decision is flawed and make a second decision using a different decision method (i.e., a System 1 reactive guess followed by a more deliberate System 2 response; Soll et al., 2016). Even the simplest linear models, probabilities, algorithms, and checklists can be more effective than expert judgment when making decisions; a sketch of one such linear model follows this list.
3. Interventions that focus on modifying the environment. Although our organizations get a bad rap for any number of ills, Kahneman (2011) suggests that organizational structures and systems actually create predictable consistency in certain business processes that modify employee behavior, thus reducing the potential for bias. Federal and state laws requiring a job analysis and job descriptions shape the recruitment and selection process. For example, structured oral interviews are used to ensure consistency across candidates. These structured processes "can reduce the patterns of unconscious bias that take over when people are just trusting their gut" (Ross, 2015). Further, a "choice architect" is someone who is actively engaged in designing an organization's environment. These architects create "nudges" that do not restrict choice but use psychological principles to influence behavior in the direction of "goodness" (Thaler & Sunstein, 2009). Consider the implicit bias of giving more credence to the present than to the future when it comes to retirement savings. Organizations can make enrollment in their savings program the default, requiring an employee to actively opt out (sketched after the linear-model example below). Other nudges that encourage System 2 deliberation include planning prompts that increase reflection and reduce procrastination, planned interruptions that encourage reflection on important decisions, and requiring an "active choice" among multiple options instead of mindlessly avoiding a choice (Soll et al., 2016).
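To make item 2's "simple linear model" concrete, here is a minimal Python sketch of a fixed-weight scoring model for screening applicants. The criteria, weights, and candidate data are hypothetical; the point is only that one pre-agreed formula is applied to every candidate, leaving less room for "similar to me" gut reactions.

```python
# A minimal sketch of a fixed-weight linear scoring model for screening
# applicants. Criteria, weights, and candidate data are hypothetical;
# the same pre-agreed formula is applied to everyone.

WEIGHTS = {
    "years_experience": 0.3,      # normalized to 0-1 before weighting
    "skills_test": 0.4,           # standardized test score, 0-1
    "structured_interview": 0.3,  # averaged panel rating, 0-1
}

def score(candidate: dict) -> float:
    """Weighted sum of pre-defined, job-related criteria."""
    return sum(WEIGHTS[k] * candidate[k] for k in WEIGHTS)

candidates = {
    "A": {"years_experience": 0.8, "skills_test": 0.9, "structured_interview": 0.7},
    "B": {"years_experience": 0.6, "skills_test": 0.7, "structured_interview": 0.9},
}

# Rank every candidate by the same formula, highest score first.
for name, c in sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"Candidate {name}: {score(c):.2f}")
```

Committing to the weights before reviewing any applications is part of the debiasing: the formula cannot be adjusted after the fact to favor a candidate who "feels" right.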
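And to illustrate item 3's opt-out default, here is a hypothetical sketch of an enrollment rule that preserves choice while making saving the path of least resistance. The contribution rate and function name are assumptions for illustration only.

```python
# Hypothetical sketch of a default-enrollment "nudge": new hires are
# enrolled in the retirement plan unless they actively opt out.

DEFAULT_CONTRIBUTION = 0.05  # assumed 5% default contribution rate

def enroll_new_hire(opted_out: bool = False) -> dict:
    """Choice is preserved (anyone can opt out), but the default
    works in the employee's long-term interest."""
    if opted_out:
        return {"enrolled": False, "contribution": 0.0}
    return {"enrolled": True, "contribution": DEFAULT_CONTRIBUTION}

print(enroll_new_hire())                # default path: enrolled at 5%
print(enroll_new_hire(opted_out=True))  # active choice to opt out
```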
Conclusion
Our HR professional community is continually challenged to incorporate emerging theories and practices into our organizations. Reducing the impact of implicit bias in our HR practices is a worthy goal. Drawing from social cognition research, we can incorporate strategies that reduce the likelihood of implicit biases affecting our decisions in our daily lives at work.
References
Ariely, D. (2008). Predictably irrational. New York: Harper Perennial.
Brownstein, M. (2017). Implicit bias. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2017 ed.). https://plato.stanford.edu/archives/spr2017/entries/implicit-bias/
Chugh, D. (2004). Societal and managerial implications of implicit social cognition: Why milliseconds matter. Social Justice Research, 17(2), 203-221.
Greenwald, A. G., & Krieger, L. H. (2006). Implicit bias: Scientific foundations. California Law Review, 94(4), 945-967.
Jost, J. T., Rudman, L. A., Blair, I. V., Carney, D. R., Dasgupta, N., Glaser, J., & Hardin, C. D. (2009). The existence of implicit bias is beyond a reasonable doubt: A refutation of ideological and methodological objections and executive summary of ten studies that no manager should ignore. Research in Organizational Behavior, 29, 39-69.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Kirwan Institute for the Study of Race and Ethnicity, Ohio State University. http://kirwaninstitute.osu.edu/
Project Implicit, Harvard University. https://implicit.harvard.edu/implicit/takeatest.html
Ross, H. (2015, April 15). 3 ways to make less biased decisions. Harvard Business Review.
Rudman, L. (2001). "Unlearning" automatic biases: The malleability of implicit prejudice and stereotypes. Journal of Personality and Social Psychology, 81(5), 856-868.
Soll, J. B., Milkman, K. L., & Payne, J. W. (2016). A user's guide to debiasing. In G. Keren & G. Wu (Eds.), The Wiley Blackwell Handbook of Judgment and Decision Making. London: Wiley-Blackwell.
Thaler, R. H., & Sunstein, C. R. (2009). Nudge. New York: Penguin Books.