Implicit Bias
Our biases and assumptions about others can be so automatic that they produce unintended thoughts that contradict our own beliefs. Even with the best intentions, we all hold some form of bias as a result of socialization and cultural stereotypes. Our implicit biases are essentially bad habits that stem from cultural learning; they are a byproduct of our socialization, not a moral failing. If we are not aware of our biases, those habits can be activated and applied by default, even when they are undesirable and counteract our intentions.
The good news is that, like any bad habit, implicit bias can be broken, though doing so takes time and conscious attention. What differentiates those with lower prejudice is their unwillingness to apply stereotypes to a whole group. During the change process, an individual must “not only inhibit automatically activated information but also intentionally replace such activation with nonprejudiced ideas and responses” (Devine, 1989).
It can be difficult to correct our implicit biases because our assumptions often go unnoticed in everyday life. We don’t often receive feedback that confirms or dispels the assumptions we make about others. Our biases continue to live in our minds unless we unearth and intentionally confront them, asking how we know our assumptions are true.
Why is it important to be aware of implicit bias?
Regardless of how well-intentioned we are as instructors, implicit biases result from automatic thoughts. These can end up negatively impacting students and depriving them of opportunities and learning experiences.
In a study conducted by Moss-Racusin et al. in 2012, faculty members received identical resumes and application materials for a laboratory manager position, with the applicant randomly assigned a male or female student name. The faculty members more frequently judged the female student to be less competent and less hireable, and offered her a smaller starting salary and less career mentoring than the male student. This bias was independent of the faculty member’s gender, scientific discipline, age, and tenure status, which suggested that the implicit bias towards the female student was “likely unintentional, generated from widespread cultural stereotypes rather than a conscious intention to harm women.”
Interestingly, the faculty members actually reported liking the female student more than the male student. However, this did not translate into similarly positive perceptions of her competence. Faculty members of both genders seemed to be affected by cultural stereotypes about women’s lack of competence in science, despite not intending to dislike the female candidate. Even with good intentions, the persistence of such biases toward any group can have detrimental effects.
In a similar 2019 study, Eaton et al. experimentally manipulated the gender and racial identities on the CVs of postdoctoral researcher applicants, which STEM professors in biology and physics then reviewed. In line with the Moss-Racusin 2012 study, they found a gender bias, in which the physics professors favored the male candidates, and a racial bias, in which the physics professors perceived Asian and White candidates as more competent than Black and Latinx candidates. The biology faculty did not exhibit a gender bias, which Eaton et al. theorized might be because biology is a more gender-balanced field than physics. Biology faculty did, however, exhibit a racial bias, rating Asian candidates as more competent than Black candidates. The study also found compounded racial and gender biases: Black and Latina female candidates, as well as Latino male candidates, were rated as less hireable than other candidates.
Eaton et al. noted less bias in evaluating applicants with exceptionally strong records or clear differences in quality. Implicit bias is more likely to play a role in deciding between moderately and equally qualified candidates. The results of these studies exemplify the impact that implicit biases can have if left unchecked, as the faculty members in both studies did not consciously intend to be biased against these groups.
How to mitigate bias in the classroom
We can hypothesize how issues similar to those above can arise in the classroom when evaluating student performance on more subjective tasks (e.g., awarding points for class discussion, open-ended writing assignments or projects) or when assigning student grades at the end of the term when a student is close to a letter grade threshold (A/B, B/C, etc.). Additional areas where implicit bias can show up in the classroom include group work and resource allocation, such as opportunities (e.g., undergraduate research) and your time. To help mitigate the impact of implicit bias in the classroom:
- Pay attention to who you mentor and who participates in class. In class discussions, our biases can lead us to unintentionally respond differently to student comments or to call on certain students more than others. Participation is another place bias shows up: unless there is an objective way to measure who is participating, our memory and biases may give us false accounts of which students contributed the most or least (see the sketch after this list).
- Set criteria in advance. Create rubrics to help reduce bias during grading, and share them with students when the assignments are given. Because grading decisions are then made against predetermined criteria, they are likely to be more objective. Rubrics also give you and your students a shared understanding of what they did well and what they need to improve.
- Structure time for making important decisions. It is difficult to be vigilant about bias when you are stressed or tired. Ensure you are well rested before grading exams or giving feedback to students. We also recommend taking a break in between grading multiple assignments. This will help you resist the temptation to make quick decisions, which is especially important for making more objective decisions that will affect others.
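To make the participation point concrete, here is a minimal sketch in Python of the kind of objective tally that can replace memory-based impressions. The student names and logging format are hypothetical; a paper tally sheet or spreadsheet serves the same purpose.

```python
from collections import Counter

# Hypothetical log: append a student's name each time they contribute.
roster = ["Aisha", "Ben", "Carla", "Deng"]
contributions = ["Ben", "Aisha", "Ben", "Ben", "Carla"]

counts = Counter(contributions)
for student in sorted(roster):
    # Zeros are printed too, so quiet students are not overlooked.
    print(f"{student}: {counts.get(student, 0)} contributions")
```

Comparing such a tally against your own recollection at the end of a class session is a quick way to surface gaps between impression and record.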
Ineffective strategies for reducing bias
- Stereotype suppression. Stereotype suppression involves trying to suppress a stereotype whenever it comes to mind. This strategy is not as effective as it seems because the more you suppress a thought, the more you will think about it. As a result, you may actually be more likely to view others through stereotypes because you are constantly trying to suppress such thoughts.
- Colorblindness. This strategy is the idea of ignoring aspects of another person such as race or gender. Though the intention — to treat someone “normally” — may be good, this is not very effective. It is not actually possible to ignore these characteristics, and many people derive pride from such aspects of their background. Additionally, the more you think about treating someone “normally”, the more you actually act differently towards them. As with stereotype suppression, the more you think about trying to act a certain way, the less effective it turns out to be.
Research-based strategies to reduce bias
Breaking your bias habits is something that has to be practiced over time. Effective, research-based bias reduction strategies include:
- Stereotype replacement. While refraining from judgment, be attentive to patterns manifesting in your thinking. When you encounter an assumption, pause and ask yourself: “How do I know that about the person? Is it from a stereotype I have internalized, or do I have evidence from something actually happening?” By consciously surfacing and questioning your assumptions about others, you are intentionally replacing stereotypes with the individuating information unique to each person.
- Perspective-taking. Consider situational explanations. We tend to assume that an individual’s personal qualities or abilities cause their behavior, and to be less attentive to aspects of the situation that may actually have caused it. For instance, if a student does poorly on a test, it is not necessarily because they are not smart or capable. They may have performed poorly for many reasons: lack of sleep, illness, personal problems, or too many exams within a week, to name a few. If you catch yourself attributing an outcome to an individual’s internal characteristics, check your assumptions: ask how you actually know this, and consider the possibility of a situational explanation.
- Commit to criteria. Before evaluating applicants or grading assignments, as mentioned in the previous section, it is helpful to have the same predetermined criteria and credentials against which to evaluate every student or applicant. Research has found that bias is substantially less prominent when evaluators commit to criteria in advance of doing the evaluation. This allows evaluators to hold each other accountable and leaves less room for the in-the-moment decisions that create openings for implicit bias and unintended automatic thoughts (see the sketch after this list).
- Modify the environment. Evaluate what messages the environment sends about who belongs or succeeds, and seek to increase the representation of underrepresented groups. Increasing opportunities for genuine interaction with members of other groups, whether in the classroom or outside of it, can broaden perspectives and help us recognize individuating information about people.
- Speak up against bias. “Authority” figures can hold a lot of power and act as effective allies, but non-authority figures can also speak up against bias and encourage the community to act supportively. For example, suppose a female colleague proposes an idea at a meeting and it is later attributed to a male colleague who repeats it. To gently help the meeting attendees realize the misattribution, you could affirm, “Right, as Mary proposed earlier, I think that’s a great idea,” or otherwise point out the similarities between the ideas. Tone of voice is another key consideration: convey an intent to understand or clarify rather than to ridicule or accuse, which could prompt defensiveness. Additionally, citing concrete instances rather than making abstract accusations will make others more amenable to the explanations and solutions you offer.
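As a concrete illustration of committing to criteria, the following Python sketch fixes rubric weights before any submission is scored, so the total follows mechanically from the pre-set criteria rather than from in-the-moment judgment. The criterion names and weights are hypothetical.

```python
# Hypothetical rubric, committed to before grading begins.
RUBRIC = {
    "thesis clarity": 20,
    "use of evidence": 30,
    "organization": 25,
    "mechanics": 25,
}

def score_submission(ratings):
    """ratings maps each criterion to the fraction earned (0.0-1.0).
    The total follows mechanically from the pre-set weights, leaving
    no room for in-the-moment adjustments."""
    return sum(weight * ratings[criterion]
               for criterion, weight in RUBRIC.items())

# A submission scored against the same fixed criteria as every other.
print(score_submission({"thesis clarity": 0.9, "use of evidence": 0.8,
                        "organization": 1.0, "mechanics": 0.95}))  # 90.75
```

The point is not the code but the ordering: the weights exist, and are shared with students, before the first submission is read.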
To learn more about research-based strategies to reduce bias, check out the Breaking the Bias Habit learning bundle on Atlas (MIT Touchstone authentication required).
Devine, P. G. (1989). Stereotypes and prejudice: Their automatic and controlled components. Journal of Personality and Social Psychology, 56 (1), 5-18.
Devine, P. G., Forscher, P. S., Austin, A. J., & Cox, W. T. (2012). Long-term reduction in implicit race bias: A prejudice habit-breaking intervention. Journal of Experimental Social Psychology, 48 (6), 1267-1278.
Eaton, A. A., Saunders, J. F., Jacobson, R. K., & West, K. (2019). How Gender and Race Stereotypes Impact the Advancement of Scholars in STEM: Professors’ Biased Evaluations of Physics and Biology Post-Doctoral Candidates. Sex Roles, 82 (3-4), 127-141.
Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the National Academy of Sciences, 109 (41), 16474-16479.
Shaw, Y. & Natisse, K. M. (Host). (2017, June 15). The Culture Inside [Audio podcast episode]. In Invisibilia . NPR. https://www.npr.org/programs/invisibilia/532950995/the-culture-inside.
MIT Teaching and Learning Lab. (2019, Oct 9). Drs. Devine & Cox: Empowering People to Break the Bias Habit [Video]. Available to the MIT community in the Atlas Learning Center: atlas.mit.edu.
Taking steps to recognize and correct unconscious assumptions toward groups can promote health equity.
JENNIFER EDGOOSE, MD, MPH, MICHELLE QUIOGUE, MD, FAAFP, AND KARTIK SIDHAR, MD
Fam Pract Manag. 2019;26(4):29-33
Author disclosures: no relevant financial affiliations disclosed.
Jamie is a 38-year-old woman and the attending physician on a busy inpatient teaching service. On rounds, she notices that several patients tend to direct their questions to the male medical student, seeming to disregard her.

Alex is a 55-year-old black man with a history of diabetic polyneuropathy and significant neuropathic pain. His last A1C was 7.8. He reports worsening lower extremity pain and is frustrated that, despite bringing it up repeatedly to different clinicians, no one has addressed it. Alex has been on gabapentin 100 mg before bed for 18 months without change, and his physicians haven't increased or changed his medication to help with pain relief.
Alisha is a 27-year-old Asian family medicine resident who overhears labor and delivery nurses and the attending complain that Indian women are resistant to cervical exams.
These scenarios reflect the unconscious assumptions that pervade our everyday lives, not only as practicing clinicians but also as private citizens. Some of Jamie's patients assume the male member of the team is the attending physician. Alex's physicians perceive him to be a “drug-seeking” patient and miss opportunities to improve his care. Alisha is exposed to stereotypes about a particular ethnic group.
Although assumptions like these may not be directly ill-intentioned, they can have serious consequences. In medical practice, these unconscious beliefs and stereotypes influence medical decision-making. In the classic Institute of Medicine report “Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care,” the authors concluded that “bias, stereotyping, and clinical uncertainty on the part of health care providers may contribute to racial and ethnic disparities in health care,” often despite providers' best intentions.1 For example, studies show that discrimination and bias at both the individual and institutional levels contribute to shocking disparities for African-American patients, who receive certain procedures less often and experience much higher infant mortality rates than non-Hispanic whites.2,3 As racial and ethnic diversity increases across our nation, it is imperative that we as physicians intentionally confront and find ways to mitigate our biases.
Implicit bias is the unconscious collection of stereotypes and attitudes that we develop toward certain groups of people, which can affect our patient relationships and care decisions.
You can overcome implicit bias by first discovering your blind spots and then actively working to dismiss stereotypes and attitudes that affect your interactions.
While individual action is helpful, organizations and institutions must also work to eliminate systemic problems.
DEFINING AND REDUCING IMPLICIT BIAS
For the last 30 years, science has demonstrated that automatic cognitive processes shape human behavior, beliefs, and attitudes. Implicit or unconscious bias derives from our ability to rapidly find patterns in small bits of information. Some of these patterns emerge from positive or negative attitudes and stereotypes that we develop about certain groups of people and form outside our own consciousness from a very young age. Although such cognitive processes help us efficiently sort and filter our perceptions, these reflexive biases also promote inconsistent decision making and, at worst, systematic errors in judgment.
Cognitive processes lead us to associate unconscious attributes with social identities. The literature explores how this influences our views on race, ethnicity, age, gender, sexual orientation, and weight, and studies show many people are biased in favor of people who are white, young, male, heterosexual, and thin.4 Unconsciously, we not only learn to associate certain attributes with certain social groupings (e.g., men with strength, women with nurturing) but also develop preferential rankings of such groups (e.g., preference for whites over blacks). This unconscious grouping and ranking takes root early in development and is shaped by many outside factors such as media messages, institutional policies, and family beliefs. Studies show that health care professionals have the same level of implicit bias as the general population and that higher levels are associated with lower quality care.5 Providers with higher levels of bias are more likely to demonstrate unequal treatment recommendations, disparities in pain management, and even lack of empathy toward minority patients.6 In addition, stressful, time-pressured, and overloaded clinical practices can actually exacerbate unconscious negative attitudes. Although the potential impact of our biases can feel overwhelming, research demonstrates that these biases are malleable and can be overcome by conscious mitigation strategies.7
We recommend three overarching strategies to mitigate implicit bias – educate, expose, and approach – which we will discuss in greater detail. We have further broken down these strategies into eight evidence-based tactics you can incorporate into any quality improvement project, diagnostic dilemma, or new patient encounter. Together, these eight tactics spell out the mnemonic IMPLICIT. (See “Strategies to combat our implicit biases.”)
Strategies to combat our implicit biases

| Tactic | Description |
|---|---|
| Introspection | Explore and identify your own implicit biases by taking implicit association tests or through other means. |
| Mindfulness | Practice ways to reduce stress and increase mindfulness, such as meditation, yoga, or focused breathing. |
| Perspective-taking | Consider experiences from the point of view of the person being stereotyped. This can involve consuming media about those experiences, such as books or videos, and directly interacting with people from that group. |
| Learn to slow down | Pause and reflect on your potential biases before interacting with people of certain groups to reduce reflexive reactions. This could include thinking about positive examples of that stereotyped group, such as celebrities or personal friends. |
| Individuation | Evaluate people based on their personal characteristics rather than those affiliated with their group. This could include connecting over shared interests or backgrounds. |
| Check your messaging | Embrace evidence-based statements that reduce implicit bias, such as welcoming and embracing multiculturalism. |
| Institutionalize fairness | Promote procedural change at the organizational level that moves toward a socially accountable health care system with the goal of health equity. |
| Take two | Practice cultural humility, a lifelong process of critical self-reflection to readdress the power imbalances of the clinician-patient relationship. |
When we fail to learn about our blind spots, we miss opportunities to avoid harm. Educating ourselves about the reflexive cognitive processes that unconsciously affect our clinical decisions is the first step. The following tactics can help:
Introspection. It is not enough to just acknowledge that implicit bias exists. As clinicians, we must directly confront and explore our own personal implicit biases. As the writer Anaïs Nin is often credited with saying, “We don't see things as they are, we see them as we are.” To shed light on your potential blind spots and unconscious “sorting protocols,” we encourage you to take one or more implicit association tests. Discovering a moderate to strong bias in favor of or against certain social identities can help you begin this critical step in self-exploration and understanding.8 You can also complete this activity with your clinic staff and fellow physicians to uncover implicit biases as a group and set the stage for addressing them. For instance, many of us may be surprised to learn after taking an implicit association test that we follow the typical bias of associating males with science — an awareness that may explain why the patients in our first case example addressed questions to the male medical student instead of the female attending.
Mindfulness. It should come as no surprise that we are more likely to use cognitive shortcuts inappropriately when we are under pressure. Evidence suggests that increasing mindfulness improves our coping ability and modifies biological reactions that influence attention, emotional regulation, and habit formation.9 There are many ways to increase mindfulness, including meditation, yoga, or listening to inspirational texts. In one study, individuals who listened to a 10-minute meditative audiotape that focused their attention on their sensations and thoughts in a nonjudgmental way relied less on instinct and showed less implicit bias against black people and the aged.10
It is also helpful to expose ourselves to counter-stereotypes and to focus on the unique individuals we interact with. Similarity bias is the tendency to favor ourselves and those like us. When our brains label someone as being within our same group, we empathize better and use our actions, words, and body language to signal this relatedness. Experience bias can lead us to overestimate how much others see things the same way we do, to believe that we are less vulnerable to bias than others, and to assume that our intentions are clear and obvious to others. Gaining exposure to other groups and ways of thinking can mitigate both of these types of bias. The following tactics can help:
Perspective-taking. This tactic involves taking the first-person perspective of a member of a stereotyped group, which can increase psychological closeness to that group.8 Reading novels, watching documentaries, and listening to podcasts are accessible ways to reach beyond our comfort zone. To authentically perceive another person's perspective, however, you should engage in positive interactions with stereotyped group members in real life. Increased face-to-face contact with people who seem different from you on the surface undermines implicit bias.
Learn to slow down. To recognize our reflexive biases, we must pause and think. For example, the next time you interact with someone in a stereotyped group or observe societal stereotyping, such as through the media, recognize what responses are based on stereotypes, label those responses as stereotypical, and reflect on why the responses occurred. You might then consider how the biased response could be avoided in the future and replace it with an unbiased response. The physician treating Alex in the introduction could use this technique by slowing down and reassessing his medical care. By acknowledging the potential for bias, the physician may recognize that safe options remain for managing Alex's neuropathic pain.
Additionally, research strongly supports the use of counter-stereotypic imaging to replace automatic responses.11 For example, when seeking to contradict a prevailing stereotype, substitute highly defined images, which can be abstract (e.g., modern Native Americans), famous (e.g., minority celebrities like Oprah Winfrey or Lin-Manuel Miranda), or personal (e.g., your child's teacher). As positive exemplars become more salient in your mind, they become cognitively accessible and challenge your stereotypic biases.
Individuation. This tactic relies on gathering specific information about the person interacting with you to prevent group-based stereotypic inferences. Family physicians are trained to build and maintain relationships with each individual patient under their care. Our own social identities intersect with multiple social groupings, for example, related to sexual orientation, ethnicity, and gender. Within these multiplicities, we can find shared identities that bring us closer to people, including shared experiences (e.g., parenting), common interests (e.g., sports teams), or mutual purpose (e.g., surviving cancer). Individuation could have helped the health care workers in Alisha's labor and delivery unit to avoid making judgments based on stereotypes. We can use this tactic to help inform clinical decisions by using what we know about a person's specific, individual, and unique attributes.11
Like any habit, it is difficult to change biased behaviors with a “one shot” educational approach or awareness campaign. Taking a systematic approach at both the individual and institutional levels, and incorporating a continuous process of improvement, practice, and reflection, is critical to improving health equity.
Check your messaging. Using very specific messages designed to create a more inclusive environment and mitigate implicit bias can make a real difference. As opposed to claiming “we don't see color” or using other colorblind messaging, statements that welcome and embrace multiculturalism can have more success at decreasing racial bias.
Institutionalize fairness. Organizations have a responsibility to support a culture of diversity and inclusion because individual action is not enough to deconstruct systemic inequities. To overcome implicit bias throughout an organization, consider implementing an equity lens – a checklist that helps you consider your blind spots and biases and assures that great ideas and interventions are not only effective but also equitable (an example is included in the table above). Another example would be to find opportunities to display images in your clinic's waiting room that counter stereotypes. You could also survey your institution to make sure it is embracing multicultural (and not colorblind) messaging.
Take two. Resisting implicit bias is lifelong work. The strategies introduced here require constant revision and reflection as you work toward cultural humility. Examining your own assumptions is just a starting point. Talking about implicit bias can trigger conflict, doubt, fear, and defensiveness. It can feel threatening to acknowledge that you participate in and benefit from systems that work better for some than others. This kind of work can mean taking a close look at the relationships you have and the institutions of which you are a part.
MOVING FORWARD
Education, exposure, and a systematic approach to understanding implicit bias may bring us closer to our aspirational goal to care for all our patients in the best possible way and move us toward a path of achieving health equity throughout the communities we serve. The mnemonic IMPLICIT can help us to remember the eight tactics we all need to practice. While disparities in social determinants of health are often beyond the control of an individual physician, we can still lead the fight for health equity for our own patients, both from within and outside the walls of health care. With our specialty-defining goal of getting to know each patient as a unique individual in the context of his or her community, family physicians are well suited to lead inclusively by being humble, respecting the dignity of each person, and expressing appreciation for how hard everyone works to overcome bias.
1. Smedley BD, Stith AY, Nelson AR, eds. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC: Institute of Medicine, National Academy Press; 2003.
2. Hannan EL, van Ryn M, Burke J, et al. Access to coronary artery bypass surgery by race/ethnicity and gender among patients who are appropriate for surgery. Med Care. 1999;37(1):68-77.
3. Infant mortality and African Americans. U.S. Department of Health and Human Services Office of Minority Health website. https://minorityhealth.hhs.gov/omh/browse.aspx?lvl=4&lvlid=23. Updated Nov. 9, 2017. Accessed June 10, 2019.
4. Nosek BA, Smyth FL, Hansen JJ, et al. Pervasiveness and correlates of implicit attitudes and stereotypes. Eur Rev Soc Psychol. 2007;18(1):36-88.
5. FitzGerald C, Hurst S. Implicit bias in healthcare professionals: a systematic review. BMC Med Ethics. 2017;18(1):19.
6. Maina IW, Belton TD, Ginzberg S, Singh A, Johnson TJ. A decade of studying implicit racial/ethnic bias in healthcare providers using the implicit association test. Soc Sci Med. 2018;199:219-229.
7. Charlesworth TES, Banaji MR. Patterns of implicit and explicit attitudes: I. Long-term change and stability from 2007 to 2016. Psychol Sci. 2019;30(2):174-192.
8. Sukhera J, Wodzinski M, Teunissen PW, Lingard L, Watling C. Striving while accepting: exploring the relationship between identity and implicit bias recognition and management. Acad Med. 2018;93(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 57th Annual Research in Medical Education Sessions):S82-S88.
9. Burgess DJ, Beach MC, Saha S. Mindfulness practice: a promising approach to reducing the effects of clinician implicit bias on patients. Patient Educ Couns. 2017;100(2):372-376.
10. Lueke A, Gibson B. Mindfulness meditation reduces implicit age and race bias: the role of reduced automaticity of responding. Soc Psychol Personal Sci. 2015;6(3):284-291.
11. Devine PG, Forscher PS, Austin AJ, Cox WTL. Long-term reduction in implicit race bias: a prejudice habit-breaking intervention. J Exp Soc Psychol. 2012;48(6):1267-1278.
I’m Not Biased, Am I? Understanding Implicit Bias
Bias is a universal human condition. It is not a personal defect, but it is important to recognize your biases and manage them. We cannot cure unconscious bias, but we can address it. This lesson will provide you with the opportunity to identify your personal biases. You have them, even if you think you don’t! You are encouraged to try this lesson so you can become more aware of your personal biases and take the necessary steps to reduce their impact on your life.
CC.8.5.11-12.G Integrate and evaluate multiple sources of information presented in diverse formats and media (e.g., visually, quantitatively, as well as in words) in order to address a question or solve a problem.
Introductory warm-up activity.
Click on the link below to participate in a brief study conducted by Harvard University. You will be asked to agree to the terms. Select a test that you are interested in completing. You can choose between approximately 20 tests (gender, sexual orientation, age, weight, etc.). The purpose of the Implicit Association Test is for you to discover if you have an automatic preference for a particular group over another.
Explore these resources to learn about implicit bias. You can pick and choose what to read and watch, then do the activity listed.
Read the article to gain a better understanding of unconscious bias and how you can make an effort to prevent your biases from affecting your decisions. For a more detailed look at the types of cognitive biases, read the linked article; for a more simplified chart of the types of cognitive bias, see the linked chart.
Watch this brief video to gain a better understanding of unconscious bias and how you can make an effort to prevent your biases from affecting your decisions. Google created a short video to provide insight on the importance of recognizing your personal biases and limiting their effects. For a more detailed look at the types of cognitive biases, watch the linked video.
Head over to the linked site to practice flashcards, matching, and other activities to help enhance your understanding of the types of cognitive biases.
Discuss your ideas / opinions / understandings.
Implicit (or unconscious) bias refers to bias that we are not aware of. It happens automatically when our brain makes quick judgments about people based on our cultural background and experiences.
Directions:
Take the Snap Judgment quiz to discover your unconscious bias.
Discussion questions:
Which of the people in the Snap Judgment quiz do you think you were quickest to judge? What made you so quick to judge?
Now it is time to self-check how much you have learned about bias. If you do not know as much as you thought, go back to the “Explore” section of this seminar and reread, rewatch, or redo the activities listed. See your facilitator if you have questions.
Click here to take the quiz online. You do not have to log into the quiz site in order to take this quiz. If a window pops up asking you to sign up for the quiz site, just close the sign-up window and start your quiz.
This is a task or project where you can show what you know.
As human beings, we tend to share common cultural traits with the people we trust most. Complete the Trusted 10 activity to outline the ten people that you trust most. Try to avoid listing family members. If you can’t think of ten people that you trust, simply list as many as you can. Before you type on the template, be sure to make a copy of the document. Only type on your copy. When you have finished the activity and responded to the questions, submit your answers for review. Your responses to the Trusted 10 activity will be scored using this rubric .
Complete this wrap-up activity where you reflect on your learning.
Did you realize that you were guilty of being biased before this lesson? Have you ever felt like a victim of bias? How can you be an ally to students at school or families in the community who experience bias?
Implicit Bias (Unconscious Bias): Definition & Examples
Charlotte Ruhl
Research Assistant & Psychology Graduate
BA (Hons) Psychology, Harvard University
Charlotte Ruhl, a psychology graduate from Harvard College, boasts over six years of research experience in clinical and social psychology. During her tenure at Harvard, she contributed to the Decision Science Lab, administering numerous studies in behavioral economics and social psychology.
Saul McLeod, PhD
Editor-in-Chief for Simply Psychology
BSc (Hons) Psychology, MRes, PhD, University of Manchester
Saul McLeod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.
Implicit bias refers to the beliefs and attitudes that affect our understanding, actions and decisions in an unconscious way.
Take-home Messages
- Implicit biases are unconscious attitudes and stereotypes that can manifest in the criminal justice system, workplace, school setting, and in the healthcare system.
- Implicit bias is also known as unconscious bias or implicit social cognition.
- There are many different examples of implicit biases, ranging from categories of race, gender, and sexuality.
- These biases often arise from trying to find patterns and navigate the overwhelming stimuli in this complicated world. Culture, media, and upbringing can also contribute to the development of such biases.
- Removing these biases is a challenge, especially because we often don’t even know they exist, but research reveals potential interventions and provides hope that levels of implicit biases in the United States are decreasing.
The term implicit bias was first coined in 1995 by psychologists Mahzarin Banaji and Anthony Greenwald, who argued that social behavior is largely influenced by unconscious associations and judgments (Greenwald & Banaji, 1995).
So, what is implicit bias?
Specifically, implicit bias refers to attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious way, making them difficult to control.
Since the mid-90s, psychologists have extensively researched implicit biases, revealing that, without even knowing it, we all possess our own implicit biases.
System 1 and System 2 Thinking
Kahneman (2011) distinguishes between two types of thinking: system 1 and system 2.
- System 1 is the brain’s fast, emotional, unconscious thinking mode. This type of thinking requires little effort, but it is often error-prone. Most everyday activities (like driving, talking, cleaning, etc.) rely heavily on System 1.
- System 2 is slow, logical, effortful, conscious thought, where reason dominates.
Implicit Bias vs. Explicit Bias
| Implicit Bias | Explicit Bias |
|---|---|
| Unconscious attitudes or stereotypes that affect our understanding, actions, and decisions. | Conscious beliefs and attitudes about a person or group. |
| Can influence decisions and behavior subconsciously. | Usually apparent in a person’s language and behavior. |
| A hiring manager unknowingly favors candidates who went to the same university as them. | A person making a conscious decision not to hire someone based on their ethnicity. |
| Can lead to unintentional discrimination and bias in many areas like hiring, law enforcement, healthcare, etc. | Can lead to intentional, overt discrimination. |
| Measured using implicit association tests and other indirect methods. | Can be assessed directly through surveys, interviews, etc. |
| Very common, as everyone holds unconscious biases to some degree. | Less common, as societal norms have shifted to view explicit bias as unacceptable. |
| Improve self-awareness, undergo bias training, diversify your experiences and interactions. | Education, awareness, promoting inclusivity and diversity. |
What is meant by implicit bias?
Implicit bias (unconscious bias) refers to attitudes and beliefs outside our conscious awareness and control. Implicit biases are an example of system 1 thinking, so we are unaware they exist (Greenwald & Krieger, 2006).
An implicit bias may counter a person’s conscious beliefs without realizing it. For example, it is possible to express explicit liking of a certain social group or approval of a certain action while simultaneously being biased against that group or action on an unconscious level.
Therefore, implicit and explicit biases might differ for the same person.
It is important to understand that implicit biases can become explicit biases. This occurs when you become consciously aware of your prejudices and beliefs. They surface in your mind, leading you to choose whether to act on or against them.
What is meant by explicit bias?
Explicit biases are biases we are aware of on a conscious level (for example, feeling threatened by another group and delivering hate speech as a result). They are an example of system 2 thinking.
It is also possible that your implicit and explicit biases differ from those of your neighbor, friend, or family member. Many factors can shape how such biases develop.
What Are the Implications of Unconscious Bias?
Implicit biases become evident in many different domains of society. On an interpersonal level, they can manifest in simple daily interactions.
This occurs when certain actions (or microaggressions) make others feel uncomfortable or aware of the specific prejudices you may hold against them.
Implicit Prejudice
Implicit prejudice is the automatic, unconscious attitudes or stereotypes that influence our understanding, actions, and decisions. Unlike explicit prejudice, which is consciously controlled, implicit prejudice can occur even in individuals who consciously reject prejudice and strive for impartiality.
Unconscious racial stereotypes are a major example of implicit prejudice. In other words, having an automatic preference for one race over another without being aware of this bias.
This bias can manifest in small interpersonal interactions and has broader implications in society’s legal system and many other important sectors.
Examples may include holding an implicit stereotype that associates Black individuals with violence. As a result, you may cross the street at night when you see a Black man walking in your direction without even realizing why you are crossing the street.
The action taken here is an example of a microaggression. A microaggression is a subtle, automatic, and often nonverbal exchange that communicates hostile, derogatory, or negative prejudicial slights and insults toward any group (Pierce, 1970). Crossing the street communicates an implicit prejudice, even though you might not even be aware of it.
Another example of an implicit racial bias is if a Latino student is complimented by a teacher for speaking perfect English, but he is a native English speaker. Here, the teacher assumed that English would not be his first language simply because he is Latino.
Gender Stereotypes
Gender biases are another common form of implicit bias. Gender biases are the ways in which we judge men and women based on traditional feminine and masculine assigned traits.
For example, a greater assignment of fame to male than female names (Banaji & Greenwald, 1995) reveals a subconscious bias that holds men at a higher level than their female counterparts. Whether you voice the opinion that men are more famous than women is independent of this implicit gender bias.
Another common implicit gender bias regards women in STEM (science, technology, engineering, and mathematics).
In school, girls are more likely to be associated with language than math, while boys are more likely to be associated with math than language (Steffens & Jelenec, 2011), revealing clear gender-related implicit biases that can ultimately go so far as to dictate future career paths.
Even if you outwardly say men and women are equally good at math, it is possible you subconsciously associate math more strongly with men without even being aware of this association.
Health Care
Healthcare is another setting where implicit biases are very present. Racial and ethnic minorities and women are subject to less accurate diagnoses, curtailed treatment options, less pain management, and worse clinical outcomes (Chapman, Kaatz, & Carnes, 2013).
Additionally, Black children are often not treated as children or given the same compassion or level of care provided for White children (Johnson et al., 2017).
It becomes evident that implicit biases infiltrate the most common sectors of society, making it all the more important to question how we can remove these biases.
LGBTQ+ Community Bias
Similar to implicit racial and gender biases, individuals may hold implicit biases against members of the LGBTQ+ community. Again, that does not necessarily mean that these opinions are voiced outwardly or even consciously recognized by the beholder, for that matter.
Rather, these biases are unconscious. A really simple example could be asking a female friend if she has a boyfriend, assuming her sexuality and that heterosexuality is the norm or default.
Instead, you could ask your friend if she is seeing someone in this specific situation. Several other forms of implicit biases fall into categories ranging from weight to ethnicity to ability that come into play in our everyday lives.
Legal System
Both law enforcement and the legal system shed light on implicit biases. An example of implicit bias functioning in law enforcement is the shooter bias – the tendency among the police to shoot Black civilians more often than White civilians, even when they are unarmed (Mekawi & Bresin, 2015).
This bias has been repeatedly tested in the laboratory setting, revealing an implicit bias against Black individuals. Black people are also disproportionately arrested and given harsher sentences, and Black juveniles are tried as adults more often than their White peers.

Black boys are also seen as less childlike, less innocent, more culpable, more responsible for their actions, and as more appropriate targets for police violence (Goff et al., 2014).
Together, these unconscious stereotypes, which are not rooted in truth, form an array of implicit biases that are extremely dangerous and utterly unjust.
Implicit biases are also visible in the workplace. One experiment that tracked the success of White and Black job applicants found that applicants with stereotypically White names received 50% more callbacks than those with stereotypically Black names, regardless of the industry or occupation (Bertrand & Mullainathan, 2004).
This reveals another form of implicit bias: the hiring bias – Anglicized‐named applicants receiving more favorable pre‐interview impressions than other ethnic‐named applicants (Watson, Appiah, & Thornton, 2011).
We’re susceptible to bias because of these tendencies:
We tend to seek out patterns
A key reason we develop such biases is that our brains have a natural tendency to look for patterns and associations to make sense of a very complicated world.
Research shows that even before kindergarten, children already use their group membership (e.g., racial group, gender group, age group, etc.) to guide inferences about psychological and behavioral traits.
At such a young age, they have already begun seeking patterns and recognizing what distinguishes them from other groups (Baron, Dunham, Banaji, & Carey, 2014).
And not only do children recognize what sets them apart from other groups, they believe “what is similar to me is good, and what is different from me is bad” (Cameron, Alvarez, Ruble, & Fuligni, 2001).
Children aren’t just noticing how similar or dissimilar they are to others; dissimilar people are actively disliked (Aboud, 1988).
Recognizing what sets you apart from others and then forming negative opinions about those outgroups (a social group with which an individual does not identify) contributes to the development of implicit biases.
We like to take shortcuts
Another explanation is that the development of these biases is a result of the brain’s tendency to try to simplify the world.
Mental shortcuts make it faster and easier for the brain to sort through all of the overwhelming data and stimuli we are met with every second of the day. And we take mental shortcuts all the time. Rules of thumb, educated guesses, and using “common sense” are all forms of mental shortcuts.
Implicit bias is a result of taking one of these cognitive shortcuts inaccurately (Rynders, 2019). As a result, we incorrectly rely on these unconscious stereotypes to provide guidance in a very complex world.
And especially when we are under high levels of stress, we are more likely to rely on these biases than to examine all of the relevant, surrounding information (Wigboldus, Sherman, Franzese, & Knippenberg, 2004).
Social and Cultural influences
Influences from media, culture, and your individual upbringing can also contribute to the rise of implicit associations that people form about the members of social outgroups. Media has become increasingly accessible, and while that has many benefits, it can also lead to implicit biases.
The way TV portrays individuals or the language journal articles use can ingrain specific biases in our minds.
For example, they can lead us to associate Black people with criminals or females as nurses or teachers. The way you are raised can also play a huge role. One research study found that parental racial attitudes can influence children’s implicit prejudice (Sinclair, Dunn, & Lowery, 2005).
And parents are not the only figures who can influence such attitudes. Siblings, the school setting, and the culture in which you grow up can also shape your explicit beliefs and implicit biases.
Implicit Association Test (IAT)
What sets implicit biases apart from other forms is that they are subconscious – we don’t know if we have them.
However, researchers have developed the Implicit Association Test (IAT) tool to help reveal such biases.
The Implicit Association Test (IAT) is a psychological assessment that measures an individual’s unconscious biases and associations. The test measures how quickly a person associates concepts or groups (such as race or gender) with positive or negative attributes, revealing biases that may not be consciously acknowledged.
The IAT requires participants to categorize negative and positive words together with either images or words (Greenwald, McGhee, & Schwartz, 1998).
Tests are taken online and must be performed as quickly as possible; the faster you sort certain words or faces of a category together, the stronger the association you hold between them.
For example, the Race IAT requires participants to categorize White faces and Black faces and negative and positive words. The relative speed of association of black faces with negative words is used as an indication of the level of anti-black bias.
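To illustrate the scoring logic, here is a simplified sketch in Python of the IAT’s difference-score idea: slower average responses in the “incongruent” pairing block, relative to the “congruent” block and scaled by overall variability, indicate a stronger implicit association. This is a simplification of the scoring approach described by Greenwald and colleagues, not Project Implicit’s exact algorithm, and the reaction times below are invented for illustration.

```python
from statistics import mean, stdev

def iat_d_score(congruent_rts, incongruent_rts):
    """Simplified IAT effect: the difference between mean reaction
    times (ms) in the incongruent and congruent blocks, divided by
    the standard deviation of all latencies. A larger positive score
    means a stronger implicit association in the congruent direction."""
    pooled_sd = stdev(congruent_rts + incongruent_rts)
    return (mean(incongruent_rts) - mean(congruent_rts)) / pooled_sd

# Invented reaction times for one participant, in milliseconds.
congruent = [620, 580, 655, 601, 590]      # e.g., "White + good" pairings
incongruent = [710, 742, 698, 755, 721]    # e.g., "Black + good" pairings
print(round(iat_d_score(congruent, incongruent), 2))
```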
Professor Brian Nosek and colleagues tested more than 700,000 subjects. They found that more than 70% of White subjects more easily associated White faces with positive words and Black faces with negative words, concluding that this was evidence of implicit racial bias (Nosek, Greenwald, & Banaji, 2007).
Outside of lab testing, it is very difficult to know if we do, in fact, possess these biases. The fact that they are so hard to detect is in the very nature of this form of bias, making them very dangerous in various real-world settings.
How to Reduce Implicit Bias
Because of the harmful nature of implicit biases, it is critical to examine how we can begin to remove them.
Practicing mindfulness is one potential way, as it reduces the stress and cognitive load that otherwise leads to relying on such biases.
A 2016 study found that brief meditation decreased unconscious bias against black people and elderly people (Lueke & Gibson, 2016), providing initial insight into the usefulness of this approach and paving the way for future research on this intervention.
Adjust your perspective
Another method is perspective-taking – looking beyond your own point of view so that you can consider how someone else may think or feel about something.
Researcher Belinda Gutierrez implemented a videogame called “Fair Play,” in which players assume the role of a Black graduate student named Jamal Davis.
As Jamal, players experience subtle race bias while completing “quests” to obtain a science degree.
Gutierrez hypothesized that participants randomly assigned to play the game would show greater empathy for Jamal and lower implicit race bias than participants randomized to read a narrative text (not perspective-taking) describing Jamal’s experience (Gutierrez, 2014). Her hypothesis was supported, illustrating the benefits of perspective-taking in increasing empathy towards outgroup members.
Specific implicit bias training has been incorporated in different educational and law enforcement settings. For example, research has found that diversity training improved men’s attitudes toward women in STEM (Jackson, Hillard, & Schneider, 2014).
Training programs designed to target and help overcome implicit biases may also be beneficial for police officers (Plant & Peruche, 2005), but there is not enough conclusive evidence to completely support this claim. One pitfall of such training is a potential rebound effect.
Actively trying to inhibit stereotyping actually results in the bias eventually increasing more than if it had not been suppressed in the first place (Macrae, Bodenhausen, Milne, & Jetten, 1994). This is very similar to the white bear problem that is discussed in many psychology curricula.
This concept refers to the psychological process whereby deliberate attempts to suppress certain thoughts make them more likely to surface (Wegner & Schneider, 2003).
Education is crucial. Understanding what implicit biases are, how they arise, and how to recognize them in yourself and others is incredibly important in working towards overcoming such biases.
Learning about other cultures or outgroups and what language and behaviors may come off as offensive is critical as well. Education is a powerful tool that can extend beyond the classroom through books, media, and conversations.
On the bright side, implicit biases in the United States have been improving.
From 2007 to 2016, implicit biases have changed towards neutrality for sexual orientation, race, and skin-tone attitudes (Charlesworth & Banaji, 2019), demonstrating that it is possible to overcome these biases.
Books for further reading
As mentioned, education is extremely important. Here are a few places to get started in learning more about implicit biases:
- Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do by Jennifer Eberhardt
- Blindspot by Anthony Greenwald and Mahzarin Banaji
- Implicit Racial Bias Across the Law by Justin Levinson and Robert Smith
Keywords and Terminology
To find materials on implicit bias and related topics, search databases and other tools using the following keywords:
- “implicit bias”
- “unconscious bias”
- “hidden bias”
- “cognitive bias”
- “implicit association”
- “implicit gender bias”
- “implicit prejudices”
- “implicit racial bias”
- “Implicit Association Test” or IAT
- “implicit social cognition”
- bias
- prejudices
- “prejudice psychological aspects”
- stereotypes
Is unconscious bias the same as implicit bias?
Yes, unconscious bias is the same as implicit bias. Both terms refer to the biases we carry without awareness or conscious control, which can affect our attitudes and actions toward others.
In what ways can implicit bias impact our interactions with others?
Implicit bias can impact our interactions with others by unconsciously influencing our attitudes, behaviors, and decisions. This can lead to stereotyping, prejudice, and discrimination, even when we consciously believe in equality and fairness.
It can affect various domains of life, including workplace dynamics, healthcare provision, law enforcement, and everyday social interactions.
What are some implicit bias examples?
Some examples of implicit biases include assuming a woman is less competent than a man in a leadership role, associating certain ethnicities with criminal behavior, or believing that older people are not technologically savvy.
Other examples include perceiving individuals with disabilities as less capable or assuming that someone who is overweight is lazy or unmotivated.
Aboud, F. E. (1988). Children and prejudice . B. Blackwell.
Banaji, M. R., & Greenwald, A. G. (1995). Implicit gender stereotyping in judgments of fame. Journal of Personality and Social Psychology , 68 (2), 181.
Baron, A. S., Dunham, Y., Banaji, M., & Carey, S. (2014). Constraints on the acquisition of social category concepts. Journal of Cognition and Development , 15 (2), 238-268.
Bertrand, M., & Mullainathan, S. (2004). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. American economic review , 94 (4), 991-1013.
Cameron, J. A., Alvarez, J. M., Ruble, D. N., & Fuligni, A. J. (2001). Children’s lay theories about ingroups and outgroups: Reconceptualizing research on prejudice. Personality and Social Psychology Review , 5 (2), 118-128.
Chapman, E. N., Kaatz, A., & Carnes, M. (2013). Physicians and implicit bias: how doctors may unwittingly perpetuate health care disparities. Journal of general internal medicine , 28 (11), 1504-1510.
Charlesworth, T. E., & Banaji, M. R. (2019). Patterns of implicit and explicit attitudes: I. Long-term change and stability from 2007 to 2016. Psychological science , 30(2), 174-192.
Goff, P. A., Jackson, M. C., Di Leone, B. A. L., Culotta, C. M., & DiTomasso, N. A. (2014). The essence of innocence: Consequences of dehumanizing Black children. Journal of Personality and Social Psychology, 106 (4), 526.
Greenwald, A. G., & Banaji, M. R. (1995). Implicit social cognition: attitudes, self-esteem, and stereotypes. Psychological review, 102(1), 4.
Greenwald, A. G., McGhee, D. E., & Schwartz, J. L. (1998). Measuring individual differences in implicit cognition: the implicit association test. Journal of personality and social psychology , 74(6), 1464.
Greenwald, A. G., & Krieger, L. H. (2006). Implicit bias: Scientific foundations. California Law Review , 94 (4), 945-967.
Gutierrez, B., Kaatz, A., Chu, S., Ramirez, D., Samson-Samuel, C., & Carnes, M. (2014). “Fair Play”: a videogame designed to address implicit race bias through active perspective taking. Games for health journal , 3 (6), 371-378.
Jackson, S. M., Hillard, A. L., & Schneider, T. R. (2014). Using implicit bias training to improve attitudes toward women in STEM. Social Psychology of Education , 17 (3), 419-438.
Johnson, T. J., Winger, D. G., Hickey, R. W., Switzer, G. E., Miller, E., Nguyen, M. B., … & Hausmann, L. R. (2017). Comparison of physician implicit racial bias toward adults versus children. Academic pediatrics , 17 (2), 120-126.
Kahneman, D. (2011). Thinking, fast and slow . Macmillan.
Lueke, A., & Gibson, B. (2016). Brief mindfulness meditation reduces discrimination. Psychology of Consciousness: Theory, Research, and Practice , 3 (1), 34.
Macrae, C. N., Bodenhausen, G. V., Milne, A. B., & Jetten, J. (1994). Out of mind but back in sight: Stereotypes on the rebound. Journal of personality and social psychology , 67 (5), 808.
Mekawi, Y., & Bresin, K. (2015). Is the evidence from racial bias shooting task studies a smoking gun? Results from a meta-analysis. Journal of Experimental Social Psychology , 61 , 120-130.
Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic processes in social thinking and behavior , 4 , 265-292.
Pierce, C. (1970). Offensive mechanisms. The black seventies , 265-282.
Plant, E. A., & Peruche, B. M. (2005). The consequences of race for police officers’ responses to criminal suspects. Psychological Science , 16 (3), 180-183.
Rynders, D. (2019). Battling Implicit Bias in the IDEA to Advocate for African American Students with Disabilities. Touro L. Rev. , 35 , 461.
Sinclair, S., Dunn, E., & Lowery, B. (2005). The relationship between parental racial attitudes and children’s implicit prejudice. Journal of Experimental Social Psychology , 41 (3), 283-289.
Steffens, M. C., & Jelenec, P. (2011). Separating implicit gender stereotypes regarding math and language: Implicit ability stereotypes are self-serving for boys and men, but not for girls and women. Sex Roles , 64(5-6), 324-335.
Watson, S., Appiah, O., & Thornton, C. G. (2011). The effect of name on pre‐interview impressions and occupational stereotypes: the case of black sales job applicants. Journal of Applied Social Psychology , 41 (10), 2405-2420.
Wegner, D. M., & Schneider, D. J. (2003). The white bear story. Psychological Inquiry , 14 (3-4), 326-329.
Wigboldus, D. H., Sherman, J. W., Franzese, H. L., & Knippenberg, A. V. (2004). Capacity and comprehension: Spontaneous stereotyping under cognitive load. Social Cognition , 22 (3), 292-309.
Further Information
Test yourself for bias.
- Project Implicit (IAT Test), from Harvard University
- Implicit Association Test, from the Social Psychology Network
- Test Yourself for Hidden Bias, from Teaching Tolerance
- How the Concept of Implicit Bias Came Into Being, with Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People (5:28 minutes; includes a transcript)
- Understanding Your Racial Biases, with John Dovidio, PhD, Yale University, from the American Psychological Association (11:09 minutes; includes a transcript)
- Talking Implicit Bias in Policing, with Jack Glaser, Goldman School of Public Policy, University of California, Berkeley (21:59 minutes)
- Implicit Bias: A Factor in Health Communication, with Dr. Winston Wong, Kaiser Permanente (19:58 minutes)
- Bias, Black Lives and Academic Medicine, Dr. David Ansell on Your Health Radio, August 1, 2015 (21:42 minutes)
- Uncovering Hidden Biases, Google talk with Dr. Mahzarin Banaji, Harvard University
- Impact of Implicit Bias on the Justice System (9:14 minutes)
- Students Speak Up: What Bias Means to Them (2:17 minutes)
- Weight Bias in Health Care, from Yale University (16:56 minutes)
- Gender and Racial Bias in Facial Recognition Technology (4:43 minutes)
Journal Articles
- Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27–59.
- Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265-292.
- Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105(12), e60-e76.
- Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22(6), 882-887.
- Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49(4), 210–227.
- Christian, S. (2013). Cognitive Biases and Errors as Cause—and Journalistic Best Practices as Effect. Journal of Mass Media Ethics, 28(3), 160–174.
- Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122(2), 670–688.
The Harriet W. Sheridan Center for Teaching and Learning
Strategies and resources about implicit bias.
- Teaching Resources
- Inclusive and Anti-Racist Teaching
- Inclusive Teaching
What is an implicit bias?
Automatic preference… [for non-historically underrepresented groups that] predicts discriminatory behavior even among…[those] who earnestly (and, we believe, honestly) espouse egalitarian beliefs.
What are the effects of implicit bias?
Research suggests that implicit bias shapes both instructor-student and student-student interactions in the classroom, with outcomes such as:
- Influencing students’ course performance and desire to pursue a career in the discipline (Kiefer & Sekaquaptewa, 2007).
- Influencing instructor non-verbal behaviors (e.g., eye contact) to favor white students (Dovidio, Kawakami, & Gaertner, 2002).
- Male students underestimating the academic performance of female students, even when controlling for course grade and participation in the class (Grunspan et al., 2016). Such dynamics could influence female students' peer-assessment grades or their sense of belonging in the discipline. A negative student-student climate is also a strong predictor of student absenteeism (Wolbring & Treischl, 2016).
How can the expression of implicit bias be mitigated in the classroom?
- Take steps to make implicit biases explicit so they can be intentionally addressed. For example, instructors can take an Implicit Association Test (IAT, see below) to help better regulate implicit biases in the classroom. Several instructors describe positive results from classroom exercises that teach students about the concept of implicit bias, which also involve taking the IAT and facilitating discussions about the experience (e.g., Adams, Devos, Rivera, Smith & Vega, 2014; Ghoshal, Lippard, Ribas, & Muir, 2012).
- Blind grading (i.e., hiding a student’s name on a paper or test) can eliminate the cues for implicit bias (Killpack & Melón, 2016). Transparent and clearly defined grading protocols (e.g., grading papers with rubrics, which are distributed to students in advance) also can provide structures to mitigate bias (Thompson & Sekaquaptewa, 2002).
- Exposure to the diversity of contributors to, and members of, the field may help lessen implicit bias. One study indicated that showing students images of African-American exemplars lessened IAT-identified racial preferences in the short term (Dasgupta & Greenwald, 2001). Banaji & Greenwald (2013, p. 151) suggest that a screensaver of counterstereotypical human images may have a similar effect for instructors.
- Create structures for more equitable participation in the classroom, especially to structure pair, team and group experiences (Thompson & Sekaquaptewa, 2002). Examples include clearly defined roles for group members.
Additional Resources
- Project Implicit: The Implicit Association Test (IAT) is available at the site. The IAT is designed to measure automatic attitudes and beliefs that may not be apparent to the respondent.
- Staats, C. (2015-16, Winter). Understanding implicit bias: What educators should know. American Educator, 39(4): 29-33, 43. This accessible article summarizes educational research on implicit bias and offers strategies to mitigate its effects.
- Killpack & Melón (2016). Toward Inclusive STEM Classrooms: What Personal Role Do Faculty Play?: This article offers a definition of implicit bias and three strategies that instructors can use to mitigate it. Although written in a STEM education journal, some strategies are broadly applicable to other disciplines (e.g., blind grading).
Adams, V.H., Devos, T., Rivera, L.M., Smith, H., & Vega, L.A. (2014). Teaching about implicit prejudices and stereotypes: A pedagogical demonstration. Teaching of Psychology, 41(3): 204-212.
Banaji, M.R. & Greenwald, A.G. (2013). Blindspot: Hidden biases of good people. New York: Delacorte Press.
Dasgupta, N., & Greenwald, A.G. (2001). On the malleability of racial attitudes: Combating automatic prejudice with images of admired and disliked individuals. Journal of Personality and Social Psychology, 81(5): 800-814.
Dovidio, J. F., Kawakami, K., & Gaertner, S. L. (2002). Implicit and explicit prejudice and interracial interaction. Journal of Personality and Social Psychology, 82, 62–68.
Ghoshal, R.A., Lippard, C., Ribas, V., & Muir, K. (2012). Beyond bigotry: Teaching about unconscious prejudice. Teaching Sociology, 41(2): 130-143.
Grunspan, D.Z., Eddy, S.L., Brownell, S.E., Wiggins, B.L., Crowe, A.J., Goodreau, S.M. (2016). Males underestimate academic performance of their female peers in undergraduate biology classrooms. PLOS One, 11(2): Available: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0148405
Kiefer, A.K., & Sekaquaptewa, D. (2007). Implicit stereotypes, gender identification, and math-related outcomes: A prospective study of female college students. Psychological Science, 18(1): 13-18.
Killpack, T. L., & Melón, L. C. (2016). Toward inclusive STEM classrooms: What personal role do faculty play? CBE Life Sciences Education, 15(3). Available: https://www.lifescied.org/doi/10.1187/cbe.16-01-0020
Thompson, M., & Sekaquaptewa, D. (2002). When being different is detrimental: Solo status and the performance of women and minorities. Analysis of Social Issues and Public Policy, 2(1): 183-203.
Wolbring, T., & Treischl, E. (2016). Selection bias in students’ evaluation of teaching: Causes of student absenteeism and its consequences for course ratings and rankings. Research in Higher Education, 57: 51-71.
What To Expect
The unit is constructed around three short videos that explain different facets, or features, of implicit bias, including examples of creative research that demonstrate how implicit bias works and how people can counteract its effects. You'll view each video and follow the instructions to:
- take notes (in words, phrases, or complete sentences),
- answer questions (using complete sentences),
- complete surveys on your personal reflections, and/or
- write a summary (in the form of a paragraph or two).
There are three main purposes of these assignments:
- To help you understand the societal impact of implicit bias and remember what you've learned.
- To facilitate personal reflection on how implicit bias has influenced your life.
- To provide you with a comprehensive set of notes to complete your final assignment, an essay. You will synthesize facts you’ve learned with personal reflections to create an evidence-based opinion piece, answering the question: What role does implicit bias play in human relationships and what can be done about it?
Note: Learning about implicit bias can remind us of times when we have mistreated others or when we have been treated unfairly. If you encounter moments that are disturbing, find people you can talk to; classmates, family members, or your teacher would all be good choices.
So, if you're ready to start exploring a really important component of human interactions in the contemporary world, go to the next page.
Understanding biases and their impact on our perceptions.
Has anyone ever accused you of being biased? What was your reaction? The typical one is "Biased? Not me!" If that was yours, I am sorry to burst your bubble: everyone holds cognitive biases in multiple forms and dimensions. In my decades of experience running businesses and developing others, I have found that our personal biases get in the way of good results more than any other factor. Identifying your biases is an essential part of the self-awareness journey that leads us to become more emotionally intelligent human beings, as well as better business leaders.
Biases have been studied extensively in both psychology and behavioral economics. There is a lot of discussion over whether all biases are negative or if some can result in useful attitudes or behaviors.
Cognitive bias is a general term that many psychologists and other behavioral experts use to describe a systematic error in how people perceive others or their environment. Individuals, whether we are talking about our neighbors or coworkers, filter or perceive information based on their own past experiences. When an individual constructs their own subjective social reality based on their past perceptions and not on objective input, we classify their behavior as being cognitively biased.
The list of cognitive biases is evolving, with nearly 200 already classified. It is beyond the scope of this article to help you understand all of these biases, but I will touch on the few that I often see affecting personal and business relationships.
Fundamental Attribution Error (FAE): FAE happens when we are too quick to label people or their actions. These labels are hard to erase and can lead to a tainted view of a situation. What gives rise to FAE is the assumption that what a person does reflects who they are. For example, if someone acts badly, we assume that they did so because they have bad character, but if we ourselves were in the same situation and acted the same way, then we did so because the tough position or environment we found ourselves in required us to act that way. This also often reflects how people perceive others' accomplishments and whether or not we think they deserve the rewards associated with their actions.
In organizations, people constantly falter due to FAE. There are several ways to avoid falling into this trap. For starters, assume positive intent on the part of the other person. People are not inherently evil or devious. Consider any external factors that might have caused the other person’s behavior, and think about how you might have dealt with those factors. If you still cannot figure it out, then ask the person, in a nonjudgmental way, about their behavior. After all, no one has better insight into what they were thinking than the person themselves.
Confirmation Bias: Confirmation bias is one of the most commonly occurring judgment biases. It is widely described as a tendency to search for validation and ways to reaffirm our preexisting beliefs or hypotheses: a trick our minds play to highlight small pieces of information that confirm what we already believe. Once you become aware of it, you will start to notice yourself doing it all the time. To overcome confirmation bias, especially when working with others in a workplace, ask yourself at each step whether you are being as objective and unbiased as possible. It helps to mentally consider the opposite of your belief in search of the truth, or to try to prove yourself wrong. You can only discover the truth when you have considered all of the facts, including those that do not support your original belief.
Self-Serving Bias: Self-serving bias occurs when we perceive a situation or facts in a way that lets us see ourselves and our actions in the most positive and advantageous light possible. Many of us do this without even realizing it. Think about it: we often give ourselves credit for good outcomes but do not blame ourselves for the bad ones. That does not mean we must take all the blame for bad outcomes, but we should be willing to explore our role in what happened and reflect on the external factors that might have influenced the outcome. The field of positive psychology advocates holding onto a positive self-image, and that absolutely works, as long as we are balanced in our view of the objective facts. As with many other biases, the antidote starts with honest reflection and self-awareness.
In talking with people about whether they are biased, I have often posed two simple questions: "Is it true?" and then "How do you know it's true?" Almost every time, the answer I get to the first question is a resounding yes. What follows in exploring the second question is where true learning starts to happen. After a healthy exploration of the objective facts, including the possible beliefs and environmental conditions affecting all involved, people begin to understand that they have subjectively created what they believe to be a reality.
Reducing biases is an important part of our personal and business lives, particularly with respect to judgment and decision making. Biased judgment and decision making exist in all domains, including every industry and our everyday lives. The only way you can change this is to become aware of it. The next time you are at odds with someone, ask yourself, “Is my belief true?” and “How do I know it is true?”
Understanding your biases
Two WashU researchers who conduct studies on bias and its impacts, Calvin Lai and Clara Wilkins, explain the roots and consequences of bias and how we can potentially reduce it.
If there is one thing you need to know about biases, it is that you have them.
When we see the word “bias” in the news, it is usually in connection with a terrible injustice, like someone being passed over for a job, or worse, targeted by law enforcement because of their gender, race, or nationality. We tend to think of people who behave in biased ways as bad people who take extreme actions to exclude others. No one wants to admit to being biased.
According to researchers in psychological and brain sciences, however, biases are often at least partly unconscious. Despite this, they profoundly impact the way we interact with the world and tend to perpetuate much of the inequality that exists in our society.
The basics: What is bias?
If we want to decrease harmful biases, we need to first understand what bias is. Clara Wilkins, assistant professor of psychological and brain sciences, says that when she teaches bias in the classroom, she breaks it down into three components, often referred to as the "ABCs" of bias. The "A," or affective component, is prejudice: negative feelings toward a person based on their group membership. The "B," or behavioral component, is discrimination: the actual actions taken against a person based on their group membership. And the "C," or cognitive component, is stereotypes: generalizations about a group. Wilkins' Social Perception and Intergroup Attitudes Lab studies all of these components of bias.
Calvin Lai, assistant professor of psychological and brain sciences, says that although the bias we hear about in the news is usually harmful, bias itself is not always negative. He says that "the way that psychological scientists define bias is just a tendency to respond one way compared to another when making some kind of a life choice." Sometimes these biases can be completely neutral, like a bias for Coke over Pepsi, and can even be helpful in allowing you to make decisions more rapidly.
Not all biases are so harmless, however. As Lai notes, “Bias can often lead us in directions that we don’t expect, that we don’t intend, and that we might even disagree with if we knew that it was nudging us in a particular way.” These are the kinds of biases that can be harmful when people allow them to impact their behavior toward certain groups, and the biases that his Diversity Science Lab is attempting to redirect through their research.
Wilkins states that most people are hesitant to see themselves as participating in bias, but that we need to be aware that we can behave in harmful ways, even if we consciously support equality. She says, “Good people also exhibit bias. I think if we have this image of a racist person as a member of the KKK who does something really really violent, that is going to exclude a lot of acts that actually reinforce social inequality.” Understanding that even “good” people can be biased allows us to be more open to exploring our own biases and take actions to prevent acting on them.
“Bias can often lead us in directions that we don’t expect, that we don’t intend."
Studying unconscious biases
Because so many people are reluctant to admit, and are often even unaware of, their biases, it is difficult for researchers to learn what biases the participants they are studying hold. To counter this problem, researchers have developed something called the Implicit Association Test.
Harvard first developed the Implicit Association Test in 1998 to test peoples’ implicit biases by looking at how strongly they associate different concepts with different groups of people. Lai is the Director of Research at Project Implicit, a non-profit that uses online IATs both to collect research data and to help inform the general public about biases, and says that the vast amount of data collected through these tests over the last two decades has allowed researchers to track biases and see how certain demographic factors, including a person’s location, age, and race, can impact their biases.
IATs have consistently shown that people are faster to associate white people with good things and black people with bad things than vice versa, which demonstrates how these pairings are subconsciously linked together in their memories. Researchers have also developed different IATs to test for the associations participants make on the basis of gender, religion, weight, sexuality, age, and a host of other identity categories. If you want to see what one of these tests is like, you can take one yourself at Project Implicit .
Consequences of bias
IATs have a lot to tell us about the possible prevalence and consequences of bias. Lai states that there is a correlation between how people perform on IATs and the way they behave toward different groups. He states, “We do find that these implicit biases do correlate with how people act, and they often do so over and above what people can report, what they can actually say about themselves.” He has worked with a number of different populations to help them understand their biases better, and how those biases can lead to certain groups being treated differently in healthcare, academia, and the job market.
Biases have the potential to do the most harm when they are acted on by people in positions of relative power, whether they be healthcare professionals, employers, or law enforcement officers. We have all heard in the news of how bias can even lead to deadly encounters in situations in which people have to make snap judgments about the risk a person poses. In one of his current projects, Lai is working in collaboration with the Anti-Defamation League to help police officers better understand their biases. His research will help determine if an educational workshop on biases can impact the way law enforcement interacts with different populations.
Wilkins also studies how bias manifests in groups with power differentials. She has found that groups that believe the current hierarchy is fair tend to double down on these beliefs and behave in a more discriminatory way when they feel that hierarchy is threatened. In a recent study, Wilkins found that men who held status-legitimizing beliefs were more likely to penalize women when reviewing job resumes after being exposed to an article about men being discriminated against. This finding is particularly troubling because she has also found evidence that men have increasingly perceived men to be the victims of discrimination in recent years, which means that these reactionary behaviors against women might also be increasing.
"It is sort of ironic, your idea about fairness actually leads you to behave in an unfair way."
Wilkins explains status-legitimizing beliefs by saying, “Society is structured where some groups have better access to resources than others, so they have more income, wealth, power, etc. than other groups. Status-legitimizing ideologies are ideologies that make that inequality seem fair and legitimate.” She used the idea of meritocracy as an example of a status-legitimizing belief, saying that if people believe that hard work always results in success, they will be likely to see people who are struggling financially as simply not working hard enough and overlook structural inequalities in things like access to education and healthcare. She says, “It is sort of ironic, your idea about fairness actually leads you to behave in an unfair way.”
Wilkins says that opposition to affirmative action is an example of the way status-legitimizing beliefs can make it difficult for people to acknowledge structural inequalities like the ones that were illuminated with the recent admissions scandal involving wealthy parents. “There are a lot of people who are opposed to affirmative action because they think it disadvantages people who are not racial minorities, when there are other structural things like donations or legacy admissions or other things that aren’t based just on merit that disadvantage particular groups.”
Wilkins says that some of the backlash we witnessed after Obama's presidency is a result of perceived threats to the status quo, which cause groups with power, like white men, to behave negatively toward groups they see as potentially disrupting traditional power dynamics. This reaction might be driven by zero-sum beliefs that treat increases in rights or advantages for one group as automatically decreasing advantages for another. These beliefs are often so deeply held that people might not even consciously recognize them, but they can significantly impact behavior.
Avoiding biased actions
So how do we avoid being biased? When it comes to changing your implicit unconscious biases, like the ones the IAT tests for, research has consistently shown that it is more difficult than you would think. Lai says, “It does seem that trying to change implicit bias right now directly, trying to reduce these associations that swirl around in our head, that seems really hard. They are built up over a lifetime of experience and it seems that if you can change them, it requires a lot of sustained effort.” In other words, a quick diversity training, while potentially helpful in getting people to start thinking about their biases, is not going to immediately change the way their brains associate white people with good things and black people with negative things.
Wilkins similarly says that she does not believe that progress toward a less biased world is linear. As her research has shown, the societal changes that we might see as progress are often accompanied by backlash when threats to the established order cause people to double down on their biases, whether consciously or unconsciously.
In spite of these somewhat bleak findings, however, both Lai and Wilkins are optimistic that there are things that we can do to reduce biased actions, even if we can’t completely eliminate biased thoughts. Recently, Wilkins has been researching ways to reduce people’s zero-sum beliefs. She is currently working on a study in which she has Christians read a Bible verse that promotes tolerance to illustrate that acceptance of different groups, like LGBTQ individuals, is not incompatible with Christian values. So far she has found that exposing participants to this verse decreases zero-sum beliefs and increases tolerance. Although she does not yet know how permanent these changes will prove to be, she is taking it as a hopeful sign.
For readers who want to behave with less bias, Lai and Wilkins both say that being aware of your bias is the first step. Lai argued for a systematic approach to tracking our own biases: “I think the big thing is, we’re all susceptible to bias. It’s really important to just keep records, track, in your own life where it might be happening, and then design good procedures or habits so you don’t act on them.” He says that there are simple things that people can do to avoid letting their biases influence decisions, like blanking out names on resumes so that an applicant’s gender or racial identity can’t influence hiring decisions.
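As a toy illustration of the kind of procedural safeguard Lai describes, the sketch below (in Python, with invented field names and applicant data) masks identifying fields before records reach a reviewer. It is not a tool from the research discussed here, just a minimal sketch of the "blank out the names" idea.

```python
def blind_records(applications, hidden_fields=("name",)):
    """Return copies of application records with identifying fields
    masked, so reviewers see qualifications but not names."""
    return [
        {k: ("[REDACTED]" if k in hidden_fields else v) for k, v in app.items()}
        for app in applications
    ]

# Hypothetical applicant data (names echo Bertrand & Mullainathan's resume study).
applications = [
    {"name": "Jamal Washington", "degree": "BSc Biology", "years_experience": 4},
    {"name": "Emily Walsh", "degree": "BSc Biology", "years_experience": 4},
]

for app in blind_records(applications):
    print(app)
```

The point of a procedure like this is that it removes the cue for bias at the decision point, rather than relying on the reviewer's willpower in the moment.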
Wilkins also says that we should be more aware of why we are drawn to certain people over others, and to go out of our way to avoid acting on these biases. “None of it is an easy solution,” she says, “I think it also requires a lot of motivation… It is not enough to not be racist or sexist. There need to be anti-racist and anti-sexist efforts because these behaviors are so entrenched in our society that it will be difficult to make real, sustained progress.”
Implicit Bias Module Series
Welcome to the Kirwan Institute for the Study of Race & Ethnicity’s Implicit Bias Module Series. At the Kirwan Institute at The Ohio State University, we are committed to the creation of a just and inclusive society, where all people and communities have the opportunity to succeed. Our commitment to this mission is why we work so hard to understand and overcome barriers that prevent access to opportunity in our society, such as implicit bias and racial disparities in our education system.
This course will introduce you to insights about how our minds operate and help you understand the origins of implicit associations. You will also uncover some of your own biases and learn strategies for addressing them. Each module is divided into a short series of lessons, many taking less than 10 minutes to complete. That way, even if you’re pressed for time, you can complete the lessons and modules at your convenience.
We are excited that you are starting this process to explore implicit bias and what its operation means for your decisions and actions. Thank you for joining us!
The Kirwan Institute for the Study of Race and Ethnicity is an interdisciplinary engaged research institute at The Ohio State University established in May 2003. As a racial equity research institute, our goal is to connect individuals and communities with opportunities needed for thriving by educating the public, building the capacity of allied social justice organizations, and investing in efforts that support racial equity and inclusion. Here at the Kirwan Institute, we do this through research, engagement, and communication.
Why focus on implicit rather than explicit bias?
Although our modules focus primarily on implicit bias, Kirwan acknowledges that inclusion and equity efforts must also address explicit bias and discrimination in order to create real change. Our explicit and implicit attitudes are related constructs, and many times people's implicit and explicit attitudes are in alignment.
However, even though the concepts are related, they are distinct. Someone can act in a biased manner based on their implicit associations, even if they do not indicate an explicit preference for certain individuals or groups. Learning about implicit bias provides a lens to help examine the causes of racial, gender, or other social disparities, even in the absence of explicit intent to discriminate or cause harm.
How long will it take me to complete these modules?
While everyone works at a different pace, these modules should take participants roughly 45 minutes to 1.5 hours to complete, depending on the time spent on activities and supplemental exercises.
Are these modules supported by research?
Yes. The methods and research shared in these modules are supported by our annual flagship publication, the State of the Science: Implicit Bias Review. Each year, Kirwan researchers compile studies on the subject of implicit bias into an interdisciplinary literature review, in a format that is accessible and easy to understand for readers from a wide range of academic and professional backgrounds.
Does implicit bias reflect my beliefs about equity or inclusion?
We will get more into this during Module 1, but our implicit preferences do not necessarily align with our explicit beliefs. For example, one can believe in equality of all people and still hold a pro-self-identity bias. Importantly, some people possess implicit attitudes that do not align with their own held identities.
If I didn’t intend to be biased in the first place, how will learning about implicit bias help?
Becoming aware of the biases you possess, and of the decisions most likely to be influenced by your unconscious processing, can help you build interventions and strategies to prevent the expression of bias and unwanted outcomes. Our training also includes information about empirically based interventions to both reduce the expression of bias and alter the associations we possess.
What is the connection between implicit bias and how people act?
Implicit bias has been shown to impact decision-making across a wide array of sectors, including employment, medicine, and education. However, there are limits to how well unconscious biases can predict individual behavior: people with an implicit preference for one identity may not act on this bias or make biased decisions, and much depends on the circumstances.
Does this training apply to me if I don’t have implicit bias?
We will touch on this during the modules, but because biases can be activated across a wide range of identities, we all hold some implicit preferences. However, even people without an implicit preference can still act in ways that produce discriminatory outcomes, such as not speaking up when they see bias in their environment. This need to translate knowledge into action is why Kirwan also emphasizes the importance of being an Active Bystander.
Will these modules eliminate bias?
These modules are not designed to eliminate bias. Rather, we hope that an awareness of implicit bias and how it operates will help participants engage in more equitable decision-making practices and behaviors. These strategies to reduce the expression of bias are only one piece of the puzzle and should be complemented by policies and strategies to address institutional and explicit discrimination.
I'm not a K–12 educator. Are these modules for me?
This module series includes examples and activities that are tailored to the experience of those who work in a K–12 education setting or a closely related field. However, much of the content in Modules 1, 3, and 4 is generalizable enough to apply to most audiences.
Unconscious Bias Training That Works
- Francesca Gino
- Katherine Coffman
To become more diverse, equitable, and inclusive, many companies have turned to unconscious bias (UB) training. By raising awareness of the mental shortcuts that lead to snap judgments—often based on race and gender—about people’s talents or character, it strives to make hiring and promotion fairer and improve interactions with customers and among colleagues. But most UB training is ineffective, research shows. The problem is, increasing awareness is not enough—and can even backfire—because sending the message that bias is involuntary and widespread may make it seem unavoidable.
UB training that gets results, in contrast, teaches attendees to manage their biases, practice new behaviors, and track their progress. It gives them information that contradicts stereotypes and allows them to connect with colleagues whose experiences are different from theirs. And it’s not a onetime session; it entails a longer journey and structural organizational changes.
In this article the authors describe how rigorous UB programs at Microsoft, Starbucks, and other organizations help employees overcome denial and act on their awareness, develop the empathy that combats bias, diversify their networks, and commit to improvement.
Increasing awareness isn’t enough. Teach people to manage their biases, change their behavior, and track their progress.
Idea in Brief
The Problem
Conventional training to combat unconscious bias and make the workplace more diverse, equitable, and inclusive isn’t working.
This training aims to raise employees’ awareness of biases based on race or gender. But by also sending the message that such biases are involuntary and widespread, it can make people feel that they’re unavoidable.
The Solution
Companies must go beyond raising awareness and teach people to manage biases and change behavior. Firms should also collect data on diversity, employees’ perceptions, and training effectiveness; introduce behavioral “nudges”; and rethink policies.
Across the globe, in response to public outcry over racist incidents in the workplace and mounting evidence of the cost of employees’ feeling excluded, leaders are striving to make their companies more diverse, equitable, and inclusive. Unconscious bias training has played a major role in their efforts. UB training seeks to raise awareness of the mental shortcuts that lead to snap judgments—often based on race and gender—about people’s talents or character. Its goal is to reduce bias in attitudes and behaviors at work, from hiring and promotion decisions to interactions with customers and colleagues.
- Francesca Gino is a behavioral scientist and the Tandon Family Professor of Business Administration at Harvard Business School. She is the author of Rebel Talent and Sidetracked.
- Katherine Coffman is an associate professor of business administration at Harvard Business School. Her research focuses on how stereotypes affect beliefs and behavior.
How Does Implicit Bias Influence Behavior?
Strategies to Reduce the Impact of Implicit Bias
Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."
Akeem Marsh, MD, is a board-certified child, adolescent, and adult psychiatrist who has dedicated his career to working with medically underserved communities.
An implicit bias is an unconscious association, belief, or attitude toward any social group. Implicit biases are one reason why people often attribute certain qualities or characteristics to all members of a particular group, a phenomenon known as stereotyping.
It is important to remember that implicit biases operate almost entirely on an unconscious level. While explicit biases and prejudices are intentional and controllable, implicit biases are less so.
A person may even express explicit disapproval of a certain attitude or belief while still harboring similar biases on a more unconscious level. Such biases do not necessarily align with our own sense of self and personal identity. People can also hold positive or negative associations about their own race, gender, religion, sexuality, or other personal characteristics.
Causes of Implicit Bias
While people might like to believe that they are not susceptible to these implicit biases and stereotypes, the reality is that everyone engages in them whether they like it or not. This reality, however, does not mean that you are necessarily prejudiced or inclined to discriminate against other people. It simply means that your brain is working in a way that makes associations and generalizations.
We are influenced by our environment and by the stereotypes that already exist in the society into which we are born; it is generally impossible to separate ourselves entirely from that influence.
You can, however, become more aware of your unconscious thinking and the ways in which society influences you.
It is the natural tendency of the brain to sift, sort, and categorize information about the world that leads to the formation of these implicit biases. We're susceptible to bias because of these tendencies:
- We tend to seek out patterns. Implicit bias occurs because of the brain's natural tendency to look for patterns and associations in the world. Social cognition, or our ability to store, process, and apply information about people in social situations, is dependent on this ability to form associations about the world.
- We like to take shortcuts. Like other cognitive biases, implicit bias is a result of the brain's tendency to try to simplify the world. Because the brain is constantly inundated with more information than it could conceivably process, mental shortcuts make it faster and easier for the brain to sort through all of this data.
- Our experiences and social conditioning play a role. Implicit biases are influenced by experiences, although these attitudes may not be the result of direct personal experience. Cultural conditioning, media portrayals, and upbringing can all contribute to the implicit associations that people form about the members of other social groups.
How Implicit Bias Is Measured
The term implicit bias was coined by social psychologists Mahzarin Banaji and Tony Greenwald in 1995. In an influential paper introducing their theory of implicit social cognition, they proposed that social behavior is largely influenced by unconscious associations and judgments.
In 1998, Banaji and Greenwald published their now-famous Implicit Association Test (IAT) to support their hypothesis. The test uses a computer program to show respondents a series of images and words and to measure how long it takes them to choose between two things.
Subjects might be shown images of faces of different racial backgrounds, for example, in conjunction with either a positive word or a negative word. Subjects would then be asked to click on a positive word when they saw an image of someone from one race and to click on a negative word when they saw someone of another race.
Interpreting the Results
The researchers suggest that faster responses indicate a stronger unconscious association. If a person quickly clicks on a negative word every time they see a person of a particular race, this would indicate that they hold an implicit negative bias toward individuals of that race.
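To make the scoring idea concrete, here is a minimal Python sketch. It is not Project Implicit's actual scoring code; it only mirrors the general logic of the IAT's D-score, which scales the latency gap between pairing conditions by the variability of all responses. The function name and response times are invented for illustration.

```python
from statistics import mean, stdev

def iat_style_score(congruent_ms, incongruent_ms):
    """Toy IAT-style score: the gap in mean response latency between the
    incongruent and congruent pairing blocks, scaled by the standard
    deviation of all trials. Larger positive values suggest a stronger
    unconscious association with the congruent pairing."""
    gap = mean(incongruent_ms) - mean(congruent_ms)
    return gap / stdev(congruent_ms + incongruent_ms)

# Invented latencies (milliseconds) for one respondent.
congruent = [620, 580, 640, 600, 610]     # pairing matches the association
incongruent = [780, 820, 760, 800, 790]   # pairing conflicts with it
print(round(iat_style_score(congruent, incongruent), 2))
```

The real test also filters out error trials and extreme latencies before scoring, which is part of why a simplified version like this should be read as a sketch, and why single administrations can be unreliable, as noted below.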
In addition to a test of implicit racial attitudes, the IAT has also been utilized to measure unconscious biases related to gender, weight, sexuality, disability, and other areas. The IAT has grown in popularity and use over the last decade, yet has recently come under fire.
Among the main criticisms are findings that the test results may lack reliability. Respondents may score high on racial bias on one test and low the next time they are tested.
Also of concern is that scores on the test may not necessarily correlate with individual behavior. People may score high for a type of bias on the IAT, but those results may not accurately predict how they would relate to members of a specific social group.
Link Between Implicit Bias and Discrimination
It is important to understand that implicit bias is not the same thing as racism, although the two concepts are related. Overt racism involves conscious prejudice against members of a particular racial group and can be influenced by both explicit and implicit biases.
Other forms of discrimination that can be influenced by unconscious biases include ageism, sexism, homophobia, and ableism.
One of the benefits of being aware of the potential impact of implicit social biases is that you can take a more active role in overcoming social stereotypes, discrimination, and prejudice.
Effects of Implicit Bias
Implicit biases can influence how people behave toward the members of different social groups. Researchers have found that such bias can have effects in a number of settings, including in school, work, and legal proceedings.
Implicit Bias in School
Implicit bias can lead to a phenomenon known as stereotype threat in which people internalize negative stereotypes about themselves based upon group associations. Research has shown, for example, that young girls often internalize implicit attitudes related to gender and math performance.
By the age of 9, girls have been shown to exhibit the unconscious belief that females prefer language over math. The stronger these implicit beliefs are, the less likely girls and women are to pursue math in school. Such unconscious beliefs are also believed to play a role in discouraging women from pursuing careers in science, technology, engineering, and mathematics (STEM) fields.
Studies have also demonstrated that implicit attitudes can also influence how teachers respond to student behavior, suggesting that implicit bias can have a powerful impact on educational access and academic achievement.
One study, for example, found that Black children—and Black boys in particular—were more likely to be expelled from school for behavioral issues. When teachers were told to watch for challenging behaviors, they were more likely to focus on Black children than on White children.
Implicit Bias In the Workplace
While the Implicit Association Test itself may have pitfalls, these problems do not negate the existence of implicit bias, or the existence and effects of bias, prejudice, and discrimination in the real world. Such prejudices can have very real and potentially devastating consequences.
One study, for example, found that when Black and White job seekers sent out similar resumes to employers, Black applicants were half as likely to be called in for interviews as White job seekers with equal qualifications.
Such discrimination is likely the result of both explicit and implicit biases toward racial groups.
Even when employers strive to eliminate potential bias in hiring, subtle implicit biases may still have an impact on how people are selected for jobs or promoted to advanced positions. Avoiding such biases entirely can be difficult, but being aware of their existence and striving to minimize them can help.
Implicit Bias in Healthcare Settings
Age, race, or health condition should certainly not play a role in how patients get treated. However, implicit bias can influence the quality of healthcare and have long-term impacts, including suboptimal care, adverse outcomes, and even death.
For example, one study published in the American Journal of Public Health found that physicians with high scores in implicit bias tended to dominate conversations with Black patients and, as a result, the Black patients had less confidence and trust in the provider and rated the quality of their care lower.
Researchers continue to investigate implicit bias in relation to other ethnic groups as well as specific health conditions, including type 2 diabetes, obesity, mental health, and substance use disorders.
Implicit Bias in Legal Settings
Implicit biases can also have troubling implications in legal proceedings, influencing everything from initial police contact all the way through sentencing. Research has found that there is an overwhelming racial disparity in how Black defendants are treated in criminal sentencing.
Not only are Black defendants less likely to be offered plea bargains than White defendants charged with similar crimes, but they are also more likely to receive longer and harsher sentences than White defendants.
Strategies to Reduce the Impact of Implicit Bias
Implicit biases impact behavior, but there are things you can do to reduce your own bias. Some ways to reduce the influence of implicit bias:
- Focus on seeing people as individuals. Rather than focusing on stereotypes to define people, spend time considering them on a more personal, individual level.
- Work on consciously changing your stereotypes. If you recognize that your response to a person might be rooted in biases or stereotypes, make an effort to consciously adjust your response.
- Take time to pause and reflect. In order to reduce reflexive reactions, take time to reflect on potential biases and replace them with positive examples of the stereotyped group.
- Adjust your perspective. Try seeing things from another person's point of view. How would you respond if you were in the same position? What factors might contribute to how a person acts in a particular setting or situation?
- Increase your exposure. Spend more time with people of different racial backgrounds. Learn about their culture by attending community events or exhibits.
- Practice mindfulness. Try meditation, yoga, or focused breathing to increase mindfulness and become more aware of your thoughts and actions.
While implicit bias is difficult to eliminate altogether, there are strategies that you can utilize to reduce its impact. Taking steps such as actively working to overcome your biases, taking other people's perspectives, seeking greater diversity in your life, and building your awareness about your own thoughts are a few ways to reduce the impact of implicit bias.
A Word From Verywell
Implicit biases can be troubling, but they are also a pervasive part of life. Perhaps more troubling, your unconscious attitudes may not necessarily align with your declared beliefs. While people are more likely to hold implicit biases that favor their own in-group, it is not uncommon for people to hold biases against their own social group as well.
The good news is that these implicit biases are not set in stone. Even if you do hold unconscious biases against other groups of people, it is possible to adopt new attitudes, even on the unconscious level. This process is not necessarily quick or easy, but being aware of the existence of these biases is a good place to start making a change.
Jost JT. The existence of implicit bias is beyond reasonable doubt: A refutation of ideological and methodological objections and executive summary of ten studies that no manager should ignore. Research in Organizational Behavior. 2009;29:39-69. doi:10.1016/j.riob.2009.10.001
Greenwald AG, McGhee DE, Schwartz JL. Measuring individual differences in implicit cognition: The Implicit Association Test. J Pers Soc Psychol. 1998;74(6):1464-1480. doi:10.1037/0022-3514.74.6.1464
Sabin J, Nosek BA, Greenwald A, Rivara FP. Physicians' implicit and explicit attitudes about race by MD race, ethnicity, and gender. J Health Care Poor Underserved. 2009;20(3):896-913. doi:10.1353/hpu.0.0185
Capers Q, Clinchot D, McDougle L, Greenwald AG. Implicit racial bias in medical school admissions. Acad Med. 2017;92(3):365-369. doi:10.1097/ACM.0000000000001388
Kiefer AK, Sekaquaptewa D. Implicit stereotypes and women's math performance: How implicit gender-math stereotypes influence women's susceptibility to stereotype threat. J Exp Soc Psychol. 2007;43(5):825-832. doi:10.1016/j.jesp.2006.08.004
Steffens MC, Jelenec P, Noack P. On the leaky math pipeline: Comparing implicit math-gender stereotypes and math withdrawal in female and male children and adolescents. J Educ Psychol. 2010;102(4):947-963. doi:10.1037/a0019920
Edward Zigler Center in Child Development & Social Policy, Yale School of Medicine. Implicit Bias in Preschool: A Research Study Brief.
Pager D, Western B, Bonikowski B. Discrimination in a low-wage labor market: A field experiment. Am Sociol Rev. 2009;74(5):777-799. doi:10.1177/000312240907400505
Malinen S, Johnston L. Workplace ageism: Discovering hidden bias. Exp Aging Res. 2013;39(4):445-465. doi:10.1080/0361073X.2013.808111
Cooper LA, Roter DL, Carson KA, et al. The associations of clinicians' implicit attitudes about race with medical visit communication and patient ratings of interpersonal care. Am J Public Health. 2012;102(5):979-987. doi:10.2105/AJPH.2011.300558
Leiber MJ, Fox KC. Race and the impact of detention on juvenile justice decision making. Crime & Delinquency. 2005;51(4):470-497. doi:10.1177/0011128705275976
Van Ryn M, Hardeman R, Phelan SM, et al. Medical school experiences associated with change in implicit racial bias among 3547 students: A medical student CHANGES study report. J Gen Intern Med. 2015;30(12):1748-1756. doi:10.1007/s11606-015-3447-7
Definition of Assignment Bias
Assignment bias refers to a type of bias that occurs in research or experimental studies when the assignment of participants to different groups or conditions is not randomized or is influenced by external factors.
Understanding Assignment Bias
Randomized assignment or allocation of participants to different groups is a fundamental principle in research studies that aims to eliminate assignment bias.
Causes of Assignment Bias
Assignment bias can arise due to several reasons:
- Non-randomized allocation: When participants are not randomly assigned to different groups, their characteristics may influence the assignment, introducing bias into the study. This can occur when researchers purposefully assign participants based on certain characteristics or when participants self-select into a specific group.
- External factors: Factors external to the research design, such as the preferences of researchers or unequal distribution of participants based on certain characteristics, may unintentionally affect the assignment process.
- Selection bias: If participants are not selected randomly from the population under study, the assignment process can be biased, impacting the validity and generalizability of the results.
Effects of Assignment Bias
Assignment bias can have various consequences:
- Inaccurate estimation: The inclusion of biased assignment methods can lead to inaccurate estimations of treatment effects, making it difficult to draw reliable conclusions from the study.
- Reduced internal validity: Assignment bias threatens the internal validity of a study because it hampers the ability to establish a causal relationship between the independent variable and the observed outcomes.
- Compromised generalizability: The presence of assignment bias may limit the generalizability of research findings to a larger population, as the biased assignment may not appropriately represent the target population.
Strategies to Minimize Assignment Bias
To minimize assignment bias, researchers can undertake the following strategies (a brief code sketch of randomization and stratification follows the list):
- Randomization: Random allocation of participants to different groups reduces the likelihood of assignment bias by ensuring that each participant has an equal chance of being assigned to any group.
- Blinding: Adopting blind procedures, such as single-blind or double-blind designs, helps prevent the influence of researcher or participant bias on the assignment process.
- Stratification: Stratifying participants based on certain important variables prior to assignment can ensure a balance of these variables across different groups and minimize the impact of confounding factors.
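As promised above, here is a minimal Python sketch of simple randomization and of stratified randomization. The participant records, group labels, and stratifying variable are all invented for illustration; real trials would rely on vetted randomization tooling rather than ad-hoc code like this.

```python
import random

participants = [
    {"id": 1, "sex": "F"}, {"id": 2, "sex": "M"},
    {"id": 3, "sex": "F"}, {"id": 4, "sex": "M"},
    {"id": 5, "sex": "F"}, {"id": 6, "sex": "M"},
]

def simple_randomization(people, groups=("treatment", "control")):
    """Shuffle everyone, then deal them out to the groups in turn, so each
    participant has an equal chance of landing in any group."""
    shuffled = random.sample(people, k=len(people))
    return {g: shuffled[i::len(groups)] for i, g in enumerate(groups)}

def stratified_randomization(people, stratum_key, groups=("treatment", "control")):
    """Randomize separately within each stratum (here, sex) so that the
    stratifying variable is balanced across groups by design, not by luck."""
    assignment = {g: [] for g in groups}
    strata = {}
    for p in people:
        strata.setdefault(p[stratum_key], []).append(p)
    for members in strata.values():
        for g, assigned in simple_randomization(members, groups).items():
            assignment[g].extend(assigned)
    return assignment

print(stratified_randomization(participants, "sex"))
```

The stratified version guards against the unlucky case where, say, most female participants happen to be shuffled into one group, which simple randomization cannot rule out in small samples.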
Public safety personnel delve into better understanding bias, inequities
[Photo: Division of Public Safety members engaging in the Roots of Inequities workshop, Oct. 2, 2024.]
CU Boulder’s Division of Public Safety is continuing its commitment to learning and growth, tackling the root causes of inequities and bias in a workshop by Darrell Hammond Sr. of Higher Ground Consulting , a Boulder County-based leadership and professional development organization.
“I’m continually impressed by this group’s investment in their culture and their willingness to have those hard conversations,” said Hammond. “They are doing the work to examine their values, which guides their behaviors,” he added.
Along with focusing on historical perspectives on power structure and bias, the Roots of Inequities workshop encouraged commissioned officers and professional staff alike to develop a healthy mindset, which helps those in authority stay curious about their impact on others.
This training builds on a previous workshop, Mindset Matters , which DPS team members completed over the summer. “A healthy mindset helps us remember that others matter the same way I matter, which is an important step in examining our impact,” said Hammond.
“This training is one of many that we’ve engaged in as we strive to provide the highest quality of public safety services to our diverse community,” said Interim Chief John Monahan.
Types of Bias in Research | Definition & Examples
Research bias results from any deviation from the truth, causing distorted results and wrong conclusions. Bias can occur at any phase of your research, including during data collection , data analysis , interpretation, or publication. Research bias can occur in both qualitative and quantitative research .
Understanding research bias is important for several reasons.
- Bias exists in all research, across research designs , and is difficult to eliminate.
- Bias can occur at any stage of the research process .
- Bias impacts the validity and reliability of your findings, leading to misinterpretation of data.
It is almost impossible to conduct a study without some degree of research bias. It’s crucial for you to be aware of the potential types of bias, so you can minimize them.
For example, consider a study evaluating a weight-loss program. The success rate of the program will likely be affected if participants start to drop out (attrition). Participants who become disillusioned due to not losing weight may drop out, while those who succeed in losing weight are more likely to continue. This in turn may bias the findings towards more favorable results.
Table of contents
- Information bias
- Interviewer bias
- Publication bias
- Researcher bias
- Response bias
- Selection bias
- Cognitive bias
- How to avoid bias in research
- Other types of research bias
- Frequently asked questions about research bias
Information bias , also called measurement bias, arises when key study variables are inaccurately measured or classified. Information bias occurs during the data collection step and is common in research studies that involve self-reporting and retrospective data collection. It can also result from poor interviewing techniques or differing levels of recall from participants.
The main types of information bias are:
- Recall bias
- Observer bias
- Performance bias
- Regression to the mean (RTM)
For example, suppose you are studying whether smartphone use is linked to physical symptoms. Over a period of four weeks, you ask students to keep a journal, noting how much time they spent on their smartphones along with any symptoms like muscle twitches, aches, or fatigue.
Recall bias is a type of information bias. It occurs when respondents are asked to recall events in the past and is common in studies that involve self-reporting.
As a rule of thumb, infrequent events (e.g., buying a house or a car) will be memorable for longer periods of time than routine events (e.g., daily use of public transportation). You can reduce recall bias by running a pilot survey and carefully testing recall periods. If possible, test both shorter and longer periods, checking for differences in recall.
For example, suppose you are investigating whether children's diets are linked to later being diagnosed with cancer. You ask parents to recall their children's eating habits in two groups:
- A group of children who have been diagnosed, called the case group
- A group of children who have not been diagnosed, called the control group
Since the parents are being asked to recall what their children generally ate over a period of several years, there is high potential for recall bias in the case group.
The best way to reduce recall bias is by ensuring your control group will have similar levels of recall bias to your case group. Parents of children who have childhood cancer, which is a serious health problem, are likely to be quite concerned about what may have contributed to the cancer.
Thus, if asked by researchers, these parents are likely to think very hard about what their child ate or did not eat in their first years of life. Parents of children with other serious health problems (aside from cancer) are also likely to be quite concerned about any diet-related question that researchers ask about.
Observer bias is the tendency of researchers to see what they expect or want to see, rather than what is actually occurring. Observer bias can affect the results in observational and experimental studies, where subjective judgment (such as assessing a medical image) or measurement (such as rounding blood pressure readings up or down) is part of the data collection process.
Observer bias leads to over- or underestimation of true values, which in turn compromise the validity of your findings. You can reduce observer bias by using double-blinded and single-blinded research methods.
For example, suppose you and a colleague are independently observing how medical staff at a hospital share patient information. Based on discussions you had with other researchers before starting your observations, you are inclined to think that medical staff tend to simply call each other when they need specific patient details or have questions about treatments.
At the end of the observation period, you compare notes with your colleague. Your conclusion was that medical staff tend to favor phone calls when seeking information, while your colleague noted down that medical staff mostly rely on face-to-face discussions. Seeing that your expectations may have influenced your observations, you and your colleague decide to conduct semi-structured interviews with medical staff to clarify the observed events.
Note: Observer bias and actor–observer bias are not the same thing.
Performance bias is unequal care between study groups. Performance bias occurs mainly in medical research experiments, if participants have knowledge of the planned intervention, therapy, or drug trial before it begins.
Studies about nutrition, exercise outcomes, or surgical interventions are very susceptible to this type of bias. It can be minimized by using blinding , which prevents participants and/or researchers from knowing who is in the control or treatment groups. If blinding is not possible, then using objective outcomes (such as hospital admission data) is the best approach.
When the subjects of an experimental study change or improve their behavior because they are aware they are being studied, this is called the Hawthorne effect (or observer effect). Similarly, the John Henry effect occurs when members of a control group are aware they are being compared to the experimental group. This causes them to alter their behavior in an effort to compensate for their perceived disadvantage.
Regression to the mean (RTM) is a statistical phenomenon that refers to the fact that a variable that shows an extreme value on its first measurement will tend to be closer to the center of its distribution on a second measurement.
Medical research is particularly sensitive to RTM. Here, interventions aimed at a group or a characteristic that is very different from the average (e.g., people with high blood pressure) will appear to be successful because of the regression to the mean. This can lead researchers to misinterpret results, describing a specific intervention as causal when the change in the extreme groups would have happened anyway.
For example, suppose you are evaluating a new intervention for people selected because of their severe depression. In general, among people with depression, certain physical and mental characteristics have been observed to deviate from the population mean.
This could lead you to think that the intervention was effective when those treated showed improvement on measured post-treatment indicators, such as reduced severity of depressive episodes.
However, given that such characteristics deviate more from the population mean in people with depression than in people without depression, this improvement could be attributed to RTM.
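A small simulation makes RTM concrete. The sketch below uses invented numbers (a "true" score of 50 with population spread 10, plus independent measurement noise): it selects the top 10% of people on a noisy first measurement and shows that, with no intervention at all, their second measurement drifts back toward the population mean.

```python
import random

random.seed(0)

# Invented model: true severity ~ N(50, 10); each measurement adds N(0, 10) noise.
true_scores = [random.gauss(50, 10) for _ in range(20_000)]
def measure(t):
    return t + random.gauss(0, 10)

first = [(measure(t), t) for t in true_scores]

# Select the "extreme" group: the top 10% on the noisy first measurement.
cutoff = sorted(m for m, _ in first)[int(0.9 * len(first))]
extreme = [(m, t) for m, t in first if m >= cutoff]

second = [measure(t) for _, t in extreme]
def mean(xs):
    return sum(xs) / len(xs)

print(f"population mean:                   {mean(true_scores):.1f}")
print(f"extreme group, first measurement:  {mean([m for m, _ in extreme]):.1f}")
print(f"same people, second measurement:   {mean(second):.1f}")  # closer to the mean
```

Any treatment applied between the two measurements would appear to "work" on the extreme group for this reason alone, which is one reason control groups are essential.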
Interviewer bias stems from the person conducting the research study. It can result from the way they ask questions or react to responses, but also from any aspect of their identity, such as their sex, ethnicity, social class, or perceived attractiveness.
Interviewer bias distorts responses, especially when the characteristics relate in some way to the research topic. Interviewer bias can also affect the interviewer’s ability to establish rapport with the interviewees, causing them to feel less comfortable giving their honest opinions about sensitive or personal topics.
For example, imagine you are interviewing participants about how they spend their free time:
Participant: “I like to solve puzzles, or sometimes do some gardening.”
You: “I love gardening, too!”
In this case, seeing your enthusiastic reaction could lead the participant to talk more about gardening.
Establishing trust between you and your interviewees is crucial in order to ensure that they feel comfortable opening up and revealing their true thoughts and feelings. At the same time, being overly empathetic can influence the responses of your interviewees, as seen above.
Publication bias occurs when the decision to publish research findings is based on their nature or the direction of their results. Studies reporting results that are perceived as positive, statistically significant , or favoring the study hypotheses are more likely to be published due to publication bias.
Publication bias is related to data dredging (also called p -hacking ), where statistical tests on a set of data are run until something statistically significant happens. As academic journals tend to prefer publishing statistically significant results, this can pressure researchers to only submit statistically significant results. P -hacking can also involve excluding participants or stopping data collection once a p value of 0.05 is reached. However, this leads to false positive results and an overrepresentation of positive results in published academic literature.
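To see why stopping data collection as soon as significance appears inflates false positives, it helps to simulate it. The following toy example (not a model of any real study) repeatedly tests a fair coin, checking for p < 0.05 after every batch of flips and stopping as soon as "significance" appears:

```python
import math
import random

def z_test_p(heads, n):
    """Two-sided normal-approximation p-value for H0: the coin is fair."""
    z = (heads - 0.5 * n) / math.sqrt(n * 0.25)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def experiment(rng, batch=20, max_n=200, peek=False):
    """Flip a fair coin; with peek=True, test after every batch and stop
    as soon as p < 0.05 (optional stopping)."""
    heads = n = 0
    while n < max_n:
        heads += sum(rng.random() < 0.5 for _ in range(batch))
        n += batch
        if peek and z_test_p(heads, n) < 0.05:
            return True  # declared "significant" early
    return z_test_p(heads, n) < 0.05

rng = random.Random(1)
trials = 5_000
for peek in (False, True):
    hits = sum(experiment(rng, peek=peek) for _ in range(trials))
    print(f"peeking={peek}: false-positive rate = {hits / trials:.1%}")
```

With a single test at the end, roughly 5% of these null experiments come out "significant," as intended; with repeated peeking, the false-positive rate climbs well above the nominal level.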
Researcher bias occurs when the researcher’s beliefs or expectations influence the research design or data collection process. Researcher bias can be deliberate (such as claiming that an intervention worked even if it didn’t) or unconscious (such as letting personal feelings, stereotypes, or assumptions influence research questions ).
The unconscious form of researcher bias is associated with the Pygmalion effect (or Rosenthal effect ), where the researcher’s high expectations (e.g., that patients assigned to a treatment group will succeed) lead to better performance and better outcomes.
Researcher bias is also sometimes called experimenter bias, but it applies to all types of investigative projects, rather than only to experimental designs .
For example, when interviewing young people about their drinking habits, neutral phrasing matters:
- Good question: What are your views on alcohol consumption among your peers?
- Bad question: Do you think it’s okay for young people to drink so much?
Response bias is a general term used to describe a number of different situations where respondents tend to provide inaccurate or false answers to self-report questions, such as those asked on surveys or in structured interviews .
This happens because when people are asked a question (e.g., during an interview ), they integrate multiple sources of information to generate their responses. Because of that, any aspect of a research study may potentially bias a respondent. Examples include the phrasing of questions in surveys, how participants perceive the researcher, or the desire of the participant to please the researcher and to provide socially desirable responses.
Response bias also occurs in experimental medical research. When outcomes are based on patients’ reports, a placebo effect can occur. Here, patients report an improvement despite having received a placebo, not an active medical treatment.
While interviewing a student, you ask them:
“Do you think it’s okay to cheat on an exam?”
Even students who have cheated are likely to say no, because admitting otherwise would cast them in a bad light.
Common types of response bias are:
- Acquiescence bias
- Demand characteristics
- Social desirability bias
- Courtesy bias
- Question order bias
- Extreme responding
Acquiescence bias is the tendency of respondents to agree with a statement when faced with binary response options like “agree/disagree,” “yes/no,” or “true/false.” Acquiescence is sometimes referred to as “yea-saying.”
This type of bias occurs either due to the participant’s personality (i.e., some people are more likely to agree with statements than disagree, regardless of their content) or because participants perceive the researcher as an expert and are more inclined to agree with the statements presented to them.
Q: Are you a social person?
- Agree
- Disagree
People who are inclined to agree with statements presented to them are at risk of selecting the first option, even if it isn’t fully supported by their lived experiences.
In order to control for acquiescence, consider tweaking your phrasing to encourage respondents to make a choice truly based on their preferences. Here’s an example:
Q: What would you prefer?
- A quiet night in
- A night out with friends
Demand characteristics are cues that could reveal the research agenda to participants, risking a change in their behaviors or views. Ensuring that participants are not aware of the research objectives is the best way to avoid this type of bias.
For example, suppose you interview patients several times after an operation intended to relieve pain. On each occasion, patients reported their pain as being less than prior to the operation. While at face value this seems to suggest that the operation does indeed lead to less pain, there is a demand characteristic at play. During the interviews, the researcher would unconsciously frown whenever patients reported more post-op pain. This increased the risk of patients figuring out that the researcher was hoping that the operation would have an advantageous effect.
Social desirability bias is the tendency of participants to give responses that they believe will be viewed favorably by the researcher or other participants. It often affects studies that focus on sensitive topics, such as alcohol consumption or sexual behavior.
You are conducting face-to-face semi-structured interviews with a number of employees from different departments. When asked whether they would be interested in a smoking cessation program, there was widespread enthusiasm for the idea. However, because smoking is a sensitive topic, some of that enthusiasm may reflect what employees believe is the socially acceptable answer rather than genuine interest.
Note that while social desirability and demand characteristics may sound similar, there is a key difference between them. Social desirability is about conforming to social norms, while demand characteristics revolve around the purpose of the research.
Courtesy bias stems from a reluctance to give negative feedback, so as to be polite to the person asking the question. Small-group interviewing where participants relate in some way to each other (e.g., a student, a teacher, and a dean) is especially prone to this type of bias.
Question order bias
Question order bias occurs when the order in which interview questions are asked influences the way the respondent interprets and evaluates them. This occurs especially when previous questions provide context for subsequent questions.
When answering subsequent questions, respondents may orient their answers to previous questions (called a halo effect ), which can lead to systematic distortion of the responses.
Extreme responding is the tendency of a respondent to answer in the extreme, choosing the lowest or highest response available, even if that is not their true opinion. Extreme responding is common in surveys using Likert scales , and it distorts people’s true attitudes and opinions.
Disposition towards the survey can be a source of extreme responding, as well as cultural components. For example, people coming from collectivist cultures tend to exhibit extreme responses in terms of agreement, while respondents indifferent to the questions asked may exhibit extreme responses in terms of disagreement.
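If you suspect extreme responding in Likert-scale data, one simple screening heuristic is the share of endpoint answers each respondent gives. A minimal sketch, assuming a 1-to-5 scale and invented respondents:

```python
def extreme_response_rate(responses, low=1, high=5):
    """Share of a respondent's Likert answers that sit at either endpoint."""
    return sum(r in (low, high) for r in responses) / len(responses)

# Invented respondents on a 1-5 scale.
respondents = {
    "r1": [5, 5, 1, 5, 1, 5],  # mostly endpoints: flag as possible extreme responder
    "r2": [3, 4, 2, 3, 4, 3],  # mid-scale answers
}
for rid, answers in respondents.items():
    print(rid, f"{extreme_response_rate(answers):.0%}")
```

A high endpoint share is only a flag, not proof of bias; some respondents may genuinely hold strong opinions.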
Selection bias is a general term describing situations where bias is introduced into the research from factors affecting the study population.
Common types of selection bias are:
- Sampling (or ascertainment) bias
- Attrition bias
- Self-selection (or volunteer) bias
- Survivorship bias
- Nonresponse bias
- Undercoverage bias
Sampling bias occurs when your sample (the individuals, groups, or data you obtain for your research) is selected in a way that is not representative of the population you are analyzing. Sampling bias threatens the external validity of your findings and influences the generalizability of your results.
The easiest way to prevent sampling bias is to use a probability sampling method . This way, each member of the population you are studying has an equal chance of being included in your sample.
Sampling bias is often referred to as ascertainment bias in the medical field.
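The difference between a convenience sample and a probability sample is easy to demonstrate with a toy simulation (every proportion below is invented for illustration): recruiting only people who happen to be near a gym overstates how common regular exercise is, while a simple random sample tracks the true rate.

```python
import random

rng = random.Random(7)

# Invented population: 20% of people live near the gym where we recruit,
# and they are far more likely to exercise regularly (70% vs. 20%).
population = []
for _ in range(100_000):
    near_gym = rng.random() < 0.2
    exercises = rng.random() < (0.7 if near_gym else 0.2)
    population.append({"near_gym": near_gym, "exercises": exercises})

def rate(sample):
    return sum(p["exercises"] for p in sample) / len(sample)

convenience = [p for p in population if p["near_gym"]][:1000]  # biased sampling frame
srs = rng.sample(population, 1000)                             # probability sample

print(f"true rate:          {rate(population):.2f}")   # about 0.30
print(f"convenience sample: {rate(convenience):.2f}")  # about 0.70: overestimate
print(f"simple random:      {rate(srs):.2f}")          # close to the truth
```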
Attrition bias occurs when participants who drop out of a study systematically differ from those who remain in the study. Attrition bias is especially problematic in randomized controlled trials for medical research because participants who do not like the experience or have unwanted side effects can drop out and affect your results.
You can minimize attrition bias by offering incentives for participants to complete the study (e.g., a gift card if they successfully attend every session). It’s also a good practice to recruit more participants than you need, or minimize the number of follow-up sessions or questions.
For example, suppose you are evaluating a new program. You provide a treatment group with weekly one-hour sessions over a two-month period, while a control group attends sessions on an unrelated topic. You complete five waves of data collection to compare outcomes: a pretest survey, three surveys during the program, and a posttest survey. If participants who find the sessions unhelpful drop out before the posttest, the remaining sample will overstate the program's success.
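A toy simulation shows how strong this distortion can be (the success and dropout rates below are invented):

```python
import random

rng = random.Random(3)

# Invented program: it truly benefits 40% of participants.
outcomes = [rng.random() < 0.4 for _ in range(10_000)]  # True = benefited

# Participants who are not benefiting drop out far more often
# before the posttest (60% vs. 10% dropout).
completers = [ok for ok in outcomes if rng.random() > (0.1 if ok else 0.6)]

print(f"true success rate:           {sum(outcomes) / len(outcomes):.0%}")      # ~40%
print(f"success rate of completers:  {sum(completers) / len(completers):.0%}")  # ~60%
```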
Self-selection or volunteer bias
Self-selection bias (also called volunteer bias ) occurs when individuals who volunteer for a study have particular characteristics that matter for the purposes of the study.
Volunteer bias leads to biased data, as the respondents who choose to participate will not represent your entire target population. You can avoid this type of bias by using random assignment —i.e., placing participants in a control group or a treatment group after they have volunteered to participate in the study.
Closely related to volunteer bias is nonresponse bias , which occurs when a research subject declines to participate in a particular study or drops out before the study’s completion.
For example, suppose you recruit volunteers for a health study through flyers posted at a hospital. Considering that the hospital is located in an affluent part of the city, volunteers are more likely to have a higher socioeconomic standing, higher education, and better nutrition than the general population.
Survivorship bias occurs when you do not evaluate your data set in its entirety: for example, by only analyzing the patients who survived a clinical trial.
This strongly increases the likelihood that you draw (incorrect) conclusions based upon those who have passed some sort of selection process—focusing on “survivors” and forgetting those who went through a similar process and did not survive.
Note that “survival” does not always mean that participants died! Rather, it signifies that participants did not successfully complete the intervention.
A classic example is citing famous entrepreneurs who dropped out of college and became billionaires as proof that a degree is unnecessary. However, most college dropouts do not become billionaires. In fact, there are many more aspiring entrepreneurs who dropped out of college to start companies and failed than succeeded.
Nonresponse bias occurs when those who do not respond to a survey or research project are different from those who do in ways that are critical to the goals of the research. This is very common in survey research, when participants are unable or unwilling to participate due to factors like lack of the necessary skills, lack of time, or guilt or shame related to the topic.
You can mitigate nonresponse bias by offering the survey in different formats (e.g., an online survey, but also a paper version sent via post), ensuring confidentiality, and sending reminders to complete the survey.
For example, suppose you survey a neighborhood by going door to door during the day. You notice that your surveys were conducted during business hours, when the working-age residents were less likely to be home, so their views go uncounted.
Undercoverage bias occurs when you only sample from a subset of the population you are interested in. Online surveys can be particularly susceptible to undercoverage bias. Despite being more cost-effective than other methods, they can introduce undercoverage bias as a result of excluding people who do not use the internet.
Cognitive bias refers to a set of predictable (i.e., nonrandom) errors in thinking that arise from our limited ability to process information objectively. Rather, our judgment is influenced by our values, memories, and other personal traits. These create “ mental shortcuts” that help us process information intuitively and decide faster. However, cognitive bias can also cause us to misunderstand or misinterpret situations, information, or other people.
Because of cognitive bias, people often perceive events to be more predictable after they happen.
Although there is no general agreement on how many types of cognitive bias exist, some common types are:
- Anchoring bias
- Framing effect
- Actor-observer bias
- Availability heuristic (or availability bias)
- Confirmation bias
- Halo effect
- The Baader-Meinhof phenomenon
Anchoring bias
Anchoring bias is people’s tendency to fixate on the first piece of information they receive, especially when it concerns numbers. This piece of information becomes a reference point or anchor. Because of that, people base all subsequent decisions on this anchor. For example, initial offers have a stronger influence on the outcome of negotiations than subsequent ones.
Framing effect
Framing effect refers to our tendency to decide based on how the information about the decision is presented to us. In other words, our response depends on whether the option is presented in a negative or positive light, e.g., gain or loss, reward or punishment, etc. This means that the same information can be more or less attractive depending on the wording or what features are highlighted.
Actor–observer bias
Actor–observer bias occurs when you attribute the behavior of others to internal factors, like skill or personality, but attribute your own behavior to external or situational factors.
In other words, when you are the actor in a situation, you are more likely to link events to external factors, such as your surroundings or environment. However, when you are observing the behavior of others, you are more likely to associate behavior with their personality, nature, or temperament.
One interviewee recalls a morning when it was raining heavily. They were rushing to drop off their kids at school in order to get to work on time. As they were driving down the highway, another car cut them off as they were trying to merge. They tell you how frustrated they felt and exclaim that the other driver must have been a very rude person.
At another point, the same interviewee recalls that they did something similar: accidentally cutting off another driver while trying to take the correct exit. However, this time, the interviewee claimed that they always drive very carefully, blaming their mistake on poor visibility due to the rain.
Availability heuristic
Availability heuristic (or availability bias) describes the tendency to evaluate a topic using the information we can quickly recall to mind, i.e., that is available to us. However, this is not necessarily the best information; rather, it is the most vivid or recent. Even so, due to this mental shortcut, we tend to think that what we can recall must be right and ignore other information.
Confirmation bias
Confirmation bias is the tendency to seek out information in a way that supports our existing beliefs while also rejecting any information that contradicts those beliefs. Confirmation bias is often unintentional but still results in skewed results and poor decision-making.
Let’s say you grew up with a parent in the military. Chances are that you have a lot of complex emotions around overseas deployments. This can lead you to over-emphasize findings that “prove” that your lived experience is the case for most families, neglecting other explanations and experiences.
Halo effect
The halo effect refers to situations whereby our general impression about a person, a brand, or a product is shaped by a single trait. It happens, for instance, when we automatically make positive assumptions about people based on something positive we notice, while in reality, we know little about them.
The Baader-Meinhof phenomenon
The Baader-Meinhof phenomenon (or frequency illusion) occurs when something that you recently learned seems to appear “everywhere” soon after it was first brought to your attention. However, this is not the case. What has increased is your awareness of something, such as a new word or an old song you never knew existed, not their frequency.
How to avoid bias in research
While very difficult to eliminate entirely, research bias can be mitigated through proper study design and implementation. Here are some tips to keep in mind as you get started.
- Clearly explain in your methodology section how your research design will help you meet the research objectives and why this is the most appropriate research design.
- In quantitative studies , make sure that you use probability sampling to select the participants. If you’re running an experiment, make sure you use random assignment to assign your control and treatment groups.
- Account for participants who withdraw or are lost to follow-up during the study. If they are withdrawing for a particular reason, it could bias your results. This applies especially to longer-term or longitudinal studies .
- Use triangulation to enhance the validity and credibility of your findings.
- Phrase your survey or interview questions in a neutral, non-judgmental tone. Be very careful that your questions do not steer your participants in any particular direction.
- Consider using a reflexive journal. Here, you can log the details of each interview , paying special attention to any influence you may have had on participants. You can include these in your final analysis.
Other types of research bias
- Baader–Meinhof phenomenon
- Sampling bias
- Ascertainment bias
- Self-selection bias
- Hawthorne effect
- Omitted variable bias
- Pygmalion effect
- Placebo effect
Frequently asked questions about research bias
Research bias affects the validity and reliability of your research findings, leading to false conclusions and a misinterpretation of the truth. This can have serious implications in areas like medical research where, for example, a new form of treatment may be evaluated.
Observer bias occurs when the researcher’s assumptions, views, or preconceptions influence what they see and record in a study, while actor–observer bias refers to situations where respondents attribute internal factors (e.g., bad character) to justify others’ behavior and external factors (difficult circumstances) to justify the same behavior in themselves.
Response bias is a general term used to describe a number of different conditions or factors that cue respondents to provide inaccurate or false answers during surveys or interviews. These factors range from the interviewer’s perceived social position or appearance to the phrasing of questions in surveys.
Nonresponse bias occurs when the people who complete a survey are different from those who did not, in ways that are relevant to the research topic. Nonresponse can happen because people are either not willing or not able to participate.
Understanding your biases
Two WashU researchers who conduct studies on bias and its impacts, Calvin Lai and Clara Wilkins, explain the roots and consequences of bias and how we can potentially reduce it.
If there is one thing you need to know about biases, it is that you have them.
When we see the word “bias” in the news, it is usually in connection with a terrible injustice, like someone being passed over for a job, or worse, targeted by law enforcement because of their gender, race, or nationality. We tend to think of people who behave in biased ways as bad people who take extreme actions to exclude others. No one wants to admit to being biased.
According to researchers in psychological and brain sciences, however, biases are often at least partly unconscious. Despite this, they profoundly impact the way we interact with the world and tend to perpetuate much of the inequality that exists in our society.
The basics: What is bias?
If we want to decrease harmful biases, we need to first understand what bias is. Clara Wilkins, assistant professor of psychological and brain sciences, says that when she teaches bias in the classroom, she breaks it down into three components that are often referred to as the “ABCs” of bias. The “A,” or affective component, is what we would call prejudice: negative feelings toward a person based on his or her group membership. The “C,” or cognitive component, is stereotypes: generalizations about a group. And the “B,” or behavioral component, is discrimination: the actual actions taken against a person based on their group membership. Wilkins’ Social Perception and Intergroup Attitudes Lab studies all of these components of bias.
Calvin Lai, assistant professor of psychological and brain sciences, says that although the bias we hear about in the news is usually harmful, bias itself is not always negative. He says that, “the way that psychological scientists define bias is just a tendency to respond one way compared to another when making some kind of a life choice.” Sometimes these biases can be completely neutral, like a bias for Coke over Pepsi, and can even be helpful in allowing you to make decisions more rapidly.
Not all biases are so harmless, however. As Lai notes, “Bias can often lead us in directions that we don’t expect, that we don’t intend, and that we might even disagree with if we knew that it was nudging us in a particular way.” These are the kinds of biases that can be harmful when people allow them to impact their behavior toward certain groups, and the biases that his Diversity Science Lab is attempting to redirect through their research.
Wilkins states that most people are hesitant to see themselves as participating in bias, but that we need to be aware that we can behave in harmful ways, even if we consciously support equality. She says, “Good people also exhibit bias. I think if we have this image of a racist person as a member of the KKK who does something really really violent, that is going to exclude a lot of acts that actually reinforce social inequality.” Understanding that even “good” people can be biased allows us to be more open to exploring our own biases and take actions to prevent acting on them.
Studying unconscious biases
Because so many people are reluctant to admit, and are often even unaware of, their biases, it is difficult for researchers to learn what biases the participants they are studying hold. To counter this problem, researchers have developed something called the Implicit Association Test.
Researchers first developed the Implicit Association Test (IAT) in 1998 to test people’s implicit biases by looking at how strongly they associate different concepts with different groups of people. Lai is the Director of Research at Project Implicit, a non-profit that uses online IATs both to collect research data and to help inform the general public about biases, and says that the vast amount of data collected through these tests over the last two decades has allowed researchers to track biases and see how certain demographic factors, including a person’s location, age, and race, can impact their biases.
IATs have consistently shown that people are faster to associate white people with good things and black people with bad things than vice versa, which demonstrates how these pairings are subconsciously linked together in their memories. Researchers have also developed different IATs to test for the associations participants make on the basis of gender, religion, weight, sexuality, age, and a host of other identity categories. If you want to see what one of these tests is like, you can take one yourself at Project Implicit .
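For a sense of how IAT responses are turned into a number, reaction-time data are commonly summarized with a D score: the difference in mean response latency between the two pairing conditions, scaled by the pooled variability of all responses. The sketch below is a deliberately simplified version with invented reaction times; the published scoring procedure also trims outliers and penalizes errors.

```python
from statistics import mean, stdev

def d_score(compatible_ms, incompatible_ms):
    """Simplified IAT D score: difference in mean latency between the two
    pairing conditions, divided by the standard deviation of all responses.
    (The published scoring also trims latencies and penalizes errors.)"""
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

# Invented reaction times in milliseconds for the two sorting conditions.
compatible = [620, 580, 640, 610, 590, 650, 600, 630]
incompatible = [720, 760, 700, 740, 780, 710, 730, 750]
print(f"D = {d_score(compatible, incompatible):.2f}")  # positive: faster in "compatible" blocks
```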
Consequences of bias
IATs have a lot to tell us about the possible prevalence and consequences of bias. Lai states that there is a correlation between how people perform on IATs and the way they behave toward different groups. He states, “We do find that these implicit biases do correlate with how people act, and they often do so over and above what people can report, what they can actually say about themselves.” He has worked with a number of different populations to help them understand their biases better, and how those biases can lead to certain groups being treated differently in healthcare, academia, and the job market.
Biases have the potential to do the most harm when they are acted on by people in positions of relative power, whether they be healthcare professionals, employers, or law enforcement officers. We have all heard in the news of how bias can even lead to deadly encounters in situations in which people have to make snap judgments about the risk a person poses. In one of his current projects, Lai is working in collaboration with the Anti-Defamation League to help police officers better understand their biases. His research will help determine if an educational workshop on biases can impact the way law enforcement interacts with different populations.
Wilkins also studies how bias manifests in groups with power differentials. She has found that groups that believe the current hierarchy is fair tend to double down on these beliefs and behave in a more discriminatory way when they feel that hierarchy is being threatened. In a recent study, Wilkins found that men who held status-legitimizing beliefs were more likely to penalize women when reviewing job resumes after being exposed to an article about men being discriminated against. This finding is particularly troubling because she has also found evidence that men have increasingly perceived men to be the victims of discrimination in recent years, which means that these reactionary behaviors against women might also be increasing.
"It is sort of ironic, your idea about fairness actually leads you to behave in an unfair way."
Wilkins explains status-legitimizing beliefs by saying, “Society is structured where some groups have better access to resources than others, so they have more income, wealth, power, etc. than other groups. Status-legitimizing ideologies are ideologies that make that inequality seem fair and legitimate.” She used the idea of meritocracy as an example of a status-legitimizing belief, saying that if people believe that hard work always results in success, they will be likely to see people who are struggling financially as simply not working hard enough and overlook structural inequalities in things like access to education and healthcare. She says, “It is sort of ironic, your idea about fairness actually leads you to behave in an unfair way.”
Wilkins says that opposition to affirmative action is an example of the way status-legitimizing beliefs can make it difficult for people to acknowledge structural inequalities like the ones that were illuminated with the recent admissions scandal involving wealthy parents. “There are a lot of people who are opposed to affirmative action because they think it disadvantages people who are not racial minorities, when there are other structural things like donations or legacy admissions or other things that aren’t based just on merit that disadvantage particular groups.”
Wilkins says that some of the backlash we witnessed after Obama’s presidency is a result of perceived threats to the status quo, which cause groups with power, like white men, to behave negatively toward groups they see as potentially disrupting traditional power dynamics. This reaction might be caused by zero-sum beliefs that see increases in rights or advantages for one group as automatically decreasing advantages for another. These beliefs are often so deeply held that people might not even consciously recognize that they have them, but they can significantly impact behavior.
Avoiding biased actions
So how do we avoid being biased? When it comes to changing your implicit unconscious biases, like the ones the IAT tests for, research has consistently shown that it is more difficult than you would think. Lai says, “It does seem that trying to change implicit bias right now directly, trying to reduce these associations that swirl around in our head, that seems really hard. They are built up over a lifetime of experience and it seems that if you can change them, it requires a lot of sustained effort.” In other words, a quick diversity training, while potentially helpful in getting people to start thinking about their biases, is not going to immediately change the way their brains associate white people with good things and black people with negative things.
Wilkins similarly says that she does not believe that progress toward a less biased world is linear. As her research has shown, the societal changes that we might see as progress are often accompanied by backlash when threats to the established order cause people to double down on their biases, whether consciously or unconsciously.
In spite of these somewhat bleak findings, however, both Lai and Wilkins are optimistic that there are things that we can do to reduce biased actions, even if we can’t completely eliminate biased thoughts. Recently, Wilkins has been researching ways to reduce people’s zero-sum beliefs. She is currently working on a study in which she has Christians read a Bible verse that promotes tolerance to illustrate that acceptance of different groups, like LGBTQ individuals, is not incompatible with Christian values. So far she has found that exposing participants to this verse decreases zero-sum beliefs and increases tolerance. Although she does not yet know how permanent these changes will prove to be, she is taking it as a hopeful sign.
For readers who want to behave with less bias, Lai and Wilkins both say that being aware of your bias is the first step. Lai argued for a systematic approach to tracking our own biases: “I think the big thing is, we’re all susceptible to bias. It’s really important to just keep records, track, in your own life where it might be happening, and then design good procedures or habits so you don’t act on them.” He says that there are simple things that people can do to avoid letting their biases influence decisions, like blanking out names on resumes so that an applicant’s gender or racial identity can’t influence hiring decisions.
Wilkins also says that we should be more aware of why we are drawn to certain people over others, and to go out of our way to avoid acting on these biases. “None of it is an easy solution,” she says, “I think it also requires a lot of motivation… It is not enough to not be racist or sexist. There need to be anti-racist and anti-sexist efforts because these behaviors are so entrenched in our society that it will be difficult to make real, sustained progress.”
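Lai’s suggestion of blanking out names on resumes is simple to put into practice. Here is a minimal sketch, assuming resumes arrive as plain records (the field names are hypothetical), that strips identifying fields before reviewers ever see them:

```python
import itertools

# Fields a reviewer should never see; everything else passes through unchanged.
REDACT_FIELDS = {"name", "email", "photo_url"}
_ids = itertools.count(1)

def blind(resume: dict) -> dict:
    """Return a copy of a resume record with identifying fields removed
    and a neutral candidate ID substituted for the name."""
    blinded = {k: v for k, v in resume.items() if k not in REDACT_FIELDS}
    blinded["candidate_id"] = f"C{next(_ids):04d}"
    return blinded

resume = {
    "name": "Jordan Smith",
    "email": "jordan@example.com",
    "experience": ["Lab manager, 3 years", "Research assistant, 2 years"],
    "skills": ["statistics", "grant writing"],
}
print(blind(resume))
```

Reviewers then score candidates by ID alone, and names are only re-attached after the rankings are final.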