Understanding Racial and Gender Bias in AI and How to Avoid It in Your Designs and Design Education

How biases are present in our design processes and design tools

Sarah Pagliaccio
Adjunct Professor
Lesley University
College of Art and Design
Brandeis University

We use artificial intelligence-enabled software every day when we post to social media, take pictures, and ask our phones for directions. But these apps are not designed to serve everyone equally. Institutional bias is present in the tools that we hold in our hands and place on our kitchen counters. Recent research on, and disclosures by, the tech giants have revealed that voice- and facial-recognition apps are optimized for white, male voices and faces, respectively. The algorithms were "taught" with historical training data that encodes the assumptions that women belong in the home and black men belong in jail. All of this leads to bias where we least want to see it: in courtrooms, in classrooms [1], in elections, in our social-media feeds, in our digital assistants, and in our design tools. (Meanwhile, women drive 75-95% of purchasing decisions in the US, and we are rapidly becoming a majority-minority nation.)
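One way to see how training data passes these associations along is to probe a pretrained word-embedding model for analogy completions. The sketch below is a minimal illustration, not part of this presentation: it assumes Python with gensim and its publicly hosted "word2vec-google-news-300" vectors, and it mirrors the kind of gender-stereotyped analogy documented by Bolukbasi et al. (2016).

    # A minimal sketch: probing pretrained word embeddings for
    # gender-stereotyped analogy completions. Assumes gensim and its
    # hosted "word2vec-google-news-300" vectors (a large, one-time
    # download of a model trained on historical news text).
    import gensim.downloader as api

    vectors = api.load("word2vec-google-news-300")

    # Analogy probe: "man is to <occupation> as woman is to ___?"
    # Embeddings trained on historical text often complete such
    # analogies with stereotyped occupations.
    for occupation in ["doctor", "programmer", "boss"]:
        completions = vectors.most_similar(
            positive=[occupation, "woman"], negative=["man"], topn=3
        )
        print(occupation, "->", [word for word, _ in completions])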

In this presentation, we will review some of the most egregious recent examples of AI-driven racism and sexism and look at some less well-known examples, including the changes to Twitter's image-cropping algorithm, which favored white faces over black faces; the MSN robot editor that confused the faces of mixed-race celebrities; AI assistants that screen job applications and immigration applications; voice-recognition apps in cars that don't understand women drivers; voice-activated apps that assist disabled veterans; and predictive-text software that could exacerbate hate speech. We will explore how these biases are present in our design processes and design tools, specifically those that use speech, image, and name generators.

Finally, we will review options for confronting these biases, such as taking the implicit-bias test, knowing the flaws in the underlying data sources we rely on, expanding our user research to include diverse audiences, and using text sentiment analysis to remove our own bias from interview scripts (sketched below), among other options, so that we do not perpetuate gender and racial bias in our design solutions and design education.
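As one concrete illustration of that last option, the sketch below runs interview questions through an off-the-shelf sentiment analyzer to flag emotionally loaded wording. It is a minimal example, assuming Python with NLTK's VADER lexicon; the question text is hypothetical, and a real workflow would pair a screen like this with human review.

    # A minimal sketch, assuming Python with NLTK's VADER sentiment
    # lexicon. The questions are hypothetical; the goal is to flag
    # interview wording that carries sentiment and could prime
    # participants toward a particular answer.
    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
    analyzer = SentimentIntensityAnalyzer()

    questions = [
        "How frustrating was the sign-up process for you?",    # leading
        "Describe your experience with the sign-up process.",  # neutral
    ]

    for question in questions:
        # The compound score ranges from -1 (very negative) to +1 (very
        # positive); a score far from 0 suggests the question itself
        # carries sentiment and may need rewording.
        compound = analyzer.polarity_scores(question)["compound"]
        flag = "REVIEW" if abs(compound) > 0.3 else "OK"
        print(f"{flag:6} {compound:+.2f}  {question}")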

Notes and References

  1. A group of 68 white elementary-school teachers listened to recordings of white and black students and rated them for personality, quality of response to a prompt, and current and future academic abilities. The white teachers consistently rated white students higher than black students, Standard English speakers higher than Black English speakers, and physically attractive students higher than less attractive ones. The researchers concluded that some of these children's academic failures might be based on their race and dialect rather than on their actual performance. (Indicating that no one is immune from cultural bias, the duo who performed this research labeled the dialect the white students spoke "Standard English" rather than "White English.") See DeMeis, Debra Kanai, and Ralph R. Turner, "Effects of Students' Race, Physical Attractiveness, and Dialect on Teachers' Evaluations," Contemporary Educational Psychology, Vol. 3, No. 1, January 1978.

This research was presented at the Design Incubation Colloquium 7.3: Florida Atlantic University on Saturday, April 10, 2021.