Matt from ESI Survival Guide and The Data Diva herself – Debbie Reynolds – discuss the discriminatory impact that many facial recognition technologies can have on people of color (and especially black women) due to the inherent racial and gender bias in their underlying algorithms.
Debbie begins by highlighting a paper co-authored by Joy Buolamwini, computer scientist, digital activist, PhD candidate at the MIT Media Lab, founder of the Algorithmic Justice League, and the force behind the documentary Coded Bias, now on Netflix; and Dr. Timnit Gebru, computer scientist, advocate for diversity in technology, and co-founder of Black in A.I.
Their groundbreaking paper, entitled Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, unpacks the bias present in automated facial analysis algorithms based on gender and skin color. You can read it here: http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf
The Gender Shades paper evaluates commercial facial analysis systems against a benchmark graded on a six-point skin-type scale, a narrow scheme that leaves little room for the full range of human skin tones. Ultimately, the error rate in misidentifying people of color, and black women in particular, is significantly higher than the error rate for white males. Debbie also expresses her concerns about technologies in the health care space that use facial recognition or skin color recognition to diagnose conditions such as skin cancer, and the potential for misdiagnosis.
We go on to discuss how problematic it is that the sale of such technology remains largely unregulated, as well as the potential for abuse or misuse in the law enforcement context.
The bias and error rates of these tools, combined with the lack of oversight over how such technology is onboarded and used by law enforcement, create a very troubling situation that must be addressed immediately. At the most basic level, the solution is twofold: (1) limit how the technology can be used, and (2) improve the technology itself.
CONSIDER AND COMMENT: In what other ways does technology impact issues of race and gender? Feel free to comment with your thoughts, ideas, and experiences.
#ESISurvivalGuide #TheDataDiva #DebbieReynolds #FacialRecognitionBias #AlgorithmicBias #Biometrics
For the full extended interview and more content visit www.esisurvivalguide.com.
Stay safe out there in the electronic wilderness!
DISCLAIMER: The information provided on this channel does not, and is not intended to, constitute legal advice; instead, all information, content, and materials available on this site are for general informational purposes only. Information on this channel may not constitute the most up-to-date legal or other information. This channel contains links to other third-party websites. Such links are only for the convenience of the reader, user or browser; ESI Survival Guide, its content providers, interviewers and interviewees, and any individual providing content on this site, do not recommend or endorse the contents of the third-party sites. For the full text of the disclaimer please click here – https://esisurvivalguide.com/disclaimer/