Object 2: The Amazon face recognition software

May 28, 2021


The “Amazon Rekognition” face recognition software is widely applied, for instance to identify a person, and Amazon has used similar AI to evaluate resumes. This software showed bias towards certain ethnicities and genders. The graphic shows the AI is most biased against darker-skinned women, identifying them with only 68.6% accuracy.

This object demonstrates how bias is inevitable in the production of knowledge, since the diversity of the data fed into an algorithm shapes the output it creates. Engineers at Amazon used AI to evaluate resumes; the system systematically penalized female applicants and preferred profiles resembling white men. This is due to Amazon’s lack of training data on Black and female profiles compared to that on white men, meaning the algorithm was better trained, and thus less biased, when evaluating white men’s resumes. Therefore, the production of biased knowledge from algorithms is inevitable when the underlying data lacks diversity.
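This mechanism can be illustrated with a small simulation. The sketch below is hypothetical and uses entirely synthetic data and scikit-learn’s LogisticRegression, not Amazon’s actual system or data; the two “groups,” sample sizes, and feature distributions are all invented for illustration. It trains one classifier on data dominated by a majority group and then measures accuracy separately for each group.

```python
# A minimal sketch (not Amazon's system): train a classifier on data
# where one group is heavily under-represented, then compare per-group
# accuracy. All numbers are synthetic and chosen only for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic 2-feature samples; the true label rule depends on a
    group-specific shift, so one model cannot fit both groups equally."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# The majority group dominates the training set (1900 vs 100 samples).
X_a, y_a = make_group(1900, shift=0.0)   # well-represented group
X_b, y_b = make_group(100,  shift=2.0)   # under-represented group

model = LogisticRegression().fit(np.vstack([X_a, X_b]),
                                 np.hstack([y_a, y_b]))

# Evaluate on fresh samples drawn from each group separately.
for name, shift in [("majority", 0.0), ("minority", 2.0)]:
    X_test, y_test = make_group(1000, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 3))
```

On a typical run of this sketch, the majority group is classified far more accurately than the under-represented one, which hovers near chance, mirroring the accuracy gap described above.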

In addition to the biased data, the creators of AI algorithms are often a homogeneous group themselves, and so they reflect their own biases when creating their programs. For instance, 70.7% of the senior leaders at Amazon are white, with an average salary of $25,081,600. This profile doesn’t match the average applicant; thus, their circumstances might not be taken into consideration by the algorithm due to the creators’ bias. The algorithm therefore creates biased knowledge by deciding whether an applicant has potential based on their similarity to the creators’ race, gender, and income. Algorithmic bias is inevitable because it reflects the creators’ perspective, which is implicitly biased.
