Technology in itself is, of course, not racist. It concerns the people who design, program, or use the technology. Are they racist? In many cases, yes.

As a result, technology can have negative consequences for people of a certain race. This can be the result of ignorance, conscious design choices, mistakes, or carelessness. Let's discuss some examples to paint a picture.

Note: the format of the examples below is arbitrary. We are not always sure whether the consequences of the technology have been consciously or unconsciously designed that way. But the intention is to give you an impression of the problem.

Mistakes

Below are some examples of technology designed without taking race into consideration. We start with an automatic soap dispenser that does not recognise black skin (video, 1 minute). Testing with a dark-skinned person was obviously not taken into consideration by the design team.

Next, the Nikon Coolpix S630, a camera that, when taking pictures of Asian faces, often asked: "Did someone blink?" Same type of problem.

The examples above are a bit older. The camera problem is ten years old, but the problem still exists. However, today it is harder to hide behind mistakes.

For example, facial recognition software is increasingly being used in law enforcement – another potential source of both race and gender bias. In February of 2020, Joy Buolamwini at the Massachusetts Institute of Technology found that three of the latest gender-recognition AIs, from IBM, Microsoft, and the Chinese company Megvii, could correctly identify a person's gender from a photograph 99 percent of the time – but only for white men. For dark-skinned women, accuracy dropped to just 35 percent. That increases the risk of false identification of women and minorities.