US government study finds racial, gender bias in facial recognition tools

A major U.S. government study of 189 widely used facial recognition algorithms found that they were substantially less accurate at identifying people of color than white people, with the worst performance on Black women. The study, by the National Institute of Standards and Technology, found that the algorithms falsely identified Black and Asian faces 10 to 100 times more often than Caucasian faces.

Skeptic
Wino-wisdom · 2 months

Normal people: Probably because their faces are darker and it's harder to pick up certain deep lines in the face correctly? Maybe some more R&D is in order to resolve the issue and make an even better product. Libtards: It's racist.

Maesterfully · 2 months

In other news, monkey wrenches will now be called people wrenches to avoid any racial overtones.

Rudolf Nechvile · 2 months

Instead of crying racism where there is none, have people considered that it's perhaps better not to be easily picked up by tools with such high authoritarian potential?
