Facial recognition tools misidentify people of color more often: US study

A US study found that facial-recognition programs misidentify people of color more frequently than white people. When performing a particular type of database search, many facial-recognition algorithms falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces. Such "demographic differentials", the study suggests, make minorities more vulnerable to false accusations.
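The "10 to 100 times" figure refers to false-match rates: how often an algorithm reports a match between two different people, broken down by demographic group. A minimal sketch of how such per-group rates could be tallied from labeled comparison trials (the data and group labels here are illustrative, not from the study):

```python
from collections import defaultdict

def false_match_rates(trials):
    """Per-group false-match rate from (group, same_person, matched) trials.

    A false match is when the algorithm reports a match for two
    images of *different* people (an "impostor" comparison).
    """
    counts = defaultdict(lambda: {"false": 0, "impostor": 0})
    for group, same_person, matched in trials:
        if not same_person:  # impostor comparison: any match here is false
            counts[group]["impostor"] += 1
            if matched:
                counts[group]["false"] += 1
    return {g: c["false"] / c["impostor"]
            for g, c in counts.items() if c["impostor"]}

# Illustrative data only: (demographic group, same person?, matched?)
trials = [
    ("A", False, True), ("A", False, False), ("A", False, False), ("A", False, False),
    ("B", False, True), ("B", False, True), ("B", False, True), ("B", False, False),
]
print(false_match_rates(trials))  # {'A': 0.25, 'B': 0.75}
```

Comparing these rates across groups is what produces the "demographic differential" the study describes: group B's false-match rate here is three times group A's.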

Met Man 3 months

No shit. White people are the most diverse on the planet. They have many colours of hair, eyes, skin tone, etc. Nearly all other races have black hair and brown eyes, which makes it much harder to differentiate. The stereotype that they all look alike has, like most stereotypes, a basis in truth.

Based Haole 3 months

Ai is rayciss

IvoryDove 3 months

"People with less contrasting colors" was to hard to say often, so they just went with "people of color".

Random Bit 3 months

Probably because old white male programmers programmed the AI logic and they never considered black people (or thought another team member would upload the prison photo database)? Yeah, I'm joking!

..... 3 months

It just proves that they really do all look alike!

Unkars Thug 3 months

Probably because, as they are the minority, it has seen fewer of them during training. Probably just needs more training.
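The commenter's hypothesis is that under-representation in the training set drives the error gap. One crude way to probe that is to check group proportions and rebalance by oversampling. A minimal sketch under that assumption (duplicating samples is illustrative only; real face datasets need genuinely new images, not copies):

```python
import random
from collections import Counter

def oversample_to_balance(samples, key=lambda s: s[0], seed=0):
    """Duplicate under-represented groups until each matches the largest group.

    `samples` is any list; `key` extracts the group label from a sample.
    """
    rng = random.Random(seed)
    groups = {}
    for s in samples:
        groups.setdefault(key(s), []).append(s)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Top up the group with randomly repeated samples.
        balanced.extend(rng.choices(members, k=target - len(members)))
    return balanced

# Hypothetical skewed training set: 8 samples of one group, 2 of another.
data = [("majority", i) for i in range(8)] + [("minority", i) for i in range(2)]
print(Counter(group for group, _ in oversample_to_balance(data)))
# Counter({'majority': 8, 'minority': 8})
```

Whether rebalancing alone closes the gap is an open question; the study attributes the differentials to the algorithms as tested, not to any single cause.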

Minor_Complex 3 months

The computers are racist, too.
