Now, facial recognition tools accused of racial bias

According to a US government study released on Thursday, many facial recognition systems misidentify people of color more often than white people. The study by the National Institute of Standards and Technology (NIST) found that, when conducting 'one-to-one' matching, many facial recognition algorithms falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces.
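The "10 to 100 times" figure concerns the false match rate in one-to-one verification: the fraction of comparisons between two different people that an algorithm wrongly accepts as the same person. The following is a minimal sketch of how such a disparity ratio could be computed; the group names, outcome data, and false_match_rate helper are illustrative assumptions, not taken from the NIST report.

    def false_match_rate(trials):
        """Fraction of impostor (different-person) comparisons that the
        matcher incorrectly accepted as a match."""
        if not trials:
            return 0.0
        return sum(1 for accepted in trials if accepted) / len(trials)

    # Hypothetical impostor-comparison outcomes per demographic group:
    # True means the algorithm wrongly declared a match (a false positive).
    impostor_results = {
        "group_a": [True, False, False, False, False] * 200,  # 200 / 1000 wrong
        "group_b": [True] + [False] * 499,                    # 1 / 500 wrong
    }

    fmr = {group: false_match_rate(r) for group, r in impostor_results.items()}
    baseline = fmr["group_b"]
    for group, rate in fmr.items():
        print(f"{group}: FMR = {rate:.4f}, ratio vs. baseline = {rate / baseline:.0f}x")

With these made-up numbers, group_a's false match rate comes out 100 times higher than group_b's, which is the kind of per-group ratio the study reports.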

MoralKombat
MoralKombat 2 months

Remember this next time someone claims a group of people look the same to them; even machines struggle with this.

Mutatis
Mutatis 2 months

I am not sure if 'bias' is the correct word to characterize this issue.

Paul C
Paul C 2 months

Ban racist computer algorithms!

Dave
Dave 2 months

damn white people and their *shuffles cards* facial recognition software.

Jack Donnager
Jack Donnager 2 months

Light and shadows are racist

Wholly
Wholly 2 months

Dark Privilege?

Leo Miggel
Leo Miggel 2 months

Shouldn't they be happy that only white people can be recognized by security systems? lol

Miles O'Brien
Miles O'Brien 2 months

This has always been an issue. False positives are in the 50% range for POC.

Daniel McEwen
Daniel McEwen 2 months

China surely perfected this in order for social credit to work. We obviously need to have more of what they have.
