Sunday, December 22, 2024

Facial Recognition Software Has A Bias Towards White Men

New research from MIT reinforces the idea that facial recognition software is subject to biases stemming from the data sets it is trained on and the way its algorithms are built.

The data set for this study was built by Joy Buolamwini, a researcher at the MIT Media Lab, who used 1,270 faces of politicians from around the world.

The results from the facial recognition software showed inaccuracies in gender identification that depended on the person’s skin color. According to The Verge, gender was misidentified in less than one percent of lighter-skinned males, in up to seven percent of lighter-skinned females, in up to 12 percent of darker-skinned males, and in up to 35 percent of darker-skinned females.

Buolamwini concluded that male subjects were more accurately classified than female ones, and that lighter-skinned subjects were more accurately classified than darker-skinned ones. The subjects who fared worst were darker-skinned females.

Because computer vision technology is being used in high-stakes sectors such as healthcare and law enforcement, more work needs to be done to benchmark vision algorithms across different demographic and phenotypic groups.
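For readers curious what that kind of disaggregated benchmarking might look like in practice, here is a minimal sketch. It assumes a labeled test set with per-subject gender and skin-type annotations and a hypothetical `classify_gender` function standing in for whatever classifier is under test; none of these names come from the study itself.

```python
from collections import defaultdict

def classify_gender(image_path):
    # Hypothetical stand-in for the gender classifier being evaluated;
    # plug in any model that returns "male" or "female" for an image.
    raise NotImplementedError("plug in the classifier under test")

def benchmark_by_group(samples):
    """Compute error rates disaggregated by (skin type, gender).

    `samples` is an iterable of dicts with keys:
      "image"     - path to the face image
      "gender"    - ground-truth label ("male" or "female")
      "skin_type" - e.g. "lighter" or "darker"
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for sample in samples:
        group = (sample["skin_type"], sample["gender"])
        totals[group] += 1
        if classify_gender(sample["image"]) != sample["gender"]:
            errors[group] += 1
    # Report each subgroup separately rather than a single aggregate
    # accuracy, which is what hides the disparities described above.
    return {group: errors[group] / totals[group] for group in totals}
```

Reporting a separate error rate per subgroup, rather than one overall accuracy figure, is what makes gaps like the ones found in this study visible in the first place.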

This is not the first time results like these have surfaced, highlighting the need for more diverse data sets and more diverse teams building these technologies, so there is less opportunity for such biases to sneak in. That is particularly important in law enforcement, where biased systems could result in the incrimination of innocent people.
