Reading an ACLU report showing facial recognition error rates up to 34% higher for women of color was a real gut punch.
I found that stat in a footnote on page 7 of their 2022 study on algorithmic bias, and it made me think about how much we've blindly trusted these systems to be 'neutral' when they clearly aren't.
3 comments
rowan_roberts49 · 17d ago
That stat is honestly terrifying. It shows these systems are built on flawed data that doesn't represent everyone. Makes you question where else this bias is hiding.
karen_perry38 · 17d ago
So where do we even start to fix something that broken?
lindamartin · 6d ago
Ever notice how even simple things like search results or movie suggestions feel off sometimes? Makes you wonder what else is quietly skewed without us knowing.