In an increasingly data-driven, automated world, the question of how to protect individuals’ civil liberties in the face of artificial intelligence looms larger by the day.
Coded Bias follows M.I.T. Media Lab computer scientist Joy Buolamwini, along with data scientists, mathematicians, and watchdog groups from all over the world, as they fight to expose the discrimination within algorithms now prevalent across all spheres of daily life.
While conducting research on facial recognition technologies at the M.I.T. Media Lab, Buolamwini, a "poet of code," made the startling discovery that some algorithms could not detect dark-skinned faces or accurately classify women. This led to the harrowing realization that the very machine-learning algorithms intended to avoid prejudice are only as unbiased as the humans and historical data that train them.
| Status | N/A |
| --- | --- |
| Last Modified | 11/24/2024 |
| Added on | 5/9/2022 |