Technological redlining: how algorithms are tearing communities apart.

Algorithms make important decisions for people every day, from choosing a place to eat to deciding who gets a job. But algorithms built on biased data can do more harm than good: they can discriminate against people from vulnerable communities.

Google’s algorithm labelled an African American person as “gorilla” (courtesy of Twitter: @jackyalcine)

Algorithmic bias in the criminal justice system

Big data companies and social systems use biased algorithms to make judgments about people’s lives. Skewed databases, which tend to favor dominant and affluent populations, “redline” vulnerable communities, especially women, people of color, and non-binary people.
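To make the mechanism concrete, here is a minimal, hypothetical Python sketch, not drawn from any real company’s system: every group name, number, and scoring rule below is invented. It shows how a model that simply learns from historically biased approval decisions ends up demanding more of one group than another, even when the two groups are statistically identical.

```python
# A hypothetical sketch (all names and numbers invented) of how a model
# trained on historically biased decisions reproduces that bias.

import random

random.seed(0)

def make_applicant(group):
    """Every applicant is drawn from the same skill distribution,
    regardless of group membership."""
    return {"group": group, "skill": random.gauss(50, 10)}

def historical_decision(applicant):
    """Past human decisions: group 'B' applicants needed a much higher
    skill score to be approved -- this is the bias baked into the data."""
    threshold = 45 if applicant["group"] == "A" else 60
    return applicant["skill"] > threshold

# Build a "training set" of past decisions.
history = [make_applicant(g) for g in ("A", "B") for _ in range(1000)]
for a in history:
    a["approved"] = historical_decision(a)

def learn_threshold(data, group):
    """A naive 'model': learn, per group, the lowest skill score that
    was ever approved in the historical data."""
    approved = [a["skill"] for a in data if a["group"] == group and a["approved"]]
    return min(approved)

learned = {g: learn_threshold(history, g) for g in ("A", "B")}

# The learned model now demands more of group B than of group A,
# even though skill is identically distributed in both groups.
for g, t in learned.items():
    rate = sum(a["skill"] > t for a in history if a["group"] == g) / 1000
    print(f"group {g}: learned threshold ~{t:.1f}, approval rate ~{rate:.0%}")
```

The point of the sketch is that the “model” never looks at group membership directly; the discrimination is inherited entirely from the skewed historical labels it learns from.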

Technological redlining

A historical map of redlining in New York City (courtesy of The New York Times)

How can we build better systems?

To address the digital divide, tech companies have to re-evaluate their hiring processes and include more people of color, women, and non-binary people in the construction of big data systems.

