Technological redlining: how algorithms are tearing communities apart.

Anh Nguyen
Nov 21, 2019

Algorithms make important decisions for people every day, from choosing where to eat to deciding who gets a job. But algorithms built on biased data can do more harm than good: they can discriminate against people from vulnerable communities.

In 2015, an African American programmer found that Google Photos’ image-recognition software labeled him and his friend, who is also black, as “gorillas.” During Obama’s presidency, Google Maps searches containing the n-word returned the White House in the D.C. area.

Google’s algorithm labeled an African American person as a “gorilla”
Courtesy of Twitter: @jackyalcine

In both incidents, a Google spokesperson apologized and said the company would take “immediate action” to fix the mistakes; still, the search engine has continued to come under fire over reports of racist and sexist results targeting marginalized communities.

“What we’re seeing is the almost inevitable consequence of an ‘arms race’ in data,” iSchool professor Anna Lauren Hoffmann told me in an interview. “We have a set of policy arrangements, cultural norms, and ideas that allow us to think that data is valuable. And because data is valuable, people are going to exploit.”

People without technical literacy tend to accept engineers’ explanations of “bugs” and “glitches” for most of technology’s problems, but much of digital discrimination is the work of the people who build their biases into code.

“The math-powered applications powering the data economy were based on choices made by fallible human beings,” Cathy O’Neil explained in her book Weapons of Math Destruction. “They tend to punish the poor and the oppressed in our society, while making the rich richer.”

Tech companies sort users into categories, usually by race, gender, and socioeconomic status, to infer what those users want and what they are likely to do. But this practice, known as “social profiling,” is not the neutral mathematical exercise most people assume it to be. The people who code these systems carry all kinds of human biases that shape their algorithms.

“The primary creators in Silicon Valley of the technologies are white and Asian men, but mostly white men,” Safiya Noble said in an interview. “They make the technology in their image and in their interests.”

The prevalence of “brogrammer” culture has its roots in university computer departments, where the majority of STEM students are white and Asian men. This masculine stereotype discourages female students from considering the major because they don’t feel a sense of belonging in the first place, The Daily reported.

“A lot of toxic behaviors are so common in spaces with predominantly white males that they become normalized,” computer science junior Linda Vong said in an interview with me. “They foster certain beliefs about who can hold knowledge, who can belong, and who cannot.”

Vong also believes that computer departments could address workplace inequality by incorporating discussions of their lack of diversity and inclusive culture into the curriculum.

Algorithmic bias in the criminal justice system

Big data companies and social systems are using biased algorithms to make judgments about people’s lives. Skewed databases, which tend to favor dominant and affluent populations, “redline” vulnerable communities, especially women, people of color, and non-binary people.

Idemia, a French multinational, uses facial recognition software to scan millions of faces in the United States. One of the company’s 3D high-speed face-capture systems, Mface, has been used at Florida ports to scan cruise passengers entering the United States.

However, a study by the National Institute of Standards and Technology (NIST) that assessed Idemia’s algorithms found they falsely matched black women’s faces at a rate of one in 1,000, compared with one in 10,000 for white women. In other words, a black woman is ten times more likely to be mistaken for another black woman than a white woman is to be mistaken for another white woman.

The criminal justice system is also using this immature technology in risk assessments that predict whether a defendant awaiting trial is likely to commit another crime or fail to show up at a hearing.

However, the big data that feeds criminal justice algorithms is itself flawed.

“Black people are more likely to be over-assessed as risky, and white people are under-assessed as risky,” Shankar Narayan, director of the Technology and Liberty Project at the American Civil Liberties Union of Washington, said at the 2019 Global Challenges lecture at the University of Washington, which I attended.

Technological redlining

A historical map of redlining in New York City
Courtesy of The New York Times

“Technological redlining” is the perpetuation of racial, cultural, and economic inequities through technology. The concept of “redlining” stems from a history of housing discrimination dating to the 1930s, when red lines were literally drawn on maps to segregate poor and predominantly black neighborhoods.

Today, redlining happens virtually, inside our devices, and shapes the decisions made about us. Banks offer high-cost mortgages to people living in minority zip codes; internet service providers decline to bring high-speed service to low-income communities; advertisers target housing and employment ads based on race, gender, and geography.
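
To see how subtle this can be, here is a minimal, hypothetical Python sketch; the zip codes, “risk scores,” and pricing rule are all invented for illustration and model no real lender. The point is that code which never mentions race can still reproduce redlining once it keys off zip codes, because zip codes carry the history of segregation described above.

```python
# Hypothetical loan-pricing sketch: the rule below never mentions race,
# but because it keys off zip code, and zip codes reflect decades of
# segregated housing, it quotes higher rates to historically redlined
# neighborhoods. All numbers are invented for illustration.

HISTORICAL_RISK_BY_ZIP = {
    "98105": 0.02,  # affluent, historically non-redlined area (hypothetical)
    "98118": 0.08,  # historically redlined neighborhood (hypothetical)
}

BASE_RATE = 0.04  # baseline mortgage interest rate


def quote_mortgage_rate(zip_code: str) -> float:
    """Quote an interest rate using zip-code-level history as a proxy for 'risk'."""
    risk = HISTORICAL_RISK_BY_ZIP.get(zip_code, 0.05)
    # The surcharge looks like a neutral mathematical adjustment...
    return BASE_RATE + risk


if __name__ == "__main__":
    for zip_code in HISTORICAL_RISK_BY_ZIP:
        print(zip_code, f"{quote_mortgage_rate(zip_code):.2%}")
    # ...but applicants from the historically redlined zip code are quoted
    # a higher rate regardless of their individual circumstances.
```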

According to Narayan, technological redlining stems from the gap between two groups: the people who design the technology and the people who are affected by it without “being in the room.” This unequal power dynamic between creators and users silences marginalized voices, because the technology is not built to serve their needs.

“Those choices are built on assumptions and prejudices about people, intimately weaving them into process and results that reinforce biases and, worse, make them seem natural or given,” Hoffmann wrote in an article.

How can we build better systems?

To address the digital divide, tech companies have to re-evaluate their hiring processes to bring more people of color, women, and non-binary people into the construction of big data systems.

An infographic by Women In Tech states that women hold only 25% of IT jobs and only 11% of executive positions at Fortune 500 companies. Among start-ups, female founders receive just 2.2% of the $130 billion in venture capital (VC) funding.

In contrast, white men hold the majority of IT roles and dominate college computer departments. Without diversity in computing education, the same male engineers build technology from the vantage point of the dominant race and gender, turning a blind eye to the oppressive experiences of other identities.

“They’re not thinking about women,” Noble said. “Silicon Valley is rife with all kinds of deeply structural problems and inequalities with respect to women.”

However, Hoffmann argues that since online inequalities are a result of offline inequities, even a fair algorithmic system in an unequal world can only produce unequal outcomes.

“There is a lot of research that tries to make algorithms fairer, like more inclusive training data for machine-learning algorithms,” Hoffmann said. “All of these solutions ignore the fact that these algorithms are released into a world already marked by inequality.”
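
To make the mechanics of that research concrete, here is a toy Python simulation; every number in it is invented, and it models no real face recognition system. It shows the kind of gap that unrepresentative training data can create, the same shape as the NIST disparity above: a matching threshold tuned on data dominated by one group yields a far higher false-match rate for the underrepresented group.

```python
# Toy simulation (all numbers invented) of disparate false-match rates.
import numpy as np

rng = np.random.default_rng(42)

# Simulated similarity scores for "impostor" pairs (two different people).
# Assumption: the model separates group B less well, so its impostor scores
# run higher, the kind of gap unrepresentative training data can cause.
impostors_a = rng.normal(loc=0.20, scale=0.10, size=200_000)  # ~95% of tuning data
impostors_b = rng.normal(loc=0.30, scale=0.10, size=10_000)   #  ~5% of tuning data

# The operating threshold is tuned for an overall 0.1% false-match rate
# on the pooled, group-A-dominated data.
pooled = np.concatenate([impostors_a, impostors_b])
threshold = np.quantile(pooled, 0.999)

for name, scores in [("group A", impostors_a), ("group B", impostors_b)]:
    false_match_rate = np.mean(scores > threshold)
    print(f"{name}: false-match rate ≈ {false_match_rate:.3%}")
# Group B ends up with a false-match rate many times higher than group A,
# even though the threshold itself contains no reference to group at all.
```

A more inclusive tuning set would narrow that gap, which is what the research Hoffmann mentions aims to do; her point is that even then, the system still lands in a world where the underlying inequities remain.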

In her book Algorithms of Oppression, Noble identifies three critical aspects of technological redlining that need to be addressed: lack of access to computers and software, lack of training in computer technologies, and unequal access to Internet connectivity.

The author also urges a diverse collaboration of social scientists, technologists, and policymakers to regulate automated decision-making systems.

On this point, Hoffmann agrees.

“We need empathy and thoughtfulness in the design of algorithms and data science if we are to change the damaging cultural narratives that reinforce injustice and inequality for vulnerable people,” Hoffmann wrote.
