Your Data Is Discriminating...Against You
For some, privacy infringement doesn’t just mean annoying ads; it could mean being denied a job or housing. Prachi Gupta investigates big data’s big problem.
In 2016, Carmen Arroyo’s 22-year-old son, Mikhail, regained consciousness from a six-month coma. He had suffered a severe electrical shock while atop an electrical pole and had fallen nearly 30 feet, leaving him unable to walk, speak, or take care of himself. Arroyo, then 44, filed an application with her landlord requesting permission for her son to move in with her at her apartment in Willimantic, Connecticut. According to court records, the application was quickly denied without explanation, and Mikhail was sent to a rehabilitation facility, where he would remain for more than a year while his mother searched for a reason why.
Arroyo contacted the Connecticut Fair Housing Center (CFHC), a nonprofit that provides free legal services to alleged victims of housing discrimination. In the process of filing a complaint against the landlord, Arroyo and her lawyers discovered that the landlord didn’t know why the application was denied either; the decision hadn’t been made by him but by an algorithm used by CoreLogic, a software company he had enlisted to screen potential tenants. After Arroyo filed her complaint, the landlord allowed Mikhail to move in with his mother. Arroyo’s lawyers kept digging and ultimately determined what caused the rejection: a citation for shoplifting from 2014 (which had since been withdrawn), according to court documents. “He was blacklisted from housing, despite the fact that he is so severely disabled now and is incapable of committing any crime,” says Salmun Kazerounian, a staff attorney from CFHC who represents Arroyo.
What happened to the Arroyo family is just one example of data leading to discrimination. Automated data systems—technology like CoreLogic’s—use collected intel (public data, such as DMV and court records, that may also be packaged with information scraped from the Internet, like social-media activity) to make life-altering decisions, including whether applicants get jobs, the cost of their insurance, or how a community is policed. In theory, these systems are built to eliminate bias present in human decision-making. In reality, they can fuel it.
That is in part because algorithms are built on biased data and often don’t consider other relevant factors. Because low-income people have more contact with government agencies (for benefits like Medicaid), a disproportionate amount of their info feeds these systems. Not only can this data fall into corporate hands, but the government itself uses it to surveil. For example, when UC Berkeley law professor Khiara Bridges interviewed pregnant women applying for prenatal care under Medicaid, she found that they had to reveal their sexual histories and incidences of domestic violence—details that can then be shared with other public agencies. “I talked to pregnant women who came to the clinic just to get prenatal care, and then the next day they would get a call from Child Protective Services,” Bridges says. When people seek support from the state, “that can end up penalizing them later,” adds University of Baltimore law professor Michele E. Gilman. A person applying for a public benefit can be flagged as a risk, which limits future housing or employment opportunities. People who never need to apply for public benefits are exempt from these injustices.
The problem is pervasive, invisible, and cyclical: Biased data is used to justify surveillance, creating an endless feedback loop of discrimination. In 2012, the Chicago Police Department began using predictive analytics, reliant mostly on arrest-record data, to increase surveillance on certain individuals it considered more likely to commit, or be victims of, gun violence. The program was shelved in 2019 after findings showed it was ineffective at reducing homicides. Civil-rights groups said it perpetuated racial bias. “The algorithm is developed from what you give it,” says Brandi Collins-Dexter, senior campaign director for racial-justice-advocacy group Color of Change. “If you give it trash, it’s going to give you trash.” Feed an algorithm biased information and it will enable future bias.
This reality is the crux of the Arroyo case, the first of its kind: Mikhail, who is Latino, is one of the nearly one third of working-age Americans who have a criminal record, a disproportionate number of whom are Black or Latinx. His lawyers are suing CoreLogic, arguing that, under the guise of neutrality and efficiency, its software reinforces discriminatory policies. (The lawsuit is still pending, and CoreLogic has denied any wrongdoing.) If Arroyo wins, it will be a small step forward. But, unless the U.S. adopts stronger data-privacy legislation, these life-altering structures will go largely unchecked, as they are both “powerful and invisible,” according to Jennifer Lee of the ACLU of Washington. Her organization is just one of many pushing for such regulations. Until that happens, we will continue to be watched by discriminatory systems unseen—and minority groups will feel those eyes most of all.
Algorithms can use data to make decisions about a specific person, like Arroyo, but some use it to make inferences about entire groups. This can lead to sweeping, discriminatory generalizations: Facebook has come under fire for showing different housing and job ads to different users based on race and gender; in 2018, it came to light that Amazon had scrapped an automated hiring tool after discovering it favored male candidates over female ones. In these instances, bias was baked into decision-making tools. Even if the data is not specifically about race or gender, data points—like a person’s music interests or zip code—can become “proxies” for those traits, according to Collins-Dexter.
This story originally appeared in the Fall 2020 issue of Marie Claire.
Prachi Gupta is an award-winning journalist based in New York. Her first book, about Rep. Alexandria Ocasio-Cortez, is available via Workman Publishing.