Mass Incarceration ❤️ Technology

by Jenn Schiffer

Preface

Early in 2018 I got involved with the Women's Prison Association, a really great organization based in NYC, as an Emerging Philanthropist - which is, in so many words, their junior board. We've been meeting monthly to organize events that raise funds to help women of NYC and their families who are touched in some way by the criminal "justice" system (come to our fundraising gala tomorrow (6/20) and/or donate pls!!)

There are many reasons why this organization means so much to me, especially as both a prison abolitionist and a technologist who is seeing code already being used to further oppress marginalized communities. It's very clear that people of color are by far the most harmed by mass incarceration, and it's a vicious white supremacist cycle that I see no end to unless we dismantle the systems - not digitize them.

Anyway, I can speak more to my feelings on this, but today I wanted to share a talk I gave to my fellow EPs a couple of months ago at one of our monthly meetings. I recently saw some of my peers in the tech industry sharing frightening articles about police "starting" to use artificial intelligence, and I feel it is my duty to share this talk now because it's not something they're starting to do - they've *been* doing it. I encourage you to not only read the articles included, but to look further into the authors' work, because their continued coverage and exposure of this data is key to helping us combat these issues.

This was a 10-15 minute slide-driven talk to women with all different kinds of backgrounds - most of them not in tech, a few lawyers, and many working day to day on the efforts of helping the women of NYC stay out of prison and re-enter society. That's why this talk is NYC focused and not "for software engineers" - but it's very likely that your city/state is practicing the same digital oppression on one of the largest incarcerated populations in the world, and my fellow software engineers can learn a lot about how the code we write can harm people and destroy what justice *should* be in our society.

The Talk

slide 1

Mass Incarceration ❤️ Technology.
Jenn Schiffer, Director of Community Engineering, Glitch.com

slide 2

I build software.

My day job is leading a team of engineers and designers building software, along with support and community health folks helping to maintain the safety and success of our community of people building software.

slide 3

Software is a set of instructions that tell a computer how to work.

slide 4

Software is used by all kinds of users to solve problems that are typically slow to solve without software.

slide 5

Software users include internet surfers, online shoppers, smartphone owners, drivers, taxi riders, kitchen appliance owners, libraries, government, schools...

slide 6

...and the Criminal Justice System.

slide 7

How the system is using technology - spoiler alert: it's not great!!

The cycle of incarceration I'll go through when talking about how technology is used is:

  • Before prison
  • Inside prison
  • After prison

slide 9

Before prison

So before someone ends up incarcerated, they are typically touched by databases and/or algorithms which may push them into the "inside prison" part of the cycle.

slide 10

Databases

What are they? Databases are sets of data saved on a computer and structured in a way that the data can be easily accessed and used in software. For example, a search engine is software that lets you enter a search query and returns results from its database of files that match your search query.

Are they good? Databases can be very helpful in organizing data, but only if that data is correct. Imagine: searching a recipe database for recipes that are vegetarian, and all of the results have meat in the ingredients but the database still labels them as vegetarian. What the hell!!
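
To make that concrete for the software folks reading this write-up: here's a minimal, made-up sketch of that recipe database using Python's built-in sqlite3 module. The table, the recipes, and the mislabeled row are all invented for illustration - the point is that the query faithfully returns whatever the data says, whether or not the label is true.

  import sqlite3

  # A tiny in-memory "recipe database" - every row here is made up for illustration.
  db = sqlite3.connect(":memory:")
  db.execute("CREATE TABLE recipes (name TEXT, ingredients TEXT, is_vegetarian INTEGER)")
  db.executemany(
      "INSERT INTO recipes VALUES (?, ?, ?)",
      [
          ("lentil soup", "lentils, carrots, onion", 1),
          ("beef chili", "ground beef, beans, tomato", 1),  # mislabeled as vegetarian!
      ],
  )

  # The query does exactly what it's told: it trusts the label, not the ingredients.
  for name, ingredients in db.execute(
      "SELECT name, ingredients FROM recipes WHERE is_vegetarian = 1"
  ):
      print(name, "->", ingredients)  # "beef chili" shows up as vegetarian. What the hell!!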

slide 11

Databases

Article in slide: "Like Chicago Police, Cook County and Illinois Officials Track Thousands of People in Gang Databases", Propublica Illinois

Police departments all over the country have been creating and "maintaining" databases of people they "deem" to be gang members, and I put "maintaining" in quotes because it's impossible to know how correct that data is – information about what exists in the database and how that data got there is typically kept from the public and from those inside the database. And that data is likely wrong: for example, in the Chicago gang database mentioned in that ProPublica article, two people in the database were listed as 132 years old. Hmmmmmmm....
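
For the engineers reading along, here's a hypothetical sketch of the kind of sanity check that apparently isn't happening. The records and field names below are invented, but even a few lines of Python would flag a 132-year-old "gang member" before that record gets used against anyone.

  from datetime import date

  # Invented records, loosely modeled on what a gang database might hold.
  records = [
      {"name": "Person A", "birth_year": 1991},
      {"name": "Person B", "birth_year": 1886},  # would have made this person 132 in 2018
  ]

  def flag_impossible_ages(records, max_age=110):
      """Return records whose implied age is implausible - the bare minimum of data hygiene."""
      this_year = date.today().year
      return [r for r in records if this_year - r["birth_year"] > max_age]

  for bad in flag_impossible_ages(records):
      print("Suspicious record, needs human review:", bad)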

slide 12

Algorithms

What now? Algorithms are logic written into software, typically to make predictions. For example, a social media site usually tracks how you interact with friends and content on the site and feeds that into an algorithm which sorts the order in which content shows on your timeline - with the intention of keeping you on the site longer and interacting with it more.

Are they good? Much like databases, algorithms are only as good as what is put into them, and data tends to be incorrect and biased when it comes from a broken system like mass incarceration. For example, imagine a company deciding what's for lunch based on an algorithm using past decisions made when everyone was a meat-eater, even though starting today everyone is a vegetarian.
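
Here's a toy, hypothetical version of that lunch algorithm in Python - the data and function name are invented. It "predicts" by picking the most common past order, so if every past order was placed when the office was full of meat-eaters, the recommendation stays meat no matter how many people went vegetarian today. The logic only ever sees yesterday's decisions.

  from collections import Counter

  # Historical lunch orders, all placed back when everyone ate meat (invented data).
  past_orders = ["burgers", "burgers", "fried chicken", "pulled pork", "burgers"]

  def recommend_lunch(history):
      """Recommend whatever was ordered most often in the past."""
      return Counter(history).most_common(1)[0][0]

  # Everyone is vegetarian as of today, but the algorithm has no way of knowing that.
  print(recommend_lunch(past_orders))  # -> "burgers"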

slide 13

See, here is the problem with algorithms...

Here's the part of the talk (and most of my talks lately, regardless of subject matter) where I reiterate that, living in a white supremacist and misogynist society, most (if not all) of our data is white supremacist and misogynist. By feeding that data into machine learning (ML) and creating artificial intelligence (AI), we're just perpetuating those systemic issues and, when the AI turns out racist, passing the blame onto the "machines" when really it's a societal problem that we (i.e. white people, who benefit whether we "like it" or not from white supremacy) need to be held accountable for. Computers are supposed to solve problems *faster* for us, not with more empathy. They (like many folks programming them) don't have empathy, which is one of the biggest issues I have with using algorithms to make decisions like assigning parole and sentencing.

slide 14

Algorithms

Articles in slide:

Given what I stated before about bad data leading to bad AI: that's now happening in the criminal justice system, and it's perpetuating bias against black people when it comes to predicting whether a person is likely to commit a crime again. On top of that, if you challenge the use of algorithms like COMPAS to find out how and why a decision that will change your life was made, as Eric Loomis did, even the Supreme Court will likely reject or decline to hear your case.

To quote Ellora Thadaney Israni in the above-mentioned op-ed, "Why are we allowing a computer program, into which no one in the criminal justice system has any insight, to play a role in sending a man to prison?"
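
To be clear, COMPAS is proprietary and nobody outside the company gets to see how it works - which is exactly the problem. But here's a made-up sketch (not COMPAS, just an illustration) of how any risk score built on past arrest data inherits the bias baked into that data: prior arrests reflect where police chose to patrol, so people from over-policed neighborhoods carry a higher "risk" forever, regardless of what they actually did.

  # A made-up "risk score" - not COMPAS, whose inner workings are secret.
  # It weights prior arrests heavily, and prior arrests reflect where police
  # chose to patrol, not just what people actually did.
  def risk_score(prior_arrests, age):
      score = 2 * prior_arrests
      if age < 25:
          score += 3  # a youth penalty, another common proxy
      return score

  # Two people with similar behavior; one grew up in an over-policed neighborhood
  # and picked up arrests for things the other was never even stopped for.
  print(risk_score(prior_arrests=4, age=24))  # -> 11, labeled "high risk"
  print(risk_score(prior_arrests=0, age=24))  # -> 3, labeled "low risk"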

slide 15

Bonus Bias™: Facial Recognition

Article in slide: "Florida Is Using Facial Recognition to Convict People Without Giving Them a Chance to Challenge the Tech", ACLU

This is what's really scary - our inability to challenge the technology. Every day we deal with shitty software bugs, our digital assistants not hearing exactly what we're saying, not detecting our faces, etc. Technology, built by flawed humans, is inherently flawed, so naturally this is going to lead to imprisoning innocent people - especially those with darker skin. I strongly believe the people deciding to use this software know what they're doing, and it's not playing fair or promoting safety - it's perpetuating white supremacy.
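
Here's a back-of-the-envelope sketch with entirely invented numbers, just to show why "the software said it's a match" isn't good enough: even a small difference in false-match rates between lighter and darker faces, multiplied across a big mugshot database, means far more innocent people from one group getting flagged per search.

  # Invented numbers: per-comparison false-match rates and a database of 100,000 photos.
  database_size = 100_000
  false_match_rate = {"lighter-skinned faces": 0.00001, "darker-skinned faces": 0.0001}

  for group, rate in false_match_rate.items():
      expected_false_matches = rate * database_size
      print(f"{group}: ~{expected_false_matches:.0f} innocent people flagged per search")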

slide 16

Inside prison

Once you're in prison, technology is being used to control your access to entertainment and communication, and to erode your privacy through data collection.

slide 17

Using technology to control and profit

What? Companies are making money creating and selling software and services to prisons that further oppress their populations. Consider this: phone calls in prison are already limited and expensive - how can prisons possibly make it even harder for their populations to communicate and learn while incarcerated?

Why? Jail personnel say it's for "safety" when actually it's to line their wallets and make their jobs easier at the cost of inmates' emotional and psychological well-being. Imagine: being incarcerated and being told that your free in-person visits from loved ones will now be replaced with $13 video calls so prisons can save money.

It needs to be said that most contraband in prisons comes from prison staff, which is why I have a very hard time believing this is being done for safety instead of money.

slide 18

Using technology to control and profit

Articles in slide:

There's also something to be said about how these decisions to replace tangible things (human contact, affordable reading materials) with often flawed digital versions are not just financial issues - they're human rights and censorship issues. We need to advocate more for fighting those violations for *everyone*, not just those of us outside prison.

slide 19

Using technology to control and profit

Article in slide: "Prisons Across the U.S. Are Quietly Building Databases Of Incarcerated People's Voice Prints", The Intercept + The Appeal

There's nothing keeping companies and prisons from using data forcibly collected from prisoners to oppress the rest of us, either by using that data in software to surveil the general population or by collecting our voice prints when we make phone calls to those in prison. We already know, now, that these databases are flawed and are designed to pull people into the system and keep them there.

slide 20

After prison

Basically when you think you're out of the system, the vicious technology cycle continues in order to keep you in the system.

slide 21

So when will NY prisons start using this tech?

They........already are, and have been.
  • COMPAS is used by the parole board in NY, and it is *unknown* if the state has conducted any validity studies.
  • That story about voice print collection? It starts with an inmate in Sing Sing prison.
  • 11 days after trying a "more controlled inmate package program," Governor Cuomo directed the Department of Corrections to stop the pilot program. I encourage y'all to donate via NYC Books Through Bars!!
  • Huge shouts out to Legal Aid Society in helping folks file FOIL requests to see if they are in the NYPD gang database.

slide 22

Oh, and video calls?

Well I searched JPay and found our own Sing Sing Correctional Facility. Note that since I took the screenshot in that slide, the prices have changed. The rates still don't make sense, but they did update the phone number to one that looks realer than (999)999-9999.

slide 23

The future can be less scary than the present is.

It's going to take awareness, money, and politics, though.

slide 24

Use your wallet, vote and voice to hold technology companies and their customers (our politicians and prisons) accountable.

For every cool app you download or new device that you buy, do research and make a connection to how that or similar technology is being used to harm marginalized communities.

Amplify these stories!!

For my fellow software developers and others within the tech industry - we need to keep ourselves and our leadership honest about how the tools we're building can and will be used against people in a society that promotes mass incarceration!

Afterword

In closing, I want to ask my fellow engineers, who have been asking to see the slides of this talk, to think about this: it's very easy to work on cutting-edge tools and make art with them, or to work in the R&D labs of big companies and think we're just making art and trying out cool stuff, but at the end of the day are we really just helping those companies prepare for big-money government contracts?

There's definitely privilege to be recognized in being able to choose not to work with companies that work with ICE, but I am kind of at the point where I feel like the privilege lies more on the other side - the ability to work on code knowing your leadership is profiting off of oppression and you're making a salary off of it.

I don't have the answers right now for how to solve all of this; I'm basically at the level in my visibility, career, and education where I am trying to bring awareness and raise questions. That upsets a lot of people, but so does mass incarceration and tearing apart families - so maybe it's time the rest of us get uncomfortable for once and use our power to do something powerful and positive.