Data and Tech Use Within Our Justice System — A Primer

Edafe Onerhime
Data, Tech & Black Communities
Mar 3, 2021

Blue banner with the text ‘Data, Tech & Black Communities’ and the headline ‘Data and tech use within our justice system’. To the right of the image is a set of scales; to the left, an illustration of a Black woman holding a mobile phone.
Data and tech use within our justice system

The purpose of this blog is to provide a primer for participants attending a roundtable on the use of data and technology. It explains key terms and provides relevant examples of how data and technology can negatively impact Black lives in the UK. Many of the points we raise apply to other marginalised communities too.

What are data and technology?

Data and technology are words that are understood in a multitude of ways. A common definition of data is: a collection of curated information that can take the form of numbers, words, or pictures. An example is the Police National Computer database, which holds a variety of data about a child’s or adult’s interactions with the police in England, including demographics, arrests, convictions, DNA, and fingerprints.

For our purposes, we can broadly think of technology as the various digital tools and systems (sometimes further enabled by data and machine learning) capable of delivering or supporting specific tasks. An example is the mobile phone extraction software that UK police forces use to collect data stored on people’s phones, such as their contacts, texts, pictures, and location information. This is despite the UK’s data protection regulator, the Information Commissioner’s Office (ICO), warning that its use can be excessive and undermine public confidence.

How are data and technology used within crime and justice?

Data and technology have been around for as long as humans have used information and tools. They are undoubtedly a useful aid to the operation of our justice system and to external scrutiny of it. However, the increasing sway of opaque technology over decision-making (with or without human oversight) is a cause for concern. This is especially true when there is no accountability for poor decisions and no transparency about how a decision was arrived at.

Here is an example to get you thinking. The deployment of facial recognition technology is concerning in general because it gives those in power the ability to surveil and identify masses of people in real time. The technology has also been shown to have lower accuracy rates for identifying Black people, especially women. We need to question how this technology may be used to reinforce current structural injustices. As a Black person, how comfortable do you feel about giving the police more powers to surveil, given the implications of over-policing for Black communities? In a case brought by Ed Bridges and the charity Liberty, South Wales Police were found to have used facial recognition technology unlawfully, in part because “the force did not take reasonable steps to find out if the software had a racial or gender bias”. This extends beyond state use: retailers like the Co-op are using facial recognition to track shoplifters.

This is why it’s important to understand why data is being collected (or not), how it might be used, as well as how and which technologies are being deployed. We think this is especially true for marginalised groups.

We are collating reports and articles which capture the impact of data and technology on Black people and Black activists here. Do let us know of other resources we can add to this open library.

What should we do?

The roundtable will not, on its own, solve the problem of securing greater scrutiny and accountability around the use of data and technology across the UK’s justice system. But we will explore:

  • Our collective understanding of, and concerns about, how data and technology are used in the justice system, and how these things play out in our own lives and communities;
  • Ways we can stay informed about how data and technology are being developed and deployed;
  • What a network capable of holding institutions and organisations to account for their use of data and technology could look like;
  • Who else should be included in this discussion.

Key Terms

Data: curated information, such as numbers, words, or pictures, collected for a specific purpose.

Technology: digital tools and systems (sometimes further enabled by data and machine learning) capable of delivering or supporting specific tasks.

Algorithms: a list of rules to follow in order to solve a problem. (See this great presentation from Edafe: What do we mean by an algorithm?).
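
To make this a little more concrete, here is a small illustrative sketch in Python (our own example, not taken from the presentation above) of an algorithm as a list of rules followed in order:

```python
# An algorithm is just a list of rules followed in order.
# This one decides whether a given year is a leap year.
def is_leap_year(year: int) -> bool:
    if year % 400 == 0:   # Rule 1: divisible by 400 -> leap year
        return True
    if year % 100 == 0:   # Rule 2: otherwise, divisible by 100 -> not a leap year
        return False
    return year % 4 == 0  # Rule 3: otherwise, divisible by 4 -> leap year

print(is_leap_year(2020))  # True
print(is_leap_year(2021))  # False
```

The rules here are simple and visible; the concern raised in this blog is what happens when the rules are far more complex and hidden from the people they affect.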

Artificial Intelligence (AI): a catch-all phrase that is often used misleadingly. It covers everything from the aspirational (and currently unachieved) idea of a machine capable of general intelligence and true learning, e.g. RoboCop, through to what is often called narrow AI, where machines are very good at a single task, like playing chess.

Machine Learning: a branch of (narrow) AI where computer algorithms discover patterns within data and use them to build models that can then be applied to new data. Accessible guide here.
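
For readers who like to see code, here is a minimal sketch (our own illustrative example using the widely available scikit-learn library, not taken from the guide linked above) of a model learning a pattern from example data and then being applied to new data:

```python
# Illustrative only: a model "learns" a pattern from example data,
# then applies that pattern to new data it has not seen before.
from sklearn.linear_model import LinearRegression

# Made-up training data: hours of study vs. exam score.
hours_studied = [[1], [2], [3], [4], [5]]
exam_scores = [52, 58, 65, 71, 78]

model = LinearRegression()
model.fit(hours_studied, exam_scores)   # discover the pattern in the data

print(model.predict([[6]]))             # apply it to new, unseen data
```

What the model “learns” depends entirely on the data it is given, which is why the questions above about what data is collected, and about whom, matter so much.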

Data Science: a field of practice that transforms data into information, most commonly by using machine learning techniques. Accessible guide here.

We ran roundtables in March 2021 connecting UK Black communities around Data, Tech & Education; Crime & Justice; Employment & Enterprise; and Health, followed by a final presentation.

View our presentation here: Roundtable Presentation.

Roundtable presentations: Crime and Justice

Data, Tech & Black Communities is a project funded by The National Lottery Community Fund.


Edafe Onerhime specialises in making impact with data. Her motto: Data + Design + Culture. She lives in Glasgow, Scotland with her wife and cat. She/Her.