I went to a film screening and discussion last week about Mark Duggan’s shooting by the police, and the conditions surrounding him and his family (Amponsah, 2016, Hard Stop). The event was one of the best I’ve been to. The documentary itself was authentic, engaging, and honest, showing a slice of life in London in a similar fashion to an episode of The Wire. The discussion afterwards was a real eye-opener for me: many speakers across two panels, with so much to share and say. At the same time, I was rolling around an idea for this blogpost for Open Data Manchester’s Echo Chambers and Post-Fact Politics event.
Police use of body-worn cameras (BWCs) came up in the discussion at the end. A recent study showed that the presence of BWCs reduces complaints against officers by 93% (Ariel et al., 2016, “Contagious Accountability”: A Global Multisite Randomized Controlled Trial on the Effect of Police Body-Worn Cameras on Citizens’ Complaints Against the Police, Criminal Justice and Behavior). The reasons for the reduction are still under debate, but the authors propose an idea of “contagious accountability”. The research methodology was to have half of the shifts at participating police forces wear BWCs, with the other half acting as the control. One key finding was that both the control and treatment groups showed a marked reduction in complaints. The authors suggest that the introduction of BWCs engendered an atmosphere of officers obeying the rules and becoming more accountable, even when they were not wearing the cameras.
However, the anecdotal but widely corroborated feeling at the event was that the police are still stopping and searching young black people illegally, even while wearing BWCs: the children and young adults simply don’t know what the procedure is, so they don’t ask for it. The BWC footage isn’t reviewed because, paraphrasing one speaker, “if no one gets shot, no one looks at the footage”. And no one, especially no one whom the police already harass, is going to make a fuss over missing documentation.
Then it hit me: we have the data, but choose to ignore it. Much as racism is meaningless without social power, so too is data. (I use the definition that “[…] racism is prejudice plus power, and therefore people of colour cannot be racist against whites […]. People of colour can be prejudiced against whites, but clearly do not as a group have the power to enforce that prejudice” (Katz, 2003).)
We have an enormous amount of data, spanning decades, on institutional racism in the UK. There have been 1,578 deaths in police custody or following contact with the police since 1990, with 0 convictions (inquest.org.uk). We know that Muslim candidates are 2.5 times less likely to get a job than their Caucasian counterparts with identical CVs (Adida et al., 2010, Identifying barriers to Muslim integration in France). We know that black people are 17 times more likely to be stopped and searched than white people in some areas (The Independent, August 2015). So where is our “big data” and “data science” for racism?
When we think of an echo chamber we tend to think of the social media bubble, but we think less about the echo chambers of our communities of choice or employment. Social media has strongly shifted whom we talk to: from communities of location or demographic to communities of choice. I think that to break out of these bubbles, we need to look locally, look at power and who has it, and look at habitus: all technology is a product of its social context and the agendas of the people creating it.
Currently there’s a huge amount of hype and money around the tech sector. At the screening, it was commented that the University of Manchester barely represents the communities in Moss Side and Salford around it; so too our startups and tech events poorly reflect those they try to represent (or at least, should be trying to represent). I wonder whether this lack of interaction is inadvertently reproducing the same power imbalance as in the world as a whole, and whether, by not being embedded in community organisations and not working with community groups to understand and create solutions together, we are in no position to challenge the “echo chambers” that are currently emerging.
By way of example, Tin Geber, a technologist involved in refugee action, asks people simply to stop making apps (Geber, 2016, Hackathon and refugees: we can do better). Apps, to Geber, do nothing to solve the important problems facing refugees. He urges readers to get involved first, and to listen a while, before ploughing on with a solution. By contrast, the app is the almost-default output of the technology initiative or hack day. There are dozens of hack days in Manchester. As far as I can tell, none of these has focussed on topics as difficult or complex as racism, sexism, migration or homophobia (please correct me if you know of any, or indeed want to set one up!), in favour of easier topics like connecting your toaster to the internet⸮ This is not to bash hobbyists, but an acknowledgement that as technologists, we should be engaging with social issues first, and building software second.
We need to be drawn not to the extreme, exciting, new, or sexy technology, but to the needs of our community, which should be just as exciting in a different way. We need to spend a lot more time listening, engaging, and being part of solutions. We need to be compassionate, talk about lack and power, and look without and within to understand and help with problems. Borrowing Paulo Freire’s critique of education: social good is not a bucket you fill in order to deliver it to someone who needs it. Social good is working together to challenge the conditions that create inequality in the first place. I feel there can be no solutions that don’t address both the stark contrast between online and physical communities, and the erasure of the subtle in favour of the instant headline.
As socially engaged citizens who work with data, we cannot ignore the power structures that create, manipulate, publish, and use data. Just as data is only given context by interpretation, so too we need to analyse the power structures that surround it. Only by working with communities to use data to its fullest extent can we hope to challenge the inequality in the world today. Let’s let technology take the back seat for a while and get back to knowing the people around us.