In conjunction with the launch of our White Collar Crime Risk Zones application, The New Inquiry presents a reading list of critical writing on predictive policing from our archive and from our friends around the web.
From TNI’s archive
“Incalculable Loss,” by Manuel Abreu, The New Inquiry, August 19, 2014
"Algorithmic necropower—the computation of who should live and who should die—operates from the basis of the incalculability to discern “non-obvious associations.” Still, according to the U.S. Inspector General, “association does not imply a direct causal connection.” Instead, it “uncovers, interprets and displays relationships between persons, places, and events.” Algorithms escape the laws of cause and effect and operate in a fluid state of exception, encompassing the financial sector, the military-security nexus, and the entertainment industry. Although algorithms seem to allow Big Data to bypass human judgment, in fact a huge amount of labor is required to map associations and interpret the output. The algorithm itself has to be written by a human, and even then it only spits out data; people still have to decide what the data means. Ordinary language and the “ordinary actions” of post-digital citizens act as a database for algorithmic necropower to manipulate reality and generate threats."
“Invisible Images (Your Pictures Are Looking at You),” by Trevor Paglen, The New Inquiry, December 8, 2016
"There is a temptation to criticize algorithmic image operations on the basis that they’re often 'wrong'—that 'Olympia' becomes a burrito, and that African Americans are labeled as non-humans. These critiques are easy, but misguided. They implicitly suggest that the problem is simply one of accuracy, to be solved by better training data. Eradicate bias from the training data, the logic goes, and algorithmic operations will be decidedly less racist than human-human interactions. Program the algorithms to see everyone equally, and the humans they so lovingly oversee shall be equal. I am not convinced.
Ideology’s ultimate trick has always been to present itself as objective truth, to present historical conditions as eternal, and to present political formations as natural. Because image operations function on an invisible plane and are not dependent on a human seeing-subject (and are therefore not as obviously ideological as giant paintings of Napoleon) they are harder to recognize for what they are: immensely powerful levers of social regulation that serve specific race and class interests while presenting themselves as objective."
“Summer Heat,” by Mariame Kaba, The New Inquiry, June 8, 2015
"A persistent and seemingly endemic feature of U.S. society is the conflation of Blackness and criminality. William Patterson, a well-known Black communist, wrote in 1970: 'A false brand of criminality is constantly stamped on the brow of Black youth by the courts and systematically kept there, creating the fiction that Blacks are criminally minded people.' He added that 'the lies against Blacks are propped up ideologically.' I would suggest that they are also maintained and enforced through force and violence."
“My Own Private Detroit,” by Muna Mire and Messiah Rhodes, The New Inquiry, October 12, 2015
"When it comes to public policing and security, Detroit operates using a modified template from America’s nefarious robber-baron past: it is a postindustrial city brazenly flouting decency for profit."
“Shades of Sovereignty,” by Maya Binyam, The New Inquiry, November 25, 2015
"Mechanisms of national security protect sovereignty, but so too do they do the work of selective disintegration, determining who is allowed to become 'singular' or 'individual' by accentuating the porosity of migratory bodies under surveillance. Policing, like networks of recruitment, relies on isolation, indoctrination, and control."
“The Virtual Watchers,” by Joana Moll and Cédric Parizot, The New Inquiry, October 20, 2015
"The Virtual Watchers, being an interactive window that unveils and stages the many diversions and dysfunctions of a panoptic surveillance system at the U.S./Mexico border, also amplifies a dangerous condition of technology: the dilution of responsibility of individual actions, enhanced through technological environments designed to promote action and reject thought and reflection. This ultimately magnifies several outcomes of such techno-cultural construction, yet, as a final consideration, I’d like to highlight a critical one: the silent militarization of the civil society by means of gamification and free labor—a reality worth raising the alarm for."
“Sci-Fi Crime Drama with a Strong Black Lead,” by Heather Dewey-Hagborg, The New Inquiry, June 27, 2015
"Forensic DNA phenotyping has been making headlines recently when the company Parabon Nanolabs began offering a service to law enforcement in December of 2014 called “Snapshot,” which it claims 'produces a descriptive profile from any human DNA sample,' predicting 'physical characteristics including skin pigmentation, eye and hair color, face morphology, sex, and genomic ancestry.'"
“The Anxieties of Big Data,” by Kate Crawford, The New Inquiry, May 30, 2014
"If we take these twinned anxieties—those of the surveillers and the surveilled—and push them to their natural extension, we reach an epistemological endpoint: on one hand, the fear that there can never be enough data, and on the other, the fear that one is standing out in the data. These fears reinforce each other in a feedback loop, becoming stronger with each turn of the ratchet. As people seek more ways to blend in—be it through normcore dressing or hardcore encryption—more intrusive data collection techniques are developed. And yet, this is in many ways the expected conclusion of big data’s neopositivist worldview. As historians of science Lorraine Daston and Peter Galison once wrote, all epistemology begins in fear—fear that the world cannot be threaded by reason, fear that memory fades, fear that authority will not be enough."
“Predictive Analytics and Information Camouflage,” by Rob Horning, The New Inquiry, February 17, 2012
"That may seem unduly paranoid, but the track record of companies and states is hardly unblemished—and the scope of data collection assures that no one is innocent. The creation of new facts about people through data cross-pollination means that something that can be used as leverage with people will be generated."
“No Life Stories,” by Rob Horning, The New Inquiry, July 10, 2014
"Ubiquitous surveillance thus makes information overload everyone’s problem. To solve it, more surveillance and increasingly automated techniques for organizing the data it collects are authorized. Andrejevic examines the variety of emerging technology-driven methods meant to allow data to 'speak for itself.' By filtering data through algorithms, brain scans, or markets, an allegedly unmediated truth contained within it can be unveiled, and we can bypass the slipperiness of discursive representation and slide directly into the real. Understanding why outcomes occur becomes unnecessary, as long as the probabilities of the correlations hold to make accurate predictions."
“Data Streams,” by Hito Steyerl and Kate Crawford, The New Inquiry, January 23, 2017
"There’s also that really interesting history around IBM, of course back in 1933, long before its terrorist credit score, when their German subsidiary was creating the Hollerith machine. I was going back through an extraordinary archive of advertising images that IBM used during that period, and there’s this image that makes me think of your work actually: it has this gigantic eye floating in space projecting beams of light down onto this town below; the windows of the town are like the holes in a punch card and it’s shining directly into the home, and the tagline is 'See everything with Hollerith punch cards.' It’s the most literal example of 'seeing like a state' that you can possibly imagine. This is IBM’s history, and it is coming full circle. I completely agree that we’re seeing these historical returns to forms of knowledge that we’ve previously thought were, at the very least, unscientific, and, at the worst, genuinely dangerous."
From around the web
“What Amazon Taught the Cops,” by Ingrid Burrington, The Nation, May 27, 2015
"Thus far, in fact, predictive policing has been less Minority Report than Groundhog Day—that is, yet another iteration of the same data-driven policing strategies that have proliferated since the 1990s. As it’s currently implemented, predictive policing is more a management strategy than a crime-fighting tactic. Whether it works is perhaps not as useful a question as who it works for. Its chief beneficiaries aren’t patrol cops or citizens, but those patrol cops’ bosses and the companies selling police departments a technical solution to human problems."
“This Is a Story About Nerds and Cops: PredPol and Algorithmic Policing,” by Jackie Wang, loberry.tumblr.com, 2014
"There are three major social problems that accompany the widespread use and assessment of PredPol: 1) it concedes to the inevitability of crime and creates zones of paranoia, 2) it lends itself to the generation of false positives that can be used to promote the product, and 3) it depoliticizes policing and the construction of crime."
“Broken Windows, Broken Code,” by R. Joshua Scannell, Real Life, August 29, 2016
"What Bratton and Maple wanted was to build a digital carceral infrastructure, an integrated set of databases that linked across the various criminal-justice institutions of the city, from the police, to the court system, to the jails, to the parole office. They wanted comprehensive and real-time data on the dispositions and intentions of their 'enemies,' a term that Maple uses more than once to describe 'victimizers' who 'prey' on 'good people' at their 'watering holes.' They envisioned a surveillance apparatus of such power and speed that it could be used to selectively target the people, places, and times that would result in the most good collars. They wanted to stay one step ahead, to know where 'knuckleheads' and 'predators' would be before they did, and in so doing, best look to the police department’s bottom line. And they wanted it to be legal."
“Want to Predict the Future of Surveillance? Ask Poor Communities,” by Virginia Eubanks, The American Prospect, January 15, 2014
"Counterintuitive as it may seem, we are targeted for digital surveillance as groups and communities, not as individuals. Big Brother is watching us, not you. The NSA looks for what they call a 'pattern of life,' homing in on networks of people associated with a target. But networks of association are not random, and who we know online is affected by offline forms of residential, educational, and occupational segregation. This year, for example, UC San Diego sociologist Kevin Lewis found that online dating leads to fewer interracial connections, compared to offline ways of meeting. Pepper Miller has reported that sometimes, African Americans will temporarily block white Facebook friends so that they can have 'open, honest discussions' about race with black friends. Because of the persistence of segregation in our offline and online lives, algorithms and search strings that filter big data looking for patterns, that begin as neutral code, nevertheless end up producing race, class, and gender-specific results."
“When the Designer Shows Up in the Design,” by Lena Groeger, ProPublica, April 4, 2017
"Typical crime-mapping tools are actually part of the problem of mass incarceration, because they frame crime in an oversimplified way—as bad acts to be eradicated, and not the product of a system whose heavy costs are often borne by the very population law enforcement is meant to protect.
'Rather than looking at where crimes are committed, we looked at where prisoners live,' Kurgan said in an interview for BOMB magazine, 'and the maps that resulted showed the urban costs of incarceration and suggested how those dollars might be better spent on investing in communities.'"
“Taser Will Use Police Body Camera Videos to ‘Anticipate Criminal Activity,’” by Ava Kofman, The Intercept, April 30, 2017
"When civil liberties advocates discuss the dangers of new policing technologies, they often point to sci-fi films like RoboCop and Minority Report as cautionary tales. In RoboCop, a massive corporation purchases Detroit’s entire police department. After one of its officers gets fatally shot on duty, the company sees an opportunity to save on labor costs by reanimating the officer’s body with sleek weapons, predictive analytics, facial recognition, and the ability to record and transmit live video. Although intended as a grim allegory of the pitfalls of relying on untested, proprietary algorithms to make lethal force decisions, RoboCop has long been taken by corporations as a roadmap. And no company has been better poised than Taser International, the world’s largest police body camera vendor, to turn the film’s ironic vision into an earnest reality."
“The Minority Report: Chicago’s New Police Computer Predicts Crimes, But Is It Racist?” by Matt Stroud, The Verge, February 19, 2014
"What McDaniel didn’t know was that he had been placed on the city’s 'heat list'—an index of the roughly 400 people in the city of Chicago supposedly most likely to be involved in violent crime. Inspired by a Yale sociologist’s studies and compiled using an algorithm created by an engineer at the Illinois Institute of Technology, the heat list is just one example of the experiments the CPD is conducting as it attempts to push policing into the 21st century."
“Machine Bias,” by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica, May 23, 2016
"In forecasting who would re-offend, the recidivism algorithm made mistakes with black and white defendants at roughly the same rate but in very different ways.
• The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.
• White defendants were mislabeled as low risk more often than black defendants."
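ProPublica’s finding turns on comparing error rates group by group rather than looking at overall accuracy. The short sketch below shows how that comparison is made, using hypothetical counts rather than the actual COMPAS data; the group names and numbers are illustrative assumptions only, chosen to mirror the shape of the disparity the reporters describe.

    # Illustrative sketch only: hypothetical counts, not ProPublica's COMPAS data.
    # fp = flagged high risk but did not re-offend (false positive)
    # fn = flagged low risk but did re-offend (false negative)
    # tp = flagged high risk and re-offended; tn = flagged low risk and did not re-offend
    groups = {
        "group_a": {"fp": 90, "tp": 110, "tn": 210, "fn": 60},
        "group_b": {"fp": 45, "tp": 105, "tn": 255, "fn": 115},
    }

    for name, c in groups.items():
        fpr = c["fp"] / (c["fp"] + c["tn"])          # share of non-re-offenders wrongly flagged
        fnr = c["fn"] / (c["fn"] + c["tp"])          # share of re-offenders wrongly cleared
        err = (c["fp"] + c["fn"]) / sum(c.values())  # overall error rate
        print(f"{name}: FPR={fpr:.2f}  FNR={fnr:.2f}  overall error={err:.2f}")

With these made-up counts, both groups face an overall error rate of about 0.31, yet group_a’s false positive rate is twice group_b’s while group_b’s false negative rate is far higher: mistakes at roughly the same rate, but in very different ways, which is exactly the asymmetry the investigation documented.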
“Artificial Intelligence’s White Guy Problem,” by Kate Crawford, The New York Times, June 25, 2016
"Predictive programs are only as good as the data they are trained on, and that data has a complex history. Histories of discrimination can live on in digital platforms, and if they go unquestioned, they become part of the logic of everyday algorithmic systems."