This Article examines how military automated surveillance and intelligence systems and techniques, when used by civilian police departments to enhance predictive policing programs, have supported a self-reinforcing racial bias. I will focus on two facets of this problem. First, my research will take an inside-out perspective, studying the role played by advanced military technologies and methods within civilian police departments, and how they have enabled a new focus on deterrence and crime prevention by creating a system of structural surveillance in which decision support relies increasingly upon algorithms and automated data analysis tools, and which automates de facto penalization and containment based on race. Second, I will explore these systems, and their effects, from an outside-in perspective, paying particular attention to the racial, societal, economic, and geographic factors that shape public perception of these policing regimes. I will conclude by proposing potential solutions to this problem that incorporate tests for racial bias to create an alternative system following a true community policing model.
Vagle, Jeffrey L., "Tightening the OODA Loop: Police Militarization, Race, and Algorithmic Surveillance" (2016). Faculty Scholarship at Penn Law. 1630.