{"id":24008,"date":"2019-10-11T07:00:00","date_gmt":"2019-10-11T11:00:00","guid":{"rendered":"https:\/\/thesocietypages.org\/cyborgology\/?p=24008"},"modified":"2019-10-10T23:38:22","modified_gmt":"2019-10-11T03:38:22","slug":"tracking-changes-the-digitization-of-justice","status":"publish","type":"post","link":"https:\/\/thesocietypages.org\/cyborgology\/2019\/10\/11\/tracking-changes-the-digitization-of-justice\/","title":{"rendered":"Tracking Changes: The Digitization of Justice"},"content":{"rendered":"<p><a href=\"https:\/\/thesocietypages.org\/cyborgology\/files\/2019\/10\/monitor-1054708_960_720.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-24009 size-large\" src=\"https:\/\/thesocietypages.org\/cyborgology\/files\/2019\/10\/monitor-1054708_960_720-500x341.jpg\" alt=\"\" width=\"500\" height=\"341\" srcset=\"https:\/\/thesocietypages.org\/cyborgology\/files\/2019\/10\/monitor-1054708_960_720-500x341.jpg 500w, https:\/\/thesocietypages.org\/cyborgology\/files\/2019\/10\/monitor-1054708_960_720-250x170.jpg 250w, https:\/\/thesocietypages.org\/cyborgology\/files\/2019\/10\/monitor-1054708_960_720-400x273.jpg 400w, https:\/\/thesocietypages.org\/cyborgology\/files\/2019\/10\/monitor-1054708_960_720-768x523.jpg 768w, https:\/\/thesocietypages.org\/cyborgology\/files\/2019\/10\/monitor-1054708_960_720.jpg 960w\" sizes=\"auto, (max-width: 500px) 100vw, 500px\" \/><\/a><\/p>\n<p>As technology expands its footprint across nearly every domain of contemporary life, some spheres raise particularly acute issues that illuminate larger trends at hand. The criminal justice system is one such area, with automated systems being adopted widely and rapidly\u2014and with activists and advocates beginning to push back with alternate politics that seek to ameliorate existing inequalities rather than instantiate and exacerbate them. 
The criminal justice system (and its well-known subsidiary, the prison-industrial complex) is a space often cited for its dehumanizing tendencies and outcomes; technologizing this realm may feed into these patterns, despite proponents <a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2019-10-02\/apax-partners-says-warren-ocasio-cortez-shouldn-t-target-firm\">pitching this<\/a> as an \u201calternative to incarceration\u201d that will promote more humane treatment through rehabilitation and employment opportunities.<\/p>\n<p>As such, calls to modernize and reform criminal justice often manifest as a <a href=\"https:\/\/www.purdueglobal.edu\/blog\/criminal-justice\/growing-role-technology-criminal-justice\/\">rapid move toward automated processes<\/a> throughout many penal systems. Numerous jurisdictions are adopting digital tools at all levels, from policing to parole, in order to promote efficiency and (it is claimed) fairness. However, critics argue that mechanized systems\u2014driven by Big Data, artificial intelligence, and human-coded algorithms\u2014are ushering in an era of expansive policing, digital profiling, and punitive methods that can intensify structural inequalities. In this view, the embedded biases in algorithms can serve to deepen inequities, via automated systems built on platforms that are opaque and unregulated; likewise, emerging policing and surveillance technologies are often deployed disproportionately toward vulnerable segments of the population. 
In an era of digital saturation and rapidly shifting societal norms, these contrasting views of <em>efficiency<\/em> and <em>inequality<\/em> are playing out in quintessential ways throughout the realm of criminal justice.<!--more--><\/p>\n<p>Tracking this arc, critical discourses on technology and social control have brought to light how decision-making algorithms can be a mechanism to \u201creinforce oppressive social relationships and enact new modes of racial profiling,\u201d as Safiya Umoja Noble <a href=\"https:\/\/nyupress.org\/9781479837243\/algorithms-of-oppression\/\">argues<\/a> in her 2018 book, <em>Algorithms of Oppression<\/em>. In this view, the use of machine learning and artificial intelligence as tools of justice can yield self-reinforcing patterns of racial and socioeconomic inequality. As Cathy O\u2019Neil <a href=\"https:\/\/weaponsofmathdestructionbook.com\/\">discerns<\/a> in <em>Weapons of Math Destruction<\/em> (2016), emerging models such as \u201cpredictive policing\u201d can exacerbate disparate impacts by perpetuating data-driven policies whereby, \u201cbecause of the strong correlation between poverty and reported crime, the poor continue to get caught up in these digital dragnets.\u201d And in <em>Automating Inequality <\/em>(2018), Virginia Eubanks <a href=\"https:\/\/us.macmillan.com\/books\/9781250074317\">further explains<\/a> how marginalized communities \u201cface the heaviest burdens of high-tech scrutiny,\u201d even as \u201cthe widespread use of these systems impacts the quality of democracy for us all.\u201d In <a href=\"https:\/\/intersections.humanities.ufl.edu\/events\/halfway-home-race-punishment-and-the-afterlife-of-mass-incarceration\/\">talks<\/a> deriving from his forthcoming book <em>Halfway Home<\/em>, Reuben Miller advances the concept of \u201cmass supervision\u201d as an extension of systems of mass incarceration; whereas the latter has drawn a great deal of critical analysis in recent years, the former is 
potentially more dangerous as an outgrowth of patterns of mass surveillance and the erosion of privacy in the digital age\u2014leading to what Miller terms a \u201csupervised society.\u201d<\/p>\n<p>Techniques of digital monitoring impact the entire population, but the leading edge of regulatory and punitive technologies is applied most directly to communities that are already over-policed. Some scholars and critics have been describing these trends under the banner of \u201cE-carceration,\u201d calling out methods that utilize tracking and monitoring devices to extend practices of social control that are doubly (though not exclusively) impacting vulnerable communities. As Michelle Alexander <a href=\"https:\/\/www.nytimes.com\/2018\/11\/08\/opinion\/sunday\/criminal-justice-reforms-race-technology.html\">recently wrote<\/a> in the <em>New York Times<\/em>, these modes of digital penality are built on a foundation of \u201ccorporate secrets\u201d and a thinly veiled impetus toward \u201cperpetual criminalization,\u201d constituting what she terms \u201cthe newest Jim Crow.\u201d Nonetheless, while marginalized sectors are most directly impacted, as one of Eubanks\u2019s informants warned us all: \u201cYou\u2019re next.\u201d<\/p>\n<p>Advocates of automated and algorithmic justice methods often tout the capacity of such systems to reduce or eliminate human biases, achieve greater efficiency and consistency of outcomes, and ameliorate existing inequities through the use of better data and faster results. This trend is evident across a myriad of jurisdictions in the U.S. 
in particular (but not solely), as <a href=\"https:\/\/engineering.stanford.edu\/magazine\/article\/exploring-use-algorithms-criminal-justice-system\">courts nationwide<\/a> \u201care making greater use of computer algorithms to help determine whether defendants should be released into the community while they await trial.\u201d In 2017, for instance, New Jersey introduced a statewide \u201crisk assessment\u201d system using algorithms and large data sets to <a href=\"http:\/\/inthesetimes.com\/article\/21597\/jailed-by-an-algorithm-money-bail-racism-sentencing-bias-civil-rights\">determine bail<\/a>, in some cases serving to potentially <a href=\"https:\/\/www.wired.com\/story\/bail-reform-tech-justice\/\">supplant judicial discretion<\/a> altogether.<\/p>\n<p>Many have been critical of these processes, noting that these automated decisions are only as good as the data points utilized\u2014which are often tainted both by preexisting subjective biases and by the accumulated structural bias already recorded in people\u2019s records. 
The algorithms deployed for these purposes are primarily <a href=\"https:\/\/epic.org\/algorithmic-transparency\/crim-justice\/\">conceived as<\/a> \u201cproprietary techniques\u201d that are largely opaque and obscured from public scrutiny; as a <a href=\"https:\/\/www.law.georgetown.edu\/american-criminal-law-review\/wp-content\/uploads\/sites\/15\/2019\/06\/56-4-Pandoras-Algorithmic-Black-Box-The-Challenges-of-Using-Algorithmic-Risk-Assessments-in-Sentencing.pdf\">recent law review article<\/a> asserts, we may be in the process of opening up \u201cPandora\u2019s algorithmic black box.\u201d In evaluating these emerging techniques, <a href=\"https:\/\/cyber.harvard.edu\/story\/2018-07\/algorithms-and-justice\">researchers at Harvard University<\/a> thus have expressed a pair of related concerns: (1) the critical \u201cneed for explainable algorithmic decisions to satisfy both legal and ethical imperatives,\u201d and (2) the fact that \u201cAI systems may not be able to provide\u00a0<strong>human-interpretable reasons for their decisions<\/strong>\u00a0given their complexity and ability to account for thousands of factors.\u201d This raises foundational questions of justice, ethics, and accountability, but in practice this discussion is in danger of being mooted by widespread implementation.<\/p>\n<p>The net effect of adopting digital mechanisms for policing and crime control without more scrutiny can yield a divided society in which the inner workings (and associated power relations) of these tools are almost completely opaque and thus shielded from critique, while the outer manifestations are concretely inscribed and societally pervasive. 
The CBC radio program SPARK <a href=\"https:\/\/www.cbc.ca\/radio\/spark\/406-tech-in-policing-1.4833189\">recently examined<\/a> a range of these new policing technologies, from body cams and virtual ride-along applications to those such as ShotSpotter that draw upon data gleaned from a vast network of recording devices embedded in public spaces. Critically assessing the much-touted benefits of such nouveau tools as a \u201cThin Blue Lie,\u201d Matt Stroud <a href=\"https:\/\/us.macmillan.com\/books\/9781250108296\">challenges the prevailing view<\/a> that these technologies are inherently helpful innovations, arguing instead that they have made policing more reckless, discriminatory, and unaccountable.
As Miller suggested in his invocation of \u201cmass supervision\u201d as the logical extension of such patterns and practices, these effects may be most immediately felt by those already overburdened by systems of crime control, but the impacts are harbingers of wider forms of social control.<\/p>\n<p>Some advocates thus have begun calling for a form of \u201cdigital sanctuary.\u201d An important <a href=\"https:\/\/sunlightfoundation.com\/2017\/02\/10\/protecting-data-protecting-residents\/\">intervention<\/a> along these lines has been offered by the Sunlight Foundation, which advocates for \u201cresponsible municipal data management.\u201d Their detailed proposal begins with the larger justice implications inherent in emerging technologies, calling upon cities to establish sound digital policies: \u201cMunicipal departments need to consider their formal data collection, retention, storage and sharing practices, [and] their informal data practices.\u201d In particular, it is urged that cities should not collect sensitive information \u201cunless it is absolutely necessary to do so,\u201d and likewise should \u201cpublicly document all policies, practices and requests which result in the sharing of information.\u201d In light of the escalating use of data-gathering systems, this framework calls for protections that would benefit vulnerable populations and all residents.<\/p>\n<p>These notions parallel the emergence of a wider societal discussion on technology, providing a basis for assessing which current techniques present the greatest threats to, and\/or opportunities for, the cultivation of justice. Despite these efforts, we are left with critical questions of whether the debate will catch up to utilization trends, and how the trajectory of tools will continue to evolve if left unchecked. 
As Adam Greenfield <a href=\"https:\/\/www.versobooks.com\/books\/2742-radical-technologies\">plaintively inquired<\/a> in his 2017 book <em>Radical Technologies<\/em>: \u201cCan we make other politics with these technologies? Can we use them in ways that don\u2019t simply reproduce all-too-familiar arrangements of power?\u201d This is the overarching task at hand, even as opportunities for public oversight seemingly remain elusive.<\/p>\n<p>&nbsp;<\/p>\n<p><em>Randall Amster, J.D., Ph.D., is a teaching professor and co-director of environmental studies at Georgetown University in Washington, DC, and is the author of books including\u00a0<\/em><a href=\"http:\/\/www.peaceecology.com\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=http:\/\/www.peaceecology.com\/&amp;source=gmail&amp;ust=1570850117819000&amp;usg=AFQjCNEL00HM3uKtwOIUesy4UJeYxiykoA\"><em>Peace Ecology<\/em><\/a><em>. Recent work focuses on the ways in which technology can make people long for a time when children played outside and everyone was a great conversationalist. He cannot be reached on Twitter @randallamster.<\/em><\/p>\n<p>&nbsp;<\/p>\n<p>Headline pic via: <a href=\"https:\/\/pixabay.com\/illustrations\/monitor-monitor-wall-big-screen-eye-1054708\/\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>As technology expands its footprint across nearly every domain of contemporary life, some spheres raise particularly acute issues that illuminate larger trends at hand. 
The criminal justice system is one such area, with automated systems being adopted widely and rapidly\u2014and with activists and advocates beginning to push back with alternate politics that seek to ameliorate [&hellip;]<\/p>\n","protected":false},"author":1753,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[9967],"tags":[18606,29,4407,36425,55,14584,449,868,14,2143],"class_list":["post-24008","post","type-post","status-publish","format-standard","hentry","category-commentary","tag-algorithms","tag-class","tag-criminal-justice","tag-data","tag-gender","tag-guest-post","tag-justice","tag-power","tag-race","tag-surveillance"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts\/24008","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/users\/1753"}],"replies":[{"embeddable":true,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/comments?post=24008"}],"version-history":[{"count":4,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts\/24008\/revisions"}],"predecessor-version":[{"id":24013,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/posts\/24008\/revisions\/24013"}],"wp:attachment":[{"href":"https:\/\/thesoci
etypages.org\/cyborgology\/wp-json\/wp\/v2\/media?parent=24008"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/categories?post=24008"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/thesocietypages.org\/cyborgology\/wp-json\/wp\/v2\/tags?post=24008"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}