The Undue Influence of Surveillance Technology Companies on Policing | Elizabeth Joh

Elizabeth E. Joh; The Undue Influence of Surveillance Technology Companies on Policing; In New York University (NYU) Law Review Online; 2017-09; N pages; landing.
Elizabeth E. Joh is Professor of Law, School of Law, U.C. Davis.

Abstract

Conventional wisdom assumes that the police are in control of their investigative tools. But with surveillance technologies, this is not always the case. Increasingly, police departments are consumers of surveillance technologies that are created, sold, and controlled by private companies. These surveillance technology companies exercise an undue influence over the police today in ways that aren’t widely acknowledged, but that have enormous consequences for civil liberties and police oversight. Three seemingly unrelated examples—stingray cellphone surveillance, body cameras, and big data software—demonstrate varieties of this undue influence. The companies which provide these technologies act out of private self-interest, but their decisions have considerable public impact. The harms of this private influence include the distortion of Fourth Amendment law, the undermining of accountability by design, and the erosion of transparency norms. This Essay demonstrates the increasing degree to which surveillance technology vendors can guide, shape, and limit policing in ways that are not widely recognized. Any vision of increased police accountability today cannot be complete without consideration of the role surveillance technology companies play.

Contents

  INTRODUCTION
  I. EXAMPLES OF UNDUE INFLUENCE
    A. Stingray Cellphone Surveillance and Nondisclosure Agreements
      1. Nondisclosure Agreements
      2. Stingrays and the Fourth Amendment
      3. Secret Stingray Use
    B. Cornering the Market on Police Body Cameras
      1. When Product Design Is Policy
      2. Market Dominance
    C. Big Data Software and Proprietary Information
  II. THE HARMS OF UNDUE INFLUENCE
    A. Fourth Amendment Distortion
    B. Accountability by Design
    C. Outsourcing Suspicion and Obscuring Transparency
  III. MINIMIZING UNDUE INFLUENCE
    A. Local Surveillance Oversight
    B. Public Records Requests as Oversight
  CONCLUSION

Toward a Fourth Law of Robotics: Preserving Attribution, Responsibility, and Explainability in an Algorithmic Society | Pasquale

Frank A. Pasquale III; Toward a Fourth Law of Robotics: Preserving Attribution, Responsibility, and Explainability in an Algorithmic Society; Ohio State Law Journal, Vol. 78, 2017, U of Maryland Legal Studies Research Paper No. 2017-21; 2017-07-14; 13 pages; ssrn:3002546.

tl;dr → A comment for Balkin. To wit:
  1. Balkin should have supplied more context; such correction is supplied herewith.
  2. More expansive supervision is indicated; such expansion is supplied herewith.
  3. Another law is warranted; not a trinity, but perfection plus one more.

Four Laws, here and previous:

  1. Machine operators are always responsible for their machines.
  2. Businesses are always responsible for their operators.
  3. Machines must not pollute.
  4. A [machine] must always indicate the identity of its creator, controller, or owner.

Love the erudition; but this is just like planes, trains & automobiles.

Separately noted.

The Three Laws of Robotics in the Age of Big Data | Balkin

Jack M. Balkin (Yale); The Three Laws of Robotics in the Age of Big Data; Ohio State Law Journal, Vol. 78 (2017), Forthcoming (real soon now, RSN), Yale Law School, Public Law Research Paper No. 592; 2016-12-29 → 2017-09-10; 45 pages; ssrn:2890965.

tl;dr → administrative laws [should be] directed at human beings and human organizations, not at [machines].

Laws

  1. Machine operators are responsible
    [for the operations of their machines, always & everywhere]
  2. Businesses are responsible
    [for the operation of their machines, always & everywhere]
  3. Machines must not pollute
    [in a sense to be defined later: e.g., by a "tussle"]

Love the erudition; but none of this is new.

Separately noted.

Big Data Surveillance: The Case of Policing | Sarah Brayne

Abstract

This article examines the intersection of two structural developments: the growth of surveillance and the rise of “big data.” Drawing on observations and interviews conducted within the Los Angeles Police Department, I offer an empirical account of how the adoption of big data analytics does—and does not—transform police surveillance practices. I argue that the adoption of big data analytics facilitates amplifications of prior surveillance practices and fundamental transformations in surveillance activities.

  1. First, discretionary assessments of risk are supplemented and quantified using risk scores.
  2. Second, data are used for predictive, rather than reactive or explanatory, purposes.
  3. Third, the proliferation of automatic alert systems makes it possible to systematically surveil an unprecedentedly large number of people.
  4. Fourth, the threshold for inclusion in law enforcement databases is lower, now including individuals who have not had direct police contact.
  5. Fifth, previously separate data systems are merged, facilitating the spread of surveillance into a wide range of institutions.

Based on these findings, I develop a theoretical model of big data surveillance that can be applied to institutional domains beyond the criminal justice system. Finally, I highlight the social consequences of big data surveillance for law and social inequality.
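
The first of these shifts, quantified risk scores, can be made concrete with a minimal sketch. This is not drawn from the article: it only assumes, as the officer quoted below describes, that every documented contact adds a point, so that high scores both reflect and invite further stops. The names, weights, and threshold are invented for illustration.

```python
# Toy point-value scheme in the spirit of the stops described below:
# every documented contact adds points, and high scorers surface for
# more attention, which in turn generates more contacts (the feedback
# loop). Weights and threshold are invented, not LAPD's LASER values.
from collections import Counter

points = Counter()  # person -> accumulated risk points

def log_contact(person, reason, value=1):
    """Record a stop / field interview; each contact adds points."""
    points[person] += value

log_contact("subject-1", "jaywalking stop")
log_contact("subject-1", "no turn signal")   # "so that's two points"

THRESHOLD = 2  # invented cutoff for a "high point value"
watchlist = sorted(p for p, n in points.items() if n >= THRESHOLD)
print(watchlist)  # ['subject-1'] -- now more likely to be stopped again
```

The loop at the end is the sociological point: inclusion on the list raises the chance of another stop, and every stop raises the score, independent of any new offense.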

Conclusions

Through a case study of the Los Angeles Police Department, this article analyzed the role of big data in surveillance practices. By socially situating big data, I examined why it was adopted, how it is used, and what the implications of its use are. Focusing on the interplay between surveillance practices, law, and technology offers new insights into social control and inequality. I argued that big data participates in and reflects existing social structures. Far from eliminating human discretion and bias, big data represents a new form of capital that is both a social product and a social resource. What data law enforcement collects, their methods for analyzing and interpreting it, and the way it informs their practice are all part of a fundamentally social process.

Characterizing predictive models as “just math,” and fetishizing computation as an objective process, obscures the social side of algorithmic decision-making. Individuals’ interpretation of data occurs in preexisting institutional, legal, and social settings, and it is through that interpretive process that power dynamics come into play. Use of big data has the potential to ameliorate discriminatory practices, but these findings suggest implementation is of paramount importance. As organizational theory and literature from science and technology studies suggests, when new technology is overlaid onto an old organizational structure, longstanding problems shape themselves to the contours of the new technology, and new unintended consequences are generated.

The process of transforming individual actions into “objective” data raises fundamentally sociological questions that this research only begins to address. In many ways, it transposes classic concerns from the sociology of quantification about simplification, decontextualization, and the privileging of measurable complex social phenomena onto the big data landscape. Surveillance is always ambiguous; it is implicated in both social inclusion and exclusion, and it creates both opportunities and constraints. The way in which surveillance helps achieve organizational goals and structure life chances may differ according to the individuals and institutions involved. Examining the means of big data surveillance across institutional domains is an open and timely line of inquiry, because once a new technology is disseminated in an institutional setting, it is difficult to scale back.

Mentions

  • Palantir
  • PredPol
  • Automatic License Plate Reader (ALPR)
  • Automatic Vehicle Locator (AVL)
  • Enterprise Master Person Index (EMPI)
  • Los Angeles Police Department (LAPD)
  • Big Data
  • risk scores
  • Smart Policing Initiative
  • Department of Homeland Security (DHS)
  • Federal Bureau of Investigation (FBI)
  • Central Intelligence Agency (CIA)
  • Immigration and Customs Enforcement (ICE)
  • Los Angeles County Sheriff’s Department (LASD)
  • Fusion Center
  • Joint Regional Intelligence Center (JRIC)
  • Real-Time Crime Analysis Center (RACR)
  • Crime Intelligence Detail (CID)
  • Operation LASER (Los Angeles’ Strategic Extraction and Restoration program)
  • Parole Compliance Unit
  • Field Interview (FI)

Argot

  • surveillant assemblage
  • trigger mechanism
  • in the system (to be ‘in the system’)
  • net widening
  • decontextualization
Origin (at least)
  • Deleuze
  • Guattari
  • Marx; no: not that guy; the other one.
  • Pasquale

Quotes

<quote>Originally intended for use in national defense, Palantir was initially partially funded by In-Q-Tel, the CIA’s venture capital firm. Palantir now has government and commercial customers, including the CIA, FBI, ICE, LAPD, NYPD, NSA, DHS, and J.P. Morgan. JRIC (the Southern California fusion center) started using Palantir in 2009, with the LAPD following shortly after. The use of Palantir has expanded rapidly through the Department, with regular training sessions and more divisions signing on each year. It has also spread throughout the greater L.A. region: in 2014, Palantir won the Request for Proposals to implement the statewide AB 109 administration program, which involves data integration and monitoring of the post-release community supervision population.</quote>

<quote>When I asked an officer to provide examples of why he stops people with high point values, he replied:

Yesterday this individual might have got stopped because he jaywalked. Today he mighta got stopped because he didn’t use his turn signal or whatever the case might be. So that’s two points . . . you could conduct an investigation or if something seems out of place you have your consensual stops.[7] So a pedestrian stop, this individual’s walking, “Hey, can I talk to you for a moment?” “Yeah what’s up?” You know, and then you just start filling out your card as he answers questions or whatever. And what it was telling us is who is out on the street, you know, who’s out there not necessarily maybe committing a crime but who’s active on the streets. You put the activity of . . . being in a street with maybe their violent background and one and one might create the next crime that’s gonna occur.</quote>


References

  1. Angwin Julia. 2014. Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance. New York: Times Books.
  2. Ball Kirstie, Webster Frank, eds. 2007. The Intensification of Surveillance: Crime, Terrorism & Warfare in the Information Age. London, UK: Pluto Press.
  3. Barley Stephen R. 1986. “Technology as an Occasion for Structuring: Evidence from Observations of CT Scanners and the Social Order of Radiology Departments.” Administrative Science Quarterly 31(1):78–108.
  4. Barley Stephen R. 1996. “Technicians in the Workplace: Ethnographic Evidence for Bringing Work into Organization Studies.” Administrative Science Quarterly 41(3):404–441.
  5. Barocas Solon, Selbst Andrew D. 2016. “Big Data’s Disparate Impact.” California Law Review 104:671–732.
  6. Becker Howard S. 1963. Outsiders: Studies in the Sociology of Deviance. New York: Free Press.
  7. Beckett Katherine, Nyrop Kris, Pfingst Lori, Bowen Melissa. 2005. “Drug Use, Drug Possession Arrests, and the Question of Race: Lessons from Seattle.” Social Problems 52(3):419–41.
  8. Bittner Egon. 1967. “The Police on Skid-Row: A Study of Peace Keeping.” American Sociological Review 32(5):699–715.
  9. Bonczar Thomas P., Herberman Erinn J. 2014. “Probation and Parole in the United States, 2013.” Washington, DC: Bureau of Justice Statistics.
  10. Bowker Geoffrey C., Star Susan Leigh. 2000. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
  11. boyd danah, Crawford Kate. 2012. “Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon.” Information, Communication & Society 15(5):662–79.
  12. Braga Anthony A., Weisburd David L. 2010. Policing Problem Places: Crime Hot Spots and Effective Prevention. New York: Oxford University Press.
  13. Braverman Harry. 1974. Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century. New York: Monthly Review Press.
  14. Brayne Sarah. 2014. “Surveillance and System Avoidance: Criminal Justice Contact and Institutional Attachment.” American Sociological Review 79(3):367–91.
  15. Brown v. Plata. 2011. 563 U.S. 493.
  16. Browne Simone. 2015. Dark Matters: On the Surveillance of Blackness. Durham, NC: Duke University Press.
  17. Carson E. Ann. 2015. “Prisoners in 2014.” Washington, DC: Bureau of Justice Statistics.
  18. Christin Angèle. 2016. “From Daguerreotypes to Algorithms: Machines, Expertise, and Three Forms of Objectivity.” ACM Computers & Society 46(1):27–32.
  19. Cohen Stanley. 1985. Visions of Social Control: Crime, Punishment and Classification. Malden, MA: Polity Press.
  20. Deleuze Gilles, Guattari Felix. 1987. A Thousand Plateaus: Capitalism and Schizophrenia. Minneapolis: The University of Minnesota Press.
  21. DiMaggio Paul J., Powell Walter W. 1983. “The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields.” American Sociological Review 48(2):147–60.
  22. Duster Troy. 1997. “Pattern, Purpose and Race in the Drug War.” Pp. 206–287 in Crack in America: Demon Drugs and Social Justice, edited by Reinarman C., Levine H. G. Berkeley: University of California Press.
  23. Duster Troy. 2005. “Race and Reification in Science.” Science 307:1050–51.
  24. Epp Charles R., Maynard-Moody Steven, Haider-Markel Donald P. 2014. Pulled Over: How Police Stops Define Race and Citizenship. Chicago: University of Chicago Press.
  25. Ericson Richard V., Haggerty Kevin D. 1997. Policing the Risk Society. Toronto: University of Toronto Press.
  26. Ericson Richard V., Haggerty Kevin D. 2006. The New Politics of Surveillance and Visibility. Toronto: University of Toronto Press.
  27. Espeland Wendy N., Vannebo Berit I. 2007. “Accountability, Quantification and Law.” Annual Review of Law and Society 3:21–43.
  28. Executive Office of the President. 2014. “Big Data: Seizing Opportunities, Preserving Values.” Washington, DC: The White House.
  29. Feeley Malcolm M., Simon Jonathan. 1992. “The New Penology: Notes on the Emerging Strategy of Corrections and Its Implications.” Criminology 30(4):449–74.
  30. Ferguson Andrew G. 2015. “Big Data and Predictive Reasonable Suspicion.” University of Pennsylvania Law Review 163(2):327–410.
  31. Fiske John. 1998. “Surveilling the City: Whiteness, the Black Man and Democratic Totalitarianism.” Theory, Culture and Society 15(2):67–88.
  32. Fiske Susan T., Taylor Shelley E. 1991. Social Cognition, 2nd ed. New York: McGraw-Hill.
  33. Foucault Michel. 1977. Discipline and Punish: The Birth of the Prison. New York: Random House.
  34. Fourcade Marion, Healy Kieran. 2013. “Classification Situations: Life-Chances in the Neoliberal Era.” Accounting, Organizations and Society 38:559–72.
  35. Fourcade Marion, Healy Kieran. 2017. “Seeing like a Market.” Socioeconomic Review 15(1):9–29.
  36. Gandy Oscar H. 1993. The Panoptic Sort: A Political Economy of Personal Information. Boulder, CO: Westview Press.
  37. Gandy Oscar H. 2002. “Data Mining and Surveillance in the Post-9/11 Environment.” Presentation to the Political Economy Section, International Association for Media and Communication Research.
  38. Gandy Oscar H. 2009. Coming to Terms with Chance: Engaging Rational Discrimination and Cumulative Disadvantage. Farnham, UK: Ashgate Publishing.
  39. Garland David. 2001. The Culture of Control: Crime and Social Order in Contemporary Society. New York: Oxford University Press.
  40. Giddens Anthony. 1990. The Consequences of Modernity. Stanford, CA: Stanford University Press.
  41. Gilliom John. 2001. Overseers of the Poor: Surveillance, Resistance, and the Limits of Privacy. Chicago: University of Chicago Press.
  42. Gitelman Lisa, ed. 2013. Raw Data Is an Oxymoron. Cambridge, MA: MIT Press.
  43. Goffman Alice. 2014. On the Run: Fugitive Life in an American City. Chicago: University of Chicago Press.
  44. Goffman Erving. 1963. Stigma: Notes on the Management of Spoiled Identity. Englewood Cliffs, NJ: Prentice-Hall.
  45. Gross Samuel R., Possley Maurice, Stephens Klara. 2017. “Race and Wrongful Convictions in the United States.” National Registry of Exonerations, Newkirk Center for Science and Society, University of California-Irvine.
  46. Gustafson Kaaryn S. 2011. Cheating Welfare: Public Assistance and the Criminalization of Poverty. New York: NYU Press.
  47. Guzik Keith. 2009. “Discrimination by Design: Predictive Data Mining as Security Practice in the United States’ ‘War on Terrorism.’” Surveillance and Society 7(1):1–17.
  48. Hacking Ian. 1990. The Taming of Chance. Cambridge, UK: Cambridge University Press.
  49. Haggerty Kevin D., Ericson Richard V. 2000. “The Surveillant Assemblage.” British Journal of Sociology 51(4):605–622.
  50. Harcourt Bernard E. 2006. Against Prediction: Profiling, Policing, and Punishing in an Actuarial Age. Chicago: University of Chicago Press.
  51. Hindmarsh Richard, Prainsack Barbara, eds. 2010. Genetic Suspects: Global Governance of Forensic DNA Profiling and Databasing. Cambridge, UK: Cambridge University Press.
  52. Innes Martin. 2001. “Control Creep.” Sociological Research Online 6:1–10.
  53. Joh Elizabeth E. 2016. “The New Surveillance Discretion: Automated Suspicion, Big Data, and Policing.” Harvard Law and Policy Review 10(1):15–42.
  54. Kitchin Rob. 2014. The Data Revolution: Big Data, Open Data, Data Infrastructures & Their Consequences. London, UK: SAGE.
  55. Kling Rob. 1991. “Computerization and Social Transformations.” Science, Technology & Human Values 16(3):342–67.
  56. Kohler-Hausmann Issa. 2013. “Misdemeanor Justice: Control without Conviction.” American Journal of Sociology 119(2):351–93.
  57. Laney Doug. 2001. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” Stamford, CT: META Group.
  58. Langton Lynn, Berzofsky Marcus, Krebs Christopher, Smiley-McDonald Hope. 2012. “Victimizations Not Reported to the Police, 2006-2010.” Washington, DC: Bureau of Justice Statistics.
  59. Laub John H. 2014. “Understanding Inequality and the Justice System Response: Charting a New Way Forward.” New York: William T. Grant Foundation.
  60. Lazer David, Radford Jason. 2017. “Data ex Machina: Introduction to Big Data.” Annual Review of Sociology 43:19–39.
  61. Los Angeles Police Department. 2017. “Sworn Personnel by Rank, Gender, and Ethnicity (SPRGE) Report” (http://www.lapdonline.org/sworn_and_civilian_report).
  62. Lynch Michael, Cole Simon A., McNally Ruth, Jordan Kathleen. 2008. Truth Machine: The Contentious History of DNA Fingerprinting. Chicago: University of Chicago Press.
  63. Lyon David. 1994. The Electronic Eye: The Rise of Surveillance Society. Minneapolis: University of Minnesota Press.
  64. Lyon David, ed. 2003. Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination. New York: Routledge.
  65. Lyon David, ed. 2006. Theorizing Surveillance: The Panopticon and Beyond. New York: Polity.
  66. Lyon David. 2015. Surveillance after Snowden. New York: Polity.
  67. MacKenzie Donald, Muniesa Fabian, Siu Lucia, eds. 2007. Do Economists Make Markets? On the Performativity of Economics. Princeton, NJ: Princeton University Press.
  68. Manning Peter K. 2011. The Technology of Policing: Crime Mapping, Information Technology, and the Rationality of Crime Control. New York: NYU Press.
  69. Manning Peter K., Van Maanen John, eds. 1978. Policing: A View from the Street. New York: Random House.
  70. Marx Gary T. 1974. “Thoughts on a Neglected Category of Social Movement Participant: The Agent Provocateur and the Informant.” American Journal of Sociology 80(2):402–442.
  71. Marx Gary T. 1988. Undercover: Police Surveillance in America. Berkeley: University of California Press.
  72. Marx Gary T. 1998. “Ethics for the New Surveillance.” The Information Society 14(3):171–85.
  73. Marx Gary T. 2002. “What’s New About the ‘New Surveillance’? Classifying for Change and Continuity.” Surveillance and Society 1(1):9–29.
  74. Marx Gary T. 2016. Windows into the Soul: Surveillance and Society in an Age of High Technology. Chicago: University of Chicago Press.
  75. Mayer-Schönberger Viktor, Cukier Kenneth. 2013. Big Data: A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt.
  76. Merton Robert K. 1948. “The Self-Fulfilling Prophecy.” The Antioch Review 8(2):193–210.
  77. Meyer John W., Rowan Brian. 1977. “Institutionalized Organizations: Formal Structure as Myth and Ceremony.” American Journal of Sociology 83(2):340–63.
  78. Mohler George O., Short Martin B., Malinowski Sean, Johnson Mark, Tita George E., Bertozzi Andrea L., Brantingham P. Jeffrey. 2015. “Randomized Controlled Field Trials of Predictive Policing.” Journal of the American Statistical Association 110:1399–1411.
  79. Monahan Torin, Palmer Neal A. 2009. “The Emerging Politics of DHS Fusion Centers.” Security Dialogue 40(6):617–36.
  80. Moskos Peter. 2008. Cop in the Hood: My Year Policing Baltimore’s Eastern District. Princeton, NJ: Princeton University Press.
  81. Oxford American Dictionary of Current English. 1999. Oxford: Oxford University Press.
  82. Pager Devah. 2007. Marked: Race, Crime, and Finding Work in an Era of Mass Incarceration. Chicago: University of Chicago Press.
  83. Papachristos Andrew V., Hureau David M., Braga Anthony A. 2013. “The Corner and the Crew: The Influence of Geography and Social Networks on Gang Violence.” American Sociological Review 78(3):417–47.
  84. Pasquale Frank. 2014. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.
  85. Perry Walter L., McInnis Brian, Price Carter C., Smith Susan C., Hollywood John S. 2013. “Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations.” RAND Safety and Justice Program. Santa Monica, CA. Retrieved July 6, 2017 (http://www.rand.org/content/dam/rand/pubs/research_reports/RR200/RR233/RAND_RR233.pdf).
  86. Porter Theodore M. 1995. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton, NJ: Princeton University Press.
  87. Poster Mark. 1990. The Mode of Information. Chicago: University of Chicago Press.
  88. Quillian Lincoln, Pager Devah. 2001. “Black Neighbors, Higher Crime? The Role of Racial Stereotypes in Evaluations of Neighborhood Crime.” American Journal of Sociology 107(3):717–67.
  89. Ratcliffe Jerry. 2008. Intelligence-Led Policing. Cullompton, UK: Willan Publishing.
  90. Reiss Albert J. 1971. The Police and the Public. New Haven, CT: Yale University Press.
  91. Renan Daphna. 2016. “The Fourth Amendment as Administrative Governance.” Stanford Law Review 68(5):1039–1129.
  92. Rios Victor M. 2011. Punished: Policing the Lives of Black and Latino Boys. New York: NYU Press.
  93. Roush Craig R. 2012. “Quis Custodiet Ipsos Custodes? Limits on Widespread Surveillance and Intelligence Gathering by Local Law Enforcement after 9/11.” Marquette Law Review 96(1):315–76.
  94. Rule James B. 1974. Private Lives and Public Surveillance: Social Control in the Computer Age. New York: Schocken Books.
  95. Rule James B. 2007. Privacy in Peril: How We Are Sacrificing a Fundamental Right in Exchange for Security and Convenience. New York: Oxford University Press.
  96. Sampson Robert J., Bartusch Dawn J. 1998. “Legal Cynicism and (Subcultural?) Tolerance of Deviance: The Neighborhood Context of Racial Differences.” Law and Society Review 32:777–804.
  97. Scott Richard W. 1987. Organizations: Rational, Natural, and Open Systems. Englewood Cliffs, NJ: Prentice-Hall.
  98. Scott Richard W. 2004. “Reflections on a Half-Century of Organizational Sociology.” Annual Review of Sociology 30:1–21.
  99. Sherman Lawrence W. 2013. “Targeting, Testing and Tracking Police Services: The Rise of Evidence-Based Policing, 1975–2025.” Crime and Justice 42(1):377–451.
  100. Sherman Lawrence W., Gartin Patrick R., Buerger Michael E. 1989. “Hot Spots of Predatory Crime: Routine Activities and the Criminology of Place.” Criminology 27(1):27–55.
  101. Skogan Wesley G. 2006. Police and Community in Chicago: A Tale of Three Cities. New York: Oxford University Press.
  102. Smith v. Maryland. 1979. 442 U.S. 735.
  103. Soss Joe, Fording Richard C., Schram Sanford F. 2011. Disciplining the Poor: Neoliberal Paternalism and the Persistent Power of Race. Chicago: University of Chicago Press.
  104. Stuart Forrest. 2016. Down, Out & Under Arrest: Policing and Everyday Life in Skid Row. Chicago: University of Chicago Press.
  105. Terry v. Ohio. 1968. 392 U.S. 1.
  106. Tracy Paul E., Morgan Vincent. 2000. “Big Brother and His Science Kit: DNA Databases for 21st Century Crime Control.” Journal of Criminal Law and Criminology 90(2):635–90.
  107. Travis Jeremy, Western Bruce, Redburn Steve, eds. 2014. Growth of Incarceration in the United States: Exploring Causes and Consequences. Washington, DC: National Academy of Science.
  108. Uchida Craig D., Swatt Marc L. 2015. “Operation LASER and the Effectiveness of Hotspot Patrol: A Panel Analysis.” Police Quarterly 16(3):287–304.
  109. United States v. Jones. 2012. 565 U.S. 400, 132 S. Ct. 945.
  110. United States v. Miller. 1939. 307 U.S. 174.
  111. U.S. Department of Justice. 2001 [2015]. “L.A. Consent Decree.” Washington, DC: U.S. Department of Justice.
  112. Wakefield Sara, Uggen Christopher. 2010. “Incarceration and Stratification.” Annual Review of Sociology 36:387–406.
  113. Wakefield Sara, Wildeman Christopher. 2013. Children of the Prison Boom: Mass Incarceration and the Future of American Inequality. New York: Oxford University Press.
  114. Waxman Matthew C. 2009. “Police and National Security: American Local Law Enforcement and Counter-Terrorism after 9/11.” Journal of National Security Law and Policy 3:377–407.
  115. Weber Max. 1978. Economy and Society: An Outline of Interpretive Sociology. Berkeley: University of California Press.
  116. Weisburd David, Mastrofski Stephen D., McNally Ann Marie, Greenspan Rosann, Willis James J. 2003. “Reforming to Preserve: COMPSTAT and Strategic Problem-Solving in American Policing.” Criminology and Public Policy 2:421–56.
  117. Western Bruce. 2006. Punishment and Inequality in America. New York: Russell Sage Foundation.
  118. Western Bruce, Pettit Becky. 2005. “Black-White Wage Inequality, Employment Rates, and Incarceration.” American Journal of Sociology 111(2):553–78.
  119. White House Police Data Initiative. 2015. “Launching the Police Data Initiative” (https://obamawhitehouse.archives.gov/blog/2015/05/18/launching-police-data-initiative).
  120. Whren v. United States. 1996. 517 U.S. 806.
  121. Willis James J., Mastrofski Stephen D., Weisburd David. 2007. “Making Sense of COMPSTAT: A Theory-Based Analysis of Organizational Change in Three Police Departments.” Law and Society Review 41(1):147–88.
  122. Wilson James Q. 1968. Varieties of Police Behavior: The Management of Law and Order in Eight Communities. Cambridge, MA: Harvard University Press.

Incompatible: The GDPR in the Age of Big Data | Tal Zarsky

Tal Zarsky (Haifa); Incompatible: The GDPR in the Age of Big Data; Seton Hall Law Review, Vol. 47, No. 4(2), 2017; 2017-08-22; 26 pages; ssrn:3022646.

tl;dr → the opposition is elucidated and juxtaposed; the domain is problematized.
and → “Big Data,” by definition, is opportunistic and unsupervisable; it collects everything and identifies something later in the backend. Else it is not “Big Data” (it is “little data,” which is known, familiar, boring, and of course has settled law surrounding its operational envelope).

Abstract

After years of drafting and negotiations, the EU finally passed the General Data Protection Regulation (GDPR). The GDPR’s impact will, most likely, be profound. Among the challenges data protection law faces in the digital age, the emergence of Big Data is perhaps the greatest. Indeed, Big Data analysis carries both hope and potential harm to the individuals whose data is analyzed, as well as other individuals indirectly affected by such analyses. These novel developments call for both conceptual and practical changes in the current legal setting.

Unfortunately, the GDPR fails to properly address the surge in Big Data practices. The GDPR’s provisions are — to borrow a key term used throughout EU data protection regulation — incompatible with the data environment that the availability of Big Data generates. Such incompatibility is destined to render many of the GDPR’s provisions quickly irrelevant. Alternatively, the GDPR’s enactment could substantially alter the way Big Data analysis is conducted, transferring it to one that is suboptimal and inefficient. It will do so while stalling innovation in Europe and limiting utility to European citizens, while not necessarily providing such citizens with greater privacy protection.

After a brief introduction (Part I), Part II quickly defines Big Data and its relevance to EU data protection law. Part III addresses four central concepts of EU data protection law as manifested in the GDPR: Purpose Specification, Data Minimization, Automated Decisions and Special Categories. It thereafter proceeds to demonstrate that the treatment of every one of these concepts in the GDPR is lacking and in fact incompatible with the prospects of Big Data analysis. Part IV concludes by discussing the aggregated effect of such incompatibilities on regulated entities, the EU, and society in general.
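
To make the data-minimization incompatibility concrete, a minimal sketch, not from the paper; the field names and purpose registry are invented. The GDPR principle keeps only attributes justified by a declared purpose; the Big Data workflow described in the tl;dr above retains everything for purposes discovered later.

```python
# Data minimization, GDPR-style: keep only what the declared purpose
# justifies. Field names and the purpose registry are illustrative.
PURPOSE_FIELDS = {"billing": {"name", "address", "card_token"}}

def minimize(record, purpose):
    """Drop every attribute not justified by the declared purpose."""
    allowed = PURPOSE_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

raw = {
    "name": "A. Person",
    "address": "1 Main St",
    "card_token": "tok_123",
    "browsing_history": ["..."],  # kept "just in case" in Big Data practice
    "location_trail": ["..."],
}

print(minimize(raw, "billing"))
# {'name': 'A. Person', 'address': '1 Main St', 'card_token': 'tok_123'}
# A Big Data pipeline would instead retain the extra fields, hoping some
# future analysis finds them predictive, which is exactly the practice
# the minimization and purpose-limitation principles target.
```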

Rebuttal

<snide>Apparently this was not known before the activists captured the legislature and effected their ends with the force of law. Now we know. Yet we all must obey the law, as it stands and as it is written. And why was this not published in an EU-located law journal, perhaps one located in … Brussels?</snide>

Contents

I. INTRODUCTION AND ROAD MAP
II. A BRIEF PRIMER ON BIG DATA AND THE LAW
III. THE GDPR’S INCOMPATIBILITY – FOUR EXAMPLES
A. Purpose Limitation
B. Data Minimization
C. Special Categories
D. Automated Decisions
IV. CONCLUSION: WHAT’S NEXT FOR EUROPE

References

There are 123 references, manifested as footnotes in the legal style.

Separately noted.

The Spread of Mass Surveillance, 1995 to Present (Big Data Innovation Transfer and Governance in Emerging High Technology States) | CPS

Nadiya Kostyuk, Muzammil M. Hussain (CPS); The Spread of Mass Surveillance, 1995 to Present; In Their Blog at the Center for Political Studies (CPS), Institute for Social Research, University of Michigan; 2017-09-01.
Previously presented at the 2017 Annual Meeting of the American Political Science Association (APSA); the presentation, titled “Big Data Innovation Transfer and Governance in Emerging High Technology States,” was a part of the session “The Role of Business in Information Technology and Politics” on Friday 2017-09-01.

tl;dr → an exercise in documentation; factoids are developed; a diversity is shown.
<quote>The observed cases in our study differ in scope and impact.</quote>

Mentions

  • Aadhaar, a national ID program, India.
  • Social Credit System, China.

Factoids

Fields per row: category (arbitrary); total spend (USD); spend/individual (USD); span (count of persons); coverage of universe; fun.

  • nations worldwide; $27.1B (or more); $7; 4.138B; 73% of world population
  • stable autocracies, authoritarian regimes; $10.967B; $lower–$110; 0.1B; 81% of their populations; fun: the upper figure is 11X more than “other regime type”
  • advanced democracies; $8.909B; $11; 0.812B; 74% of their population
  • high-spending dictatorships and democracies, developing and emerging democracies; $4.784B; $1–2; 2.875B; 72% of their population

AI and ‘Enormous Data’ Could Make Tech Giants Harder to Topple | Wired

AI and ‘Enormous Data’ Could Make Tech Giants Harder to Topple; In Wired; 2017-07-13.

tl;dr → <quote>such releases don’t usually offer much of value to potential competitors. </quote> They are promotional and self-serving.

Mentions

  • TensorFlow
  • Common Visual Data Foundation
    • open image data sets
    • A “nonprofit”
    • Sponsors
      • Facebook
      • Microsoft
  • Other data sets
    • from YouTube, by Google
    • from Wikipedia, by Salesforce

Scope

  • Google
  • Microsoft
  • and others!
    • Salesforce
    • Uber
  • Manifold, a boutique
  • Fast.ai, a boutique

Quoted

  • Luke de Oliveira
    • partner, Manifold
    • temp staff, visitor, Lawrence Berkeley National Lab
  • Abhinav Gupta, Carnegie Mellon University (CMU)
  • Rachel Thomas, cofounder, Fast.ai

Argot

  • Enormous Data → Are you kidding me? Do you even use computers?
  • incumbents’ usual data advantage → Buzzzzz!
  • innovative and un-monopolistic by disruption → appears in the 1st paragraph

Referenced

The Wikitext Long Term Dependency Language Modeling Dataset; On Some Site

  • an announcement, but WHEN?


The Princeton Web Transparency And Accountability Project | Narayanan, Reisman

Arvind Narayanan, Dillon Reisman; The Princeton Web Transparency and Accountability Project; In Tania Cerquitelli, Daniele Quercia, Frank Pasquale (editors); Transparent Data Mining for Big and Small Data; Springer; 2017.

tl;dr → There be dragons. Princeton is there. Tell it! Testify!

Abstract

When you browse the web, hidden “third parties” collect a large amount of data about your behavior. This data feeds algorithms to target ads to you, tailor your news recommendations, and sometimes vary prices of online products. The network of trackers comprises hundreds of entities, but consumers have little awareness of its pervasiveness and sophistication. This chapter discusses the findings and experiences of the Princeton Web Transparency Project, which continually monitors the web to uncover what user data companies collect, how they collect it, and what they do with it. We do this via a largely automated monthly “census” of the top 1 million websites, in effect “tracking the trackers”. Our tools and findings have proven useful to regulators and investigative journalists, and have led to greater public awareness, the cessation of some privacy-infringing practices, and the creation of new consumer privacy tools. But the work raises many new questions. For example, should we hold websites accountable for the privacy breaches caused by third parties? The chapter concludes with a discussion of such tricky issues and makes recommendations for public policy and regulation of privacy.
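
For flavor, here is a drastically simplified census in the spirit of the project. This is not the project's actual pipeline (which, per the chapter, drives full browsers with OpenWPM at 1-million-site scale); it only fetches one page and lists the third-party hosts its static HTML loads scripts from, so dynamically injected trackers, cookies, and fingerprinting are invisible to it.

```python
# Minimal "tracking the trackers" sketch: list third-party script hosts
# referenced in a page's static HTML. Illustrative only; misses anything
# loaded dynamically, which is most of the modern tracking ecosystem.
from urllib.parse import urlparse
from urllib.request import urlopen
from html.parser import HTMLParser

class ScriptSrcParser(HTMLParser):
    """Collect the src attribute of every <script> tag."""
    def __init__(self):
        super().__init__()
        self.srcs = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def third_party_script_hosts(url):
    first_party = urlparse(url).hostname
    html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
    parser = ScriptSrcParser()
    parser.feed(html)
    hosts = {urlparse(s).hostname for s in parser.srcs}
    # Relative srcs parse to a None hostname: those are first-party.
    return {h for h in hosts if h and h != first_party}

print(third_party_script_hosts("https://example.com/"))
```

Run over a large site list and aggregated by host, even this crude count surfaces the handful of entities that appear on a large fraction of the web, which is the census's core observation.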

Mentions

  • Marvin Minsky
  • expert systems
  • Machine Learning
  • Artificial Intelligence
  • Big Data
  • Netflix
  • Self-Driving Cars
  • collect data first, ask questions later
  • surveillance infrastructure
  • Kafkaesque
  • data and algorithmic transparency
  • Workshop on Data and Algorithmic Transparency
  • Princeton Web Transparency and Accountability Project (WebTAP)
    • Princeton Web Census
  • Privacy scholar
  • Ryan Calo
  • The Party System
  • first party
  • third party
  • Twitter
  • Facebook
  • Facebook Like Button
  • The Beauty and the Beast Project
  • Panopticlick
  • Anonymous
  • Pseudonymous
  • biases
  • discrimination
  • targeted political messaging
  • price discrimination
  • market manipulation
  • AdChoices
  • ad blockers
  • Federal Trade Commission (FTC)
  • Optimizely
  • A/B Testing
  • OpenWPM (Open Web Privacy Measurement)
  • FourthParty
  • FPDetective
  • PhantomJS
  • Firefox
  • Tor
  • Facebook Connect
  • Google Single Sign-On (SSO)
  • longitudinal studies
  • HTML5, Canvas API
  • canvas fingerprinting
  • AddThis
  • AudioContext API
  • WebRTC API
  • Battery Status API
  • NSA (National Security Agency)
  • Snowden
  • Cookies
  • transitive cookie linking
  • HTTPS
  • cookie syncing
  • Google
  • Facebook
  • Federal Trade Commission (FTC)
  • yahoo.com
  • Cross-Device Tracking
  • header enrichment (by ISPs)
  • Ghostery
  • AdBlock Plus
  • uBlock Origin
  • machine learning classifier (for tracking behavior)
  • Big Data (they used Big Data and Machine Learning Classifiers)
  • Nudge (a book)
  • Choice Architecture
  • 3rd Party Cookies, blocking 3rd party cookies
  • Do Not Track
  • Battery API
  • Internet Explorer
  • zero sum game
  • power user interfaces
  • PGP (Pretty Good Privacy)
  • Cookie Blocking
  • <buzz>long tail (of innovation)</buzz>
  • Children’s Online Privacy Protection Act (COPPA)
  • child-directed websites.
  • American Civil Liberties Union (ACLU)
  • Computer Fraud and Abuse Act
  • Personally-Identifiable Information (PII)
  • shift of power, from 3rd parties to publishers
  • Columbia University
  • Carnegie Mellon University
  • Internet of Things (IoT)
  • WiFi
  • cross-device tracking
  • smartphone app
  • Fairness, Accountability and Transparency in Machine Learning (FAT-ML)
  • Princeton

Quoted

  • “The best minds of our generation are thinking about how to make people click on ads” attributed to Jeff Hammerbacher

References

  • Crevier D (1993) AI: The tumultuous history of the search for artificial intelligence. Basic Books, Inc.
  • Engle Jr RL, Flehinger BJ (1987) Why expert systems for medical diagnosis are not being generally used: a valedictory opinion. Bulletin of the New York Academy of Medicine 63(2):193
  • Vance A (2011) This tech bubble is different. Bloomberg
  • Angwin J (2016) Machine bias: Risk assessments in criminal sentencing. ProPublica
  • Levin S (2016) A beauty contest was judged by AI and the robots didn’t like dark skin. The Guardian
  • Solove DJ (2001) Privacy and power: Computer databases and metaphors for information privacy. Stanford Law Review pp 1393–1462
  • Marthews A, Tucker C (2015) Government surveillance and internet search behavior. ssrn:2412564
  • Hannak A, Soeller G, Lazer D, Mislove A, Wilson C (2014) Measuring price discrimination and steering on e-commerce web sites. In: Proceedings of the 2014 Conference on Internet Measurement Conference, ACM, pp 305–318
  • Calo R (2013) Digital market manipulation. University of Washington School of Law Research Paper 2013-27 DOI 10.2139/ssrn.2309703 ssrn:2309703
  • Mayer JR, Mitchell JC (2012) Third-party web tracking: Policy and technology. In: Proceedings of the 2012 IEEE Symposium on Security and Privacy, IEEE, pp 413–427
  • Angwin J (2010) The web’s new gold mine: Your secrets. The Wall Street Journal
  • Lerner A, Simpson AK, Kohno T, Roesner F (2016) Internet jones and the raiders of the lost trackers: An archaeological study of web tracking from 1996 to 2016. In: Proceedings of the 25th USENIX Security Symposium (USENIX Security 16)
  • Laperdrix P, Rudametkin W, Baudry B (2016) Beauty and the beast: Diverting modern web browsers to build unique browser fingerprints. In: Proceedings of the 37th IEEE Symposium on Security and Privacy (S&P 2016)
  • Eckersley P (2010) How unique is your web browser? In: International Symposium on Privacy Enhancing Technologies Symposium, Springer, pp 1–18
  • Acar G, Van Alsenoy B, Piessens F, Diaz C, Preneel B (2015) Facebook tracking through social plug-ins. Technical report prepared for the Belgian Privacy Commission
  • Starov O, Gill P, Nikiforakis N (2016) Are you sure you want to contact us? quantifying the leakage of pii via website contact forms. In: Proceedings on Privacy Enhancing Technologies 2016(1):20–33
  • Krishnamurthy B, Naryshkin K, Wills C (2011) Privacy leakage vs. protection measures: the growing disconnect. In: Proceedings of the Web, vol 2, pp 1–10
  • Su J, Shukla A, Goel S, Narayanan A (2017) De-anonymizing web browsing data with social networks, manuscript
  • Barocas S, Nissenbaum H (2014) Big data’s end run around procedural privacy protections. In Communications of the ACM 57-11:31-33
  • Shilton K, Greene D (2016) Because privacy: defining and legitimating privacy in ios development. In IConference 2016 Proceedings
  • Storey G, Reisman D, Mayer J, Narayanan A (2016) The future of ad blocking: Analytical framework and new techniques, manuscript
  • Narayanan A (2016) Can Facebook really make ads unblockable? In Freedom to Tinker
  • Storey G (2016) Facebook ad highlighter.
  • Reisman D (2016) A peek at A/B testing in the wild. In Freedom to Tinker
  • Acar G, Juarez M, Nikiforakis N, Diaz C, Gürses S, Piessens F, Preneel B (2013) Fpdetective: dusting the web for fingerprinters. In: Proceedings of the 2013 ACM SIGSAC Conference on Computer & Communications Security, ACM, pp 1129–1140
  • Englehardt S, Narayanan A (2016) Online tracking: A 1-million-site measurement and analysis. In: Proceedings of the 2016 ACM SIGSAC Conference on Computer & Communications Security
  • Selenium HQ (2016) Selenium browser automation faq.
  • Acar G, Eubank C, Englehardt S, Juarez M, Narayanan A, Diaz C (2014) The web never forgets. In Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security (CCS ’14) DOI 10.1145/2660267.2660347
  • Mowery K, Shacham H (2012) Pixel perfect: Fingerprinting canvas in html5. In Proceedings of W2SP
  • Vasilyev V (Valve) (2016) Fingerprintjs2 — modern & flexible browser fingerprinting library, a successor to the original fingerprintjs.
  • Olejnik Ł, Acar G, Castelluccia C, Diaz C (2015) The leaking battery. In: International Workshop on Data Privacy Management, Springer, pp 254–263
  • Englehardt S, Narayanan A (2016) Online tracking: A 1-million-site measurement and analysis. In Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security (CCS ’16)
  • Doty N (2016) Mitigating browser fingerprinting in web specifications.
  • Soltani A, Peterson A, Gellman B (2013) NSA uses Google cookies to pinpoint targets for hacking. In The Washington Post.
  • Englehardt S, Reisman D, Eubank C, Zimmerman P, Mayer J, Narayanan A, Felten EW (2015) Cookies that give you away. In Proceedings of the 24th International Conference on World Wide Web (WWW ’15) DOI 10.1145/2736277.2741679
  • Angwin J (2016) Google has quietly dropped ban on personally identifiable web tracking. ProPublica
  • Reitman R (2012) What actually changed in Googles privacy policy. Electronic Frontier Foundation
  • Simonite T (2015) Facebooks like buttons will soon track your web browsing to target ads. MIT Technology Review
  • Federal Trade Commission (2015) Cross-device tracking.
  • Maggi F, Mavroudis V (2016) Talking behind your back attacks & countermeasures of ultrasonic cross-device tracking, In Proceedings of Blackhat
  • Angwin J (2014) Why online tracking is getting creepier. ProPublica
  • Vallina-Rodriguez N, Sundaresan S, Kreibich C, Paxson V (2015) Header enrichment or ISP enrichment? In Proceedings of the 2015 ACM SIGCOMM Workshop on Hot Topics in Middleboxes and Network Function Virtualization (HotMiddlebox ’15) DOI 10.1145/2785989.2786002
  • Disconnect (2016) Disconnect blocks new tracking device that makes your computer draw a unique image.
  • Electronic Frontier Foundation (2016) Privacy Badger.
  • Thaler RH, Sunstein CR (2008) Nudge: improving decisions about health, wealth, and happiness. Yale University Press
  • Fleishman G (2015) Hands-on with content blocking safari extensions in ios 9. Macworld.
  • Blink, Chromium (2016) Owp storage team sync.
  • Lynch B (2012) Do not track in the windows 8 setup experience – microsoft on the issues. Microsoft on the Issues
  • Hern A (2016) Firefox disables loophole that allows sites to track users via battery status. The Guardian
  • Mozilla (2015) Tracking protection in private browsing.
  • Mozilla (2016) Security/contextual identity project/containers.
  • Federal Trade Commission (2012) Google will pay $22.5 million to settle FTC charges it misrepresented privacy assurances to users of apple’s safari internet browser.
  • Federal Trade Commission (2016) Children’s online privacy protection rule (“COPPA”).
  • New York State Office of the Attorney General (2016) A.G. schneiderman announces results of “operation child tracker,” ending illegal online tracking of children at some of nation’s most popular kids’ websites.
  • American Civil Liberties Union (2016) Sandvig v. Lynch.
  • Eubank C, Melara M, Perez-Botero D, Narayanan A (2013) Shining the floodlights on mobile web tracking a privacy survey.
  • CMU CHIMPS Lab (2015) Privacy grade: Grading the privacy of smartphone apps.
  • Vanrykel E, Acar G, Herrmann M, Diaz C (2016) Leaky birds: Exploiting mobile application traffic for surveillance. In Proceedings of Financial Cryptography and Data Security 2016
  • Lécuyer M, Ducoffe G, Lan F, Papancea A, Petsios T, Spahn R, Chaintreau A, Geambasu R (2014) Xray: Enhancing the webs transparency with differential correlation. In: Proceedings of the 23rd USENIX Security Symposium (USENIX Security 14), pp 49–64
  • Lecuyer M, Spahn R, Spiliopolous Y, Chaintreau A, Geambasu R, Hsu D (2015) Sunlight: Fine-grained targeting detection at scale with statistical confidence. In: Proceedings of the 22nd ACM SIGSAC Conference on Computer and Communications Security, ACM, pp 554–566
  • Tschantz MC, Datta A, Datta A, Wing JM (2015) A methodology for information flow experiments. In: Proceedings of the 2015 IEEE 28th Computer Security Foundations Symposium, IEEE, pp 554–568
  • Datta A, Sen S, Zick Y (2016) Algorithmic transparency via quantitative input influence. In: Proceedings of 37th IEEE Symposium on Security and Privacy
  • Chen L, Mislove A, Wilson C (2015) Peeking beneath the hood of uber. In: Proceedings of the 2015 ACM Conference on Internet Measurement Conference, ACM, pp 495–508
  • Valentino-Devries J, Singer-Vine J, Soltani A (2012) Websites vary prices, deals based on users information. In The Wall Street Journal 10:60–68
  • Android Studio User Guide (2016) UI/Application Exerciser Monkey.
  • Rastogi V, Chen Y, Enck W (2013) Appsplayground: automatic security analysis of smartphone applications. In: Proceedings of the third ACM Conference on Data and Application Security and Privacy, ACM, pp 209–220
  • Enck W, Gilbert P, Han S, Tendulkar V, Chun BG, Cox LP, Jung J, McDaniel P, Sheth AN (2014) Taintdroid: an information-flow tracking system for realtime privacy monitoring on smartphones. In ACM Transactions on Computer Systems (TOCS) 32(2):5
  • Ren J, Rao A, Lindorfer M, Legout A, Choffnes D (2015) Recon: Revealing and controlling privacy leaks in mobile network traffic. arXiv:1507.00255.
  • Razaghpanah A, Vallina-Rodriguez N, Sundaresan S, Kreibich C, Gill P, Allman M, Paxson V (2015) Haystack: in situ mobile traffic analysis in user space. arXiv:1510.01419.
  • Sweeney L (2013) Discrimination in online ad delivery. Queue 11(3):10
  • Caliskan-Islam A, Bryson J, Narayanan A (2016) Semantics derived automatically from language corpora necessarily contain human biases. arxiv:1608.07187.


How to Call B.S. on Big Data: A Practical Guide | New Yorker

How to Call B.S. on Big Data: A Practical Guide; In The New Yorker; 2017-06-03.

Occasion

INFO 198/BIOL 106B (callingbullshit.org) – “Calling Bullshit in the Age of Big Data,” a course, University of Washington (Washington State, that is, located in Seattle WA). Instructors: Jevin West (information), Carl Bergstrom (biology).


The Death of Rules and Standards | Casey, Niblett

Anthony J. Casey, Anthony Niblett; The Death of Rules and Standards; Coase-Sandor Working Paper Series in Law and Economics No. 738; Law School, University of Chicago; 2015; 58 pages; landing, copy, ssrn:2693826, draft.

Abstract

Scholars have examined the lawmakers’ choice between rules and standards for decades. This paper, however, explores the possibility of a new form of law that renders that choice unnecessary. Advances in technology (such as big data and artificial intelligence) will give rise to this new form – the micro-directive – which will provide the benefits of both rules and standards without the costs of either.

Lawmakers will be able to use predictive and communication technologies to enact complex legislative goals that are translated by machines into a vast catalog of simple commands for all possible scenarios. When an individual citizen faces a legal choice, the machine will select from the catalog and communicate to that individual the precise context-specific command (the micro-directive) necessary for compliance. In this way, law will be able to adapt to a wide array of situations and direct precise citizen behavior without further legislative or judicial action. A micro-directive, like a rule, provides a clear instruction to a citizen on how to comply with the law. But, like a standard, a micro-directive is tailored to and adapts to each and every context.

While predictive technologies such as big data have already introduced a trend toward personalized default rules, in this paper we suggest that this is only a small part of a larger trend toward context-specific laws that can adapt to any situation. As that trend continues, the fundamental cost trade-off between rules and standards will disappear, changing the way society structures and thinks about law.
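
A toy rendering of the idea, assuming a driving-speed setting for illustration; the contexts, rules, and numbers are invented, not the authors'. The legislative goal is compiled in advance into a catalog of simple, context-specific commands, and the machine communicates exactly one of them at the moment of the legal choice.

```python
# Toy "micro-directive": a legislative goal ("drive safely") compiled
# ahead of time into one precise command per scenario. All rules and
# numbers below are illustrative.

def speed_micro_directive(weather, traffic, school_zone):
    """Return the precise speed limit (mph) for one driving context."""
    limit = 65                      # default embodied in the catalog
    if weather == "rain":
        limit = min(limit, 50)
    elif weather == "snow":
        limit = min(limit, 35)
    if traffic == "heavy":
        limit = min(limit, 40)
    if school_zone:
        limit = min(limit, 25)
    return limit

# Enumerate the full catalog of commands, one per possible scenario,
# as the paper imagines lawmakers' machines doing in advance.
catalog = {
    (w, t, s): speed_micro_directive(w, t, s)
    for w in ("clear", "rain", "snow")
    for t in ("light", "heavy")
    for s in (False, True)
}

# At decision time, the machine selects and communicates one command.
print(catalog[("rain", "heavy", False)])   # -> 40
```

The catalog behaves like a rule for the citizen (one clear number) while tracking the standard's sensitivity to context, which is the trade-off collapse the abstract describes.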

Separately noted.