Attribution by Design, a meta-promotion

Referenced

Promotions

The Evolving Data Landscape: Veracity, Convergence And Anonymity | Ad Exchanger

Ramsey McGrory (Mediaocean); The Evolving Data Landscape: Veracity, Convergence And Anonymity; In AdExchanger; 2017-09-21.
Ramsey McGrory, chief revenue officer at Mediaocean

tl;dr → something about accuracy of imputations in consumer profiles, accuracy of “data.”

Original Sources

Ramsey McGrory (AddThis); The Data Providers One Quadrant Chart To Rule Them All; 2013-02.
tl;dr → it’s a metaphor with four (4) quadrants induced by a 2-axis “system”; later a 3rd axis, a Z-axis; a sketch follows the axis list below.

  1. online ↔ offline
  2. anonymous → personal
  3. singleton → conglomerate
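
A minimal sketch (Python) of that axis “system” as a data structure. The three axes are McGrory’s; the field names, thresholds and example providers are assumptions of the sketch, not anything taken from the chart.

    # Two axes induce the four quadrants; the Z-axis (singleton -> conglomerate)
    # is carried along as a third coordinate. Thresholds and examples are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class DataProvider:
        name: str
        online: float        # 0.0 = offline ... 1.0 = online
        personal: float      # 0.0 = anonymous ... 1.0 = personal
        conglomerate: float  # Z-axis: 0.0 = singleton ... 1.0 = conglomerate

    def quadrant(p: DataProvider) -> str:
        """Collapse the first two axes into one of the four quadrants."""
        x = "online" if p.online >= 0.5 else "offline"
        y = "personal" if p.personal >= 0.5 else "anonymous"
        return f"{x} / {y}"

    if __name__ == "__main__":
        examples = [
            DataProvider("hypothetical web-pixel panel", online=0.9, personal=0.2, conglomerate=0.1),
            DataProvider("hypothetical offline credit bureau", online=0.1, personal=0.9, conglomerate=0.8),
        ]
        for p in examples:
            print(f"{p.name}: {quadrant(p)} (Z = {p.conglomerate})")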

Mentions

  • <quote>data being neither intrinsically “good” nor “bad,” but rather having “qualities.”</quote>, attributed to Ted McConnell.
  • behaviors drive actions.
  • Viewability
  • Verification
  • Something allegorical about Viewability and Trust & Safety vending as a separable service of attestation, 2012 → 2017.
  • <quote>Viewability speaks to a broader metadata theme of trust, as well as an underlying theme of data quality and users’ engagement with content delivered against this data.</quote>
  • Hey! That’s not a business, that’s a Business Unit;
    Hey! That’s not a BU, that’s a Product.
    Hey! That’s not a Product, that’s a Feature.
    <quote>Then, these vertical standalone organizations and solutions were horizontally integrated into the operating agencies as capabilities.</quote>

Claimed

  • SafeGraph <quote>works with universities and health organizations to understand movement data and the spread of infectious diseases.</quote>
  • [all] device IDs are persistent
  • <quote>there are growing trends toward people taking control of their anonymization through the use of virtual private networks and Tor</quote>
    • As stated:
      • casual consumer use of VPNs is prevalent [enough to measure]
      • casual consumer use of Tor is prevalent [enough to measure]
    • Contrast with:
      <surely>IPv6 use is prevalent,
      IPv6 use is prevalent enough to warrant dual-stack interfaces on the great centralized ad exchanges.</surely>
  • <quote>mobile, where cookies can’t be used</quote>
  • <quote>that major brands may view agencies as differentiated commodity services, put their media in review with greater frequency and bid them down.</quote>
  • The adtech bubble is ongoing; adtech will be forward-funded on an ongoing basis:<quote><snip/> will continue to be funded with massive capital because the opportunities for innovation and disruption are huge.</quote>

Framework

Three, nay Four, V’s of Data
  1. volume
  2. velocity
  3. variety
  4. veracity

Exemplars

Big (conglomerates)
  • Adobe
  • Amazon
  • Google
  • IBM
  • Oracle
  • Salesforce
  • SAP
Cross-Device Fingerprinting
Viewability
  • AdSafe
  • comScore
  • DoubleVerify
  • Moat
Safety
  • Amino
  • White Ops
Data Brokers
  • Experian
  • Acxiom
  • TransUnion
  • Equifax
Trading Desks
  • Xaxis of WPP
  • Nerve Center of VivaKi of Publicis
Data Breached
  • Yahoo
  • Equifax
Salubrious

Hearts & Science
<honorific>won major accounts on a transparent, data-centric and deeply integrated vision.</honorific>

Who

  • Ted McConnell, practitioner.
    <quote>Ted McConnell, an independent consultant in the digital marketing space.</quote>

Referenced

Previously

In Ad Exchanger

Argot

The Suitcase Words
  • “data truth”
  • “moat for data”
  • “truth of the inference”
  • intenders, as “auto intenders”
  • attitudes
  • demographics
  • measurement
  • cross-device,
    cross-device mapping.
  • deterministic
  • Television
    • connected television
    • addressable television
    • advanced television
    • data-enabled television
    • targeted television
    • integrated television and video,
      integrated television and video initiatives
  • strategic elements
    strategic elements of advertising campaigns.
  • holistic planning
  • anonymous data
  • digital data
  • ad block
  • cookie block
  • mobile,
    growth of mobile.
  • device IDs
  • persistent
  • anonymity,
    desire for anonymity.
  • breaches,
    data breaches,
    massive data breaches

    • Yahoo
    • Equifax
  • sensitive information
    • Social Security numbers
    • birthdates
    • credit card numbers
  • collaboration
  • competition
  • companies,
    services companies,
    technology and services companies
  • space,
    media space.
  • to verb… with large agencies
    • partner
    • coexist
    • compete
  • Z-axis
  • execute,
    acquisitively execute,
    aggressively and acquisitively execute,
    continue to aggressively and acquisitively execute,
    continue to aggressively and acquisitively execute on their strategies,
    continue to aggressively and acquisitively execute on their strategies to deliver on

    • infrastructure
    • data
    • services
  • agencies,
    holding company agencies.
  • solutions,
    data-driven solutions,
    converged, data-driven solutions.
  • vision,
    • transparent vision
    • data-centric vision
    • integrated vision,
      deeply integrated vision.
  • The side,
    • The downside
    • The upside
  • brands,
    major brands.
  • services,
    commodity services,
    differentiated commodity services,
    agencies as differentiated commodity services.
  • themes
    • convergence
    • data activation
    • people
  • change,
    great change,
    in a time of such great change,
    wait for it … wait for it … the only constant is change … thank you, thank you very much, I’ll be here all week.
  • The Bottom Line
  • In a world of…
  • <adjective> data,
    • first-party data
    • third-party data
    • personal data
    • census data
    • anonymous data
    • panel data,
      <mmmmm>…panel data..…</mmmmm>
    • pixel data
  • understanding,
    deeper understanding,
    deeper understanding of consumers’ …
    deeper understanding of consumers’ awareness and interests,
    deeper understanding of consumers’ awareness and interests while enjoying <snip/> profitability,
    deeper understanding of consumers’ awareness and interests while enjoying short- and long-term profitability,
    deeper understanding of consumers’ awareness and interests while enjoying short- and long-term profitability of their brands.
  • vision,
    this vision,
    delivering on this vision.
  • infrastructure,
    data infrastructure,
    extensive data infrastructure.
  • understanding,
    deep understanding,
    deep understanding of

    • advertising
    • publishing,
      media publishing
    • ecommerce
  • ecosystems,
    technology ecosystems,

    • advertising technology ecosystems
    • marketing technology ecosystems
    • content technology ecosystems
  • ecosystems,
    the ecosystems,
    all the ecosystems,
    And across all the ecosystems
  • <noun> of data
    • consolidation of data
    • standardization of data
    • interpretation of data
    • activation
  • winners
    winners and losers
    winners and losers will be decided.
  • transformation,
    massive transformation,
    enable massive transformation,
    enable massive transformation at <snip/> lower costs.
    enable massive transformation at materially lower costs.

Situating Methods in the Magic of Big Data and Artificial Intelligence | Elish, Boyd

M. C. Elish (Columbia); danah boyd (Microsoft); Situating Methods in the Magic of Big Data and Artificial Intelligence; In Communication Monographs, Forthcoming; 2017-09-21; 30 pages; ssrn:3040201

tl;dr → they’re doing it wrong; an explainer.
and → <quote>we problematize the myths that animate the supposed “magic” of these systems</quote>

Abstract

“Big Data” and “artificial intelligence” have captured the public imagination and are profoundly shaping social, economic, and political spheres. Through an interrogation of the histories, perceptions, and practices that shape these technologies, we problematize the myths that animate the supposed “magic” of these systems. In the face of an increasingly widespread blind faith in data-driven technologies, we argue for grounding machine learning-based practices and untethering them from hype and fear cycles. One path forward is to develop a rich methodological framework for addressing the strengths and weaknesses of doing data analysis. Through provocatively reimagining machine learning as computational ethnography, we invite practitioners to prioritize methodological reflection and recognize that all knowledge work is situated practice.

Argot

The Suitcase Words

Indeed. Let’s see if the ’bot can pick ’em out; a toy filter follows the stop-word list below. The vein is deep and wide here.

Positive Truth

interrogation, problematize, myth, data-driven (empirical), grounding, practices, untethering, methodological framework, analysis, reimaginatively, provocatively reimagining, ethnography, computational ethnography, practitioners, reflection, methodological reflection, practice, situated practice, methods, situated methods.

Negative Truth (Stop Words)

and, have, the, an, of, that, these, we, in, an, for, from, one, is, to, a, through, as, we, all.
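
As to whether the ’bot can pick ’em out: a minimal sketch (Python), assuming only the stop-word list above and one sentence of the abstract; a real extraction pipeline would stem, weight and rank rather than merely subtract.

    # Drop the "negative truth" stop words and count what survives.
    # Tokenization (lowercase, letters only, length > 2) is an assumption of this sketch.
    import re
    from collections import Counter

    STOP_WORDS = {
        "and", "have", "the", "an", "of", "that", "these", "we", "in",
        "for", "from", "one", "is", "to", "a", "through", "as", "all",
    }

    def candidate_terms(text: str) -> Counter:
        tokens = re.findall(r"[a-z]+", text.lower())
        return Counter(t for t in tokens if t not in STOP_WORDS and len(t) > 2)

    if __name__ == "__main__":
        # One sentence of the Elish/boyd abstract, quoted above.
        abstract = (
            "Through an interrogation of the histories, perceptions, and practices "
            "that shape these technologies, we problematize the myths that animate "
            "the supposed magic of these systems."
        )
        for term, count in candidate_terms(abstract).most_common(10):
            print(f"{count:2d}  {term}")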

The Undue Influence of Surveillance Technology Companies on Policing | Elizabeth Joh

Elizabeth E. Joh; The Undue Influence of Surveillance Technology Companies on Policing; In Law Review Online, New York University (NYU); 2017-09; N pages; landing.
Elizabeth E. Joh is Professor of Law, School of Law, U.C. Davis.

Abstract

Conventional wisdom assumes that the police are in control of their investigative tools. But with surveillance technologies, this is not always the case. Increasingly, police departments are consumers of surveillance technologies that are created, sold, and controlled by private companies. These surveillance technology companies exercise an undue influence over the police today in ways that aren’t widely acknowledged, but that have enormous consequences for civil liberties and police oversight. Three seemingly unrelated examples—stingray cellphone surveillance, body cameras, and big data software—demonstrate varieties of this undue influence. The companies which provide these technologies act out of private self-interest, but their decisions have considerable public impact. The harms of this private influence include the distortion of Fourth Amendment law, the undermining of accountability by design, and the erosion of transparency norms. This Essay demonstrates the increasing degree to which surveillance technology vendors can guide, shape, and limit policing in ways that are not widely recognized. Any vision of increased police accountability today cannot be complete without consideration of the role surveillance technology companies play.

Contents

  1. INTRODUCTION
  2. I. EXAMPLES OF UNDUE INFLUENCE
    1. Stingray Cellphone Surveillance and Nondisclosure Agreements
      1. Nondisclosure Agreements
      2. Stingrays and the Fourth Amendment
      3. Secret Stingray Use
    2. Cornering the Market on Police Body Cameras
      1. When Product Design Is Policy
      2. Market Dominance
    3. Big Data Software and Proprietary Information
  3. II. THE HARMS OF UNDUE INFLUENCE
    1. Fourth Amendment Distortion
    2. Accountability by Design
    3. Outsourcing Suspicion and Obscuring Transparency
  4. III. MINIMIZING UNDUE INFLUENCE
    1. Local Surveillance Oversight
    2. Public Records Requests as Oversight
  5. CONCLUSION

Toward a Fourth Law of Robotics: Preserving Attribution, Responsibility, and Explainability in an Algorithmic Society | Pasquale

Frank A. Pasquale III; Toward a Fourth Law of Robotics: Preserving Attribution, Responsibility, and Explainability in an Algorithmic Society; Ohio State Law Journal, Vol. 78, 2017, U of Maryland Legal Studies Research Paper No. 2017-21; 2017-07-14; 13 pages; ssrn:3002546.

tl;dr → A comment for Balkin. To wit:
  1. Balkin should have supplied more context; such correction is supplied herewith.
  2. More expansive supervision is indicated; such expansion is supplied herewith.
  3. Another law is warranted; not a trinity, but perfection plus one more

Four Laws, here and previous:

  1. machine operators are always responsible for their machines.
  2. businesses are always responsible for their operators.
  3. machines must not pollute.
  4. A [machine] must always indicate the identity of its creator, controller, or owner.

Love the erudition; but this is just like planes, trains & automobiles.

Separately noted.

The Three Laws of Robotics in the Age of Big Data | Balkin

Jack M. Balkin  (Yale); The Three Laws of Robotics in the Age of Big Data; Ohio State Law Journal, Vol. 78, (2017), Forthcoming (real soon now, RSN), Yale Law School, Public Law Research Paper No. 592; 2016-12-29 → 2017-09-10; 45 pages; ssrn:2890965.

tl;dr → administrative laws [should be] directed at human beings and human organizations, not at [machines].

Laws

  1. machine operators are responsible
    [for the operations of their machines, always & everywhere]
  2. businesses are responsible
    [for the operation of their machines, always & everywhere]
  3. machines must not pollute
    [in a sense to be defined later: e.g. by a "tussle"]

Love the erudition; but none of this is new.

Separately noted.

Big Data Surveillance: The Case of Policing | Sarah Brayne

Abstract

This article examines the intersection of two structural developments: the growth of surveillance and the rise of “big data.” Drawing on observations and interviews conducted within the Los Angeles Police Department, I offer an empirical account of how the adoption of big data analytics does—and does not—transform police surveillance practices. I argue that the adoption of big data analytics facilitates amplifications of prior surveillance practices and fundamental transformations in surveillance activities.

  1. First, discretionary assessments of risk are supplemented and quantified using risk scores.
  2. Second, data are used for predictive, rather than reactive or explanatory, purposes.
  3. Third, the proliferation of automatic alert systems makes it possible to systematically surveil an unprecedentedly large number of people.
  4. Fourth, the threshold for inclusion in law enforcement databases is lower, now including individuals who have not had direct police contact.
  5. Fifth, previously separate data systems are merged, facilitating the spread of surveillance into a wide range of institutions.

Based on these findings, I develop a theoretical model of big data surveillance that can be applied to institutional domains beyond the criminal justice system. Finally, I highlight the social consequences of big data surveillance for law and social inequality.

Conclusions

Through a case study of the Los Angeles Police Department, this article analyzed the role of big data in surveillance practices. By socially situating big data, I examined why it was adopted, how it is used, and what the implications of its use are. Focusing on the interplay between surveillance practices, law, and technology offers new insights into social control and inequality. I argued that big data participates in and reflects existing social structures. Far from eliminating human discretion and bias, big data represents a new form of capital that is both a social product and a social resource. What data law enforcement collects, their methods for analyzing and interpreting it, and the way it informs their practice are all part of a fundamentally social process. Characterizing predictive models as “just math,” and fetishizing computation as an objective process, obscures the social side of algorithmic decision-making.

Individuals’ interpretation of data occurs in preexisting institutional, legal, and social settings, and it is through that interpretive process that power dynamics come into play. Use of big data has the potential to ameliorate discriminatory practices, but these findings suggest implementation is of paramount importance. As organizational theory and literature from science and technology studies suggests, when new technology is overlaid onto an old organizational structure, longstanding problems shape themselves to the contours of the new technology, and new unintended consequences are generated. The process of transforming individual actions into “objective” data raises fundamentally sociological questions that this research only begins to address. In many ways, it transposes classic concerns from the sociology of quantification about simplification, decontextualization, and the privileging of measurable complex social phenomena onto the big data landscape.

Surveillance is always ambiguous; it is implicated in both social inclusion and exclusion, and it creates both opportunities and constraints. The way in which surveillance helps achieve organizational goals and structure life chances may differ according to the individuals and institutions involved. Examining the means of big data surveillance across institutional domains is an open and timely line of inquiry, because once a new technology is disseminated in an institutional setting, it is difficult to scale back.

Mentions

  • Palantir
    • PredPol
    • Automatic License Plate Reader (ALPR)
    • Automatic Vehicle Locator (AVL)
    • Enterprise Master Person Index (EMPI)
  • Los Angeles Police Department (LAPD)
  • Big Data
  • risk scores
  • Smart Policing Initiative
  • Department of Homeland Security (DHS)
  • Federal Bureau of Investigation (FBI)
  • Central Intelligence Agency (CIA)
  • Immigration and Customs Enforcement (ICE)
  • Los Angeles County Sheriff’s Department (LASD)
  • Fusion Center
  • JRIC (Joint Regional Intelligence Center)
  • Real-Time Analysis and Critical Response (RACR) Division
  • Crime Intelligence Detail (CID)
  • Operation LASER (Los Angeles’ Strategic Extraction and Restoration program)
  • Parole Compliance Unit
  • Field Interview (FI)

Argot

  • surveillant assemblage
  • trigger mechanism
  • in the system (to be ‘in the system’)
  • net widening
  • decontextualization
Origin
(at least)
  • Deleuze
  • Guattari
  • Marx; no: not that guy; the other one.
  • Pasquale

Quotes

<quote>Originally intended for use in national defense, Palantir was initially partially funded by In-Q-Tel, the CIA’s venture capital firm. Palantir now has government and commercial customers, including the CIA, FBI, ICE, LAPD, NYPD, NSA, DHS, and J.P. Morgan. JRIC (the Southern California fusion center) started using Palantir in 2009, with the LAPD following shortly after. The use of Palantir has expanded rapidly through the Department, with regular training sessions and more divisions signing on each year. It has also spread throughout the greater L.A. region: in 2014, Palantir won the Request for Proposals to implement the statewide AB 109 administration program, which involves data integration and monitoring of the post-release community supervision population.</quote>

<quote>When I asked an officer to provide examples of why he stops people with high point values, he replied:

Yesterday this individual might have got stopped because he jaywalked. Today he mighta got stopped because he didn’t use his turn signal or whatever the case might be. So that’s two points . . . you could conduct an investigation or if something seems out of place you have your consensual stops.[7] So a pedestrian stop, this individual’s walking, “Hey, can I talk to you for a moment?” “Yeah what’s up?” You know, and then you just start filling out your card as he answers questions or whatever. And what it was telling us is who is out on the street, you know, who’s out there not necessarily maybe committing a crime but who’s active on the streets. You put the activity of . . . being in a street with maybe their violent background and one and one might create the next crime that’s gonna occur.</quote>

Promotions

References

  1. Angwin Julia. 2014. Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance. New York: Times Books.
  2. Ball Kirstie, Webster Frank, eds. 2007. The Intensification of Surveillance: Crime, Terrorism & Warfare in the Information Age. London, UK: Pluto Press.
  3. Barley Stephen R. 1986. “Technology as an Occasion for Structuring: Evidence from Observations of CT Scanners and the Social Order of Radiology Departments.” Administrative Science Quarterly 31(1):78–108.
  4. Barley Stephen R. 1996. “Technicians in the Workplace: Ethnographic Evidence for Bringing Work into Organization Studies.” Administrative Science Quarterly 41(3):404–441.
  5. Barocas Solon, Selbst Andrew D. 2016. “Big Data’s Disparate Impact.” California Law Review 104:671–732.
  6. Becker Howard S. 1963. Outsiders: Studies in the Sociology of Deviance. New York: Free Press.
  7. Beckett Katherine, Nyrop Kris, Pfingst Lori, Bowen Melissa. 2005. “Drug Use, Drug Possession Arrests, and the Question of Race: Lessons from Seattle.” Social Problems 52(3):419–41.
  8. Bittner Egon. 1967. “The Police on Skid-Row: A Study of Peace Keeping.” American Sociological Review 32(5):699–715.
  9. Bonczar Thomas P., Herberman Erinn J. 2014. “Probation and Parole in the United States, 2013.” Washington, DC: Bureau of Justice Statistics.
  10. Bowker Geoffrey C., Star Susan Leigh. 2000. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
  11. boyd danah, Crawford Kate. 2012. “Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon.” Information, Communication & Society 15(5):662–79.
  12. Braga Anthony A., Weisburd David L. 2010. Policing Problem Places: Crime Hot Spots and Effective Prevention. New York: Oxford University Press.
  13. Braverman Harry. 1974. Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century. New York: Monthly Review Press.
  14. Brayne Sarah. 2014. “Surveillance and System Avoidance: Criminal Justice Contact and Institutional Attachment.” American Sociological Review 79(3):367–91.
  15. Brown v. Plata. 2011. 563 U.S. 493.
  16. Browne Simone. 2015. Dark Matters: On the Surveillance of Blackness. Durham, NC: Duke University Press.
  17. Carson E. Ann. 2015. “Prisoners in 2014.” Washington, DC: Bureau of Justice Statistics.
  18. Christin Angèle. 2016. “From Daguerreotypes to Algorithms: Machines, Expertise, and Three Forms of Objectivity.” ACM Computers & Society 46(1):27–32.
  19. Cohen Stanley. 1985. Visions of Social Control: Crime, Punishment and Classification. Malden, MA: Polity Press.
  20. Deleuze Gilles, Guattari Felix. 1987. A Thousand Plateaus: Capitalism and Schizophrenia. Minneapolis: The University of Minnesota Press.
  21. DiMaggio Paul J., Powell Walter W. 1983. “The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields.” American Sociological Review 48(2):147–60.
  22. Duster Troy. 1997. “Pattern, Purpose and Race in the Drug War.” Pp. 206–287 in Crack in America: Demon Drugs and Social Justice, edited by Reinarman C., Levine H. G. Berkeley: University of California Press.
  23. Duster Troy. 2005. “Race and Reification in Science.” Science 307:1050–51.
  24. Epp Charles R., Maynard-Moody Steven, Haider-Markel Donald P. 2014. Pulled Over: How Police Stops Define Race and Citizenship. Chicago: University of Chicago Press.
  25. Ericson Richard V., Haggerty Kevin D. 1997. Policing the Risk Society. Toronto: University of Toronto Press.
  26. Ericson Richard V., Haggerty Kevin D. 2006. The New Politics of Surveillance and Visibility. Toronto: University of Toronto Press.
  27. Espeland Wendy N., Vannebo Berit I. 2007. “Accountability, Quantification and Law.” Annual Review of Law and Society 3:21–43.
  28. Executive Office of the President. 2014. “Big Data: Seizing Opportunities, Preserving Values.” Washington, DC: The White House.
  29. Feeley Malcolm M., Simon Jonathan. 1992. “The New Penology: Notes on the Emerging Strategy of Corrections and Its Implications.” Criminology 30(4):449–74.
  30. Ferguson Andrew G. 2015. “Big Data and Predictive Reasonable Suspicion.” University of Pennsylvania Law Review 63(2):327–410.
  31. Fiske John. 1998. “Surveilling the City: Whiteness, the Black Man and Democratic Totalitarianism.” Theory, Culture and Society 15(2):67–88.
  32. Fiske Susan T., Taylor Shelley E. 1991. Social Cognition, 2nd ed. New York: McGraw-Hill.
  33. Foucault Michel. 1977. Discipline and Punish: The Birth of the Prison. New York: Random House.
  34. Fourcade Marion, Healy Kieran. 2013. “Classification Situations: Life-Chances in the Neoliberal Era.” Accounting, Organizations and Society 38:559–72.
  35. Fourcade Marion, Healy Kieran. 2017. “Seeing like a Market.” Socioeconomic Review 15(1):9–29.
  36. Gandy Oscar H. 1993. The Panoptic Sort: A Political Economy of Personal Information. Boulder, CO: Westview Press.
  37. Gandy Oscar H. 2002. “Data Mining and Surveillance in the Post-9.11 Environment.” Presentation to the Political Economy Section, International Association for Media and Communication Research.
  38. Gandy Oscar H. 2009. Coming to Terms with Chance: Engaging Rational Discrimination and Cumulative Disadvantage. Farnham, UK: Ashgate Publishing.
  39. Garland David. 2001. The Culture of Control: Crime and Social Order in Contemporary Society. New York: Oxford University Press.
  40. Giddens Anthony. 1990. The Consequences of Modernity. Stanford, CA: Stanford University Press.
  41. Gilliom John. 2001. Overseers of the Poor: Surveillance, Resistance, and the Limits of Privacy. Chicago: University of Chicago Press.
  42. Gitelman Lisa, ed. 2013. Raw Data Is an Oxymoron. Cambridge, MA: MIT Press.
  43. Goffman Alice. 2014. On the Run: Fugitive Life in an American City. Chicago: University of Chicago Press.
  44. Goffman Erving. 1963. Stigma: Notes on the Management of Spoiled Identity. Englewood Cliffs, NJ: Prentice-Hall.
  45. Gross Samuel R., Possley Maurice, Stephens Kalara. 2017. “Race and Wrongful Convictions in the United States.” National Registry of Exonerations, Newkirk Center for Science and Society, University of California-Irvine.
  46. Gustafson Kaaryn S. 2011. Cheating Welfare: Public Assistance and the Criminalization of Poverty. New York: NYU Press.
  47. Guzik Keith. 2009. “Discrimination by Design: Predictive Data Mining as Security Practice in the United States’ ‘War on Terrorism.’” Surveillance and Society 7(1):1–17.
  48. Hacking Ian. 1990. The Taming of Chance. Cambridge, UK: Cambridge University Press.
  49. Haggerty Kevin D., Ericson Richard V. 2000. “The Surveillant Assemblage.” British Journal of Sociology 51(4):605–622.
  50. Harcourt Bernard E. 2006. Against Prediction: Profiling, Policing, and Punishing in an Actuarial Age. Chicago: University of Chicago Press.
  51. Hindmarsh Richard, Prainsack Barbara, eds. 2010. Genetic Suspects: Global Governance of Forensic DNA Profiling and Databasing. Cambridge, UK: Cambridge University Press.
  52. Innes Martin. 2001. “Control Creep.” Sociological Research Online 6:1–10.
  53. Joh Elizabeth E. 2016. “The New Surveillance Discretion: Automated Suspicion, Big Data, and Policing.” Harvard Law and Policy Review 10(1):15–42.
  54. Kitchin Rob. 2014. The Data Revolution: Big Data, Open Data, Data Infrastructures & Their Consequences. London, UK: SAGE.
  55. Kling Rob. 1991. “Computerization and Social Transformations.” Science, Technology & Human Values 16(3):342–67.
  56. Kohler-Hausmann Issa. 2013. “Misdemeanor Justice: Control without Conviction.” American Journal of Sociology 119(2):351–93.
  57. Laney Doug. 2001. “3D Data Management: Controlling Data Volume, Velocity, and Variety.” Stamford, CT: META Group.
  58. Langton Lynn, Berzofsky Marcus, Krebs Christopher, Smiley-McDonald Hope. 2012. “Victimizations Not Reported to the Police, 2006-2010.” Washington, DC: Bureau of Justice Statistics.
  59. Laub John H. 2014. “Understanding Inequality and the Justice System Response: Charting a New Way Forward.” New York: William T. Grant Foundation.
  60. Lazer David, Radford Jason. 2017. “Data ex Machina: Introduction to Big Data.” Annual Review of Sociology 43:19–39.
  61. Los Angeles Police Department. 2017. “Sworn Personnel by Rank, Gender, and Ethnicity (SPRGE) Report” (http://www.lapdonline.org/sworn_and_civilian_report).
  62. Lynch Michael, Cole Simon A., McNally Ruth, Jordan Kathleen. 2008. Truth Machine: The Contentious History of DNA Fingerprinting. Chicago: University of Chicago Press.
  63. Lyon David. 1994. The Electronic Eye: The Rise of Surveillance Society. Minneapolis: University of Minnesota Press.
  64. Lyon David, ed. 2003. Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination. New York: Routledge.
  65. Lyon David, ed. 2006. Theorizing Surveillance: The Panopticon and Beyond. New York: Polity.
  66. Lyon David. 2015. Surveillance after Snowden. New York: Polity.
  67. MacKenzie Donald, Muniesa Fabian, Siu Lucia, eds. 2007. Do Economists Make Markets? On the Performativity of Economics. Princeton, NJ: Princeton University Press.
  68. Manning Peter K. 2011. The Technology of Policing: Crime Mapping, Information Technology, and the Rationality of Crime Control. New York: NYU Press.
  69. Manning Peter K., Van Maanen John, eds. 1978. Policing: A View from the Street. New York: Random House.
  70. Marx Gary T. 1974. “Thoughts on a Neglected Category of Social Movement Participant: The Agent Provocateur and the Informant.” American Journal of Sociology 80(2):402–442.
  71. Marx Gary T. 1988. Undercover: Police Surveillance in America. Berkeley: University of California Press.
  72. Marx Gary T. 1998. “Ethics for the New Surveillance.” The Information Society 14(3):171–85.
  73. Marx Gary T. 2002. “What’s New About the ‘New Surveillance’? Classifying for Change and Continuity.” Surveillance and Society 1(1):9–29.
  74. Marx Gary T. 2016. Windows into the Soul: Surveillance and Society in an Age of High Technology. Chicago: University of Chicago Press.
  75. Mayer-Schönberger Viktor, Cukier Kenneth. 2013. Big Data: A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt.
  76. Merton Robert K. 1948. “The Self-Fulfilling Prophecy.” The Antioch Review 8(2):193–210.
  77. Meyer John W., Rowan Brian. 1977. “Institutionalized Organizations: Formal Structure as Myth and Ceremony.” American Journal of Sociology 83(2):340–63.
  78. Mohler George O., Short Martin B., Malinowski Sean, Johnson Mark, Tita George E., Bertozzi Andrea L., Brantingham P. Jeffrey. 2015. “Randomized Controlled Field Trials of Predictive Policing.” Journal of the American Statistical Association 110:1399–1411.
  79. Monahan Torin, Palmer Neal A. 2009. “The Emerging Politics of DHS Fusion Centers.” Security Dialogue 40(6):617–36.
  80. Moskos Peter. 2008. Cop in the Hood: My Year Policing Baltimore’s Eastern District. Princeton, NJ: Princeton University Press.
  81. Oxford American Dictionary of Current English. 1999. Oxford: Oxford University Press.
  82. Pager Devah. 2007. Marked: Race, Crime, and Finding Work in an Era of Mass Incarceration. Chicago: University of Chicago Press.
  83. Papachristos Andrew V., Hureau David M., Braga Anthony A. 2013. “The Corner and the Crew: The Influence of Geography and Social Networks on Gang Violence.” American Sociological Review 78(3):417–47.
  84. Pasquale Frank. 2014. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.
  85. Perry Walter L., McInnis Brian, Price Carter C., Smith Susan C., Hollywood John S. 2013. “Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations.” RAND Safety and Justice Program. Santa Monica, CA. Retrieved July 6, 2017 (http://www.rand.org/content/dam/rand/pubs/research_reports/RR200/RR233/RAND_RR233.pdf).
  86. Porter Theodore M. 1995. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton, NJ: Princeton University Press.
  87. Poster Mark. 1990. The Mode of Information. Chicago: University of Chicago Press.
  88. Quillian Lincoln, Pager Devah. 2001. “Black Neighbors, Higher Crime? The Role of Racial Stereotypes in Evaluations of Neighborhood Crime.” American Journal of Sociology 107(3):717–67.
  89. Ratcliffe Jerry. 2008. Intelligence-Led Policing. Cullompton, UK: Willan Publishing.
  90. Reiss Albert J. 1971. The Police and the Public. New Haven, CT: Yale University Press.
  91. Renan Daphna. 2016. “The Fourth Amendment as Administrative Governance.” Stanford Law Review 68(5):1039–1129.
  92. Rios Victor M. 2011. Punished: Policing the Lives of Black and Latino Boys. New York: NYU Press.
  93. Roush Craig R. 2012. “Quis Custodiet Ipsos Custodes? Limits on Widespread Surveillance and Intelligence Gathering by Local Law Enforcement after 9/11.” Marquette Law Review 96(1):315–76.
  94. Rule James B. 1974. Private Lives and Public Surveillance: Social Control in the Computer Age. New York: Schocken Books.
  95. Rule James B. 2007. Privacy in Peril: How We Are Sacrificing a Fundamental Right in Exchange for Security and Convenience. New York: Oxford University Press.
  96. Sampson Robert J., Bartusch Dawn J. 1998. “Legal Cynicism and (Subcultural?) Tolerance of Deviance: The Neighborhood Context of Racial Differences.” Law and Society Review 32:777–804.
  97. Scott Richard W. 1987. Organizations: Rational, Natural, and Open Systems. Englewood Cliffs, NJ: Prentice-Hall.
  98. Scott Richard W. 2004. “Reflections on a Half-Century of Organizational Sociology.” Annual Review of Sociology 30:1–21.
  99. Sherman Lawrence W. 2013. “Targeting, Testing and Tracking Police Services: The Rise of Evidence-Based Policing, 1975–2025.” Crime and Justice 42(1):377–451.
  100. Sherman Lawrence W., Gartin Patrick R., Buerger Michael E. 1989. “Hot Spots of Predatory Crime: Routine Activities and the Criminology of Place.” Criminology 27(1):27–55.
  101. Skogan Wesley G. 2006. Police and Community in Chicago: A Tale of Three Cities. New York: Oxford University Press.
  102. Smith v. Maryland. 1979. 442 U.S. 735.
  103. Soss Joe, Fording Richard C., Schram Sanford F. 2011. Disciplining the Poor: Neoliberal Paternalism and the Persistent Power of Race. Chicago: University of Chicago Press.
  104. Stuart Forrest. 2016. Down, Out & Under Arrest: Policing and Everyday Life in Skid Row. Chicago: University of Chicago Press.
  105. Terry v. Ohio. 1968. 392 U.S. 1.
  106. Tracy Paul E., Morgan Vincent. 2000. “Big Brother and His Science Kit: DNA Databases for 21st Century Crime Control.” Journal of Criminal Law and Criminology 90(2):635–90.
  107. Travis Jeremy, Western Bruce, Redburn Steve, eds. 2014. Growth of Incarceration in the United States: Exploring Causes and Consequences. Washington, DC: National Academy of Science.
  108. Uchida Craig D., Swatt Marc L. 2015. “Operation LASER and the Effectiveness of Hotspot Patrol: A Panel Analysis.” Police Quarterly 16(3):287–304.
  109. United States v. Jones. 2012. 132 S. Ct. 945, 565 U.S.
  110. United States v. Miller. 1939. 307 U.S. 174.
  111. U.S. Department of Justice. 2001 [2015]. “L.A. Consent Decree.” Washington, DC: U.S. Department of Justice.
  112. Wakefield Sara, Uggen Christopher. 2010. “Incarceration and Stratification.” Annual Review of Sociology 36:387–406.
  113. Wakefield Sara, Wildeman Christopher. 2013. Children of the Prison Boom: Mass Incarceration and the Future of American Inequality. New York: Oxford University Press.
  114. Waxman Matthew C. 2009. “Police and National Security: American Local Law Enforcement and Counter-Terrorism after 9/11.” Journal of National Security Law and Policy 3:377–407.
  115. Weber Max. 1978. Economy and Society: An Outline of Interpretive Sociology. Berkeley: University of California Press.
  116. Weisburd David, Mastrofski Stephen D., McNally Ann Marie, Greenspan Rosann, Willis James J. 2003. “Reforming to Preserve: COMPSTAT and Strategic Problem-Solving in American Policing.” Criminology and Public Policy 2:421–56.
  117. Western Bruce. 2006. Punishment and Inequality in America. New York: Russell Sage Foundation.
  118. Western Bruce, Pettit Becky. 2005. “Black-White Wage Inequality, Employment Rates, and Incarceration.” American Journal of Sociology 111(2):553–78.
  119. White House Police Data Initiative. 2015. “Launching the Police Data Initiative” (https://obamawhitehouse.archives.gov/blog/2015/05/18/launching-police-data-initiative).
  120. Whren v. United States. 1996. 517 U.S. 806.
  121. Willis James J., Mastrofski Stephen D., Weisburd David. 2007. “Making Sense of COMPSTAT: A Theory-Based Analysis of Organizational Change in Three Police Departments.” Law and Society Review 41(1):147–88.
  122. Wilson James Q. 1968. Varieties of Police Behavior: The Management of Law and Order in Eight Communities. Cambridge, MA: Harvard University Press.

Incompatible: The GDPR in the Age of Big Data | Tal Zarsky

Tal Zarsky (Haifa); Incompatible: The GDPR in the Age of Big Data; Seton Hall Law Review, Vol. 47, No. 4(2), 2017; 2017-08-22; 26 pages; ssrn:3022646.

tl;dr → the opposition is elucidated and juxtaposed; the domain is problematized.
and → “Big Data,” by definition, is opportunistic and unsupervisable; it collects everything and identifies something later in the backend.  Else it is not “Big Data” (it is “little data,” which is known, familiar, boring, and of course has settled law surrounding its operational envelope).

Abstract

After years of drafting and negotiations, the EU finally passed the General Data Protection Regulation (GDPR). The GDPR’s impact will, most likely, be profound. Among the challenges data protection law faces in the digital age, the emergence of Big Data is perhaps the greatest. Indeed, Big Data analysis carries both hope and potential harm to the individuals whose data is analyzed, as well as other individuals indirectly affected by such analyses. These novel developments call for both conceptual and practical changes in the current legal setting.

Unfortunately, the GDPR fails to properly address the surge in Big Data practices. The GDPR’s provisions are — to borrow a key term used throughout EU data protection regulation — incompatible with the data environment that the availability of Big Data generates. Such incompatibility is destined to render many of the GDPR’s provisions quickly irrelevant. Alternatively, the GDPR’s enactment could substantially alter the way Big Data analysis is conducted, transferring it to one that is suboptimal and inefficient. It will do so while stalling innovation in Europe and limiting utility to European citizens, while not necessarily providing such citizens with greater privacy protection.

After a brief introduction (Part I), Part II quickly defines Big Data and its relevance to EU data protection law. Part III addresses four central concepts of EU data protection law as manifested in the GDPR: Purpose Specification, Data Minimization, Automated Decisions and Special Categories. It thereafter proceeds to demonstrate that the treatment of every one of these concepts in the GDPR is lacking and in fact incompatible with the prospects of Big Data analysis. Part IV concludes by discussing the aggregated effect of such incompatibilities on regulated entities, the EU, and society in general.

Rebuttal

<snide>Apparently this was not known before the activists captured the legislature and effected their ends with the force of law. Now we know. Yet we all must obey the law, as it stands and as it is written. And why was this not published in an EU-located law journal, perhaps one located in … Brussels?</snide>

Contents

I. INTRODUCTION AND ROAD MAP
II. A BRIEF PRIMER ON BIG DATA AND THE LAW
III. THE GDPR’S INCOMPATIBILITY – FOUR EXAMPLES
A. Purpose Limitation
B. Data Minimization
C. Special Categories
D. Automated Decisions
IV. CONCLUSION: WHAT’S NEXT FOR EUROPE

References

There are 123 references, manifested as footnotes in the legal style.

Separately noted.

The Spread of Mass Surveillance, 1995 to Present (Big Data Innovation Transfer and Governance in Emerging High Technology States) | CPS

Nadiya Kostyuk, Muzammil M. Hussain (CPS); The Spread of Mass Surveillance, 1995 to Present; In Their Blog at the Center for Political Studies (CPS), Institute for Social Research, University of Michigan; 2017-09-01.
Previously presented at the 2017 Annual Meeting of the American Political Science Association (APSA); the presentation, titled “Big Data Innovation Transfer and Governance in Emerging High Technology States,” was part of the session “The Role of Business in Information Technology and Politics” on Friday, 2017-09-01.

tl;dr → an exercise in documentation; factoids are developed; a diversity is shown.
<quote>The observed cases in our study differ in scope and impact.</quote>

Original Sources

Mentions

  • Aadhaar, a national ID program, India.
  • Social Credit System, China.

Factoids

Columns → category (arbitrary); total spend (USD); spend per individual (USD); span (count of individuals); coverage of that universe; fun. A quick arithmetic check follows the list.

  • nations worldwide → $27.1B (or more); $7 per individual; 4.138B individuals; 73% of world population.
  • stable autocracies, authoritarian regimes → $10.967B; $lower-$110 per individual; 0.1B individuals; 81% of their populations; the upper figure is 11X more than “other regime type.”
  • advanced democracies → $8.909B; $11 per individual; 0.812B individuals; 74% of their population.
  • high-spending dictatorships and democracies, developing and emerging democracies → $4.784B; $1-2 per individual; 2.875B individuals; 72% of their population.
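
A quick arithmetic check of the per-individual column (a sketch in Python; the dollar and population figures are as tabulated above, the division and rounding are mine):

    # Total spend divided by individuals covered should land near the
    # tabulated per-capita figure. The row with an elided lower bound
    # ("$lower-$110") is skipped.
    rows = [
        ("nations worldwide",                        27.1e9,  4.138e9),  # tabulated as ~$7
        ("advanced democracies",                      8.909e9, 0.812e9), # tabulated as ~$11
        ("developing and emerging democracies, etc.", 4.784e9, 2.875e9), # tabulated as $1-2
    ]
    for label, total_usd, people in rows:
        print(f"{label:45s} ${total_usd / people:5.2f} per individual")
    # prints roughly $6.55, $10.97, and $1.66 respectively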

Referenced

AI and ‘Enormous Data’ Could Make Tech Giants Harder to Topple | Wired

AI and ‘Enormous Data’ Could Make Tech Giants Harder to Topple; In Wired; 2017-07-13.

tl;dr → <quote>such releases don’t usually offer much of value to potential competitors. </quote> They are promotional and self-serving.

Occasion

Mentions

  • TensorFlow
  • Common Visual Data Foundation
    • open image data sets
    • A “nonprofit”
    • Sponsors
      • Facebook
      • Microsoft
  • Other data sets
    • from YouTube, by Google
    • from Wikipedia, by Salesforce

Scope

  • Google
  • Microsoft
  • and others!
    • Salesforce
    • Uber
  • Manifold, a boutique
  • Fast.ai, a boutique

Quoted

  • Luke de Oliveira
    • partner, Manifold
    • temp staff, visitor, Lawrence Berkeley National Lab
  • Abhinav Gupta, Carnegie Mellon University (CMU)
  • Rachel Thomas, cofounder, Fast.ai

Argot

  • Enormous Data
    Are you kidding me? Do you even use computers?
  • incumbents’ usual data advantage
    Buzzzzz!
  • innovative and un-monopolistic by disruption
    Appears in the 1st paragraph

Referenced

The Wikitext Long Term Dependency Language Modeling Dataset; On Some Site

  • an announcement, but WHEN?

Previously

In Wired