Regulate Facebook Like AIM | Motherboard

Regulate Facebook Like AIM; Louise Matsakis; In Motherboard; 2017-10-06.
Teaser: In 2001, the FCC forced AIM to become compatible with other chat platforms. We should do the same to Facebook.

tl;dr → A modest proposal; for the lulz and the page views.
and → Facebook is bad.

Occasion

[Previously in Motherboard] AOL Instant Messenger (AIM) shuts down 2017-12-15; [recently].

Mentions

  • AOL Instant Messenger (AIM)
    • Shutdown 2017-12-15.
  • ICQ
  • FCC
  • America Online (AOL)
  • Time Warner

Quoted

For color, background & verisimilitude…

The Undue Influence of Surveillance Technology Companies on Policing | Elizabeth Joh

Elizabeth E. Joh; The Undue Influence of Surveillance Technology Companies on Policing; In New York University (NYU) Law Review Online; 2017-09; N pages; landing.
Elizabeth E. Joh is Professor of Law, School of Law, U.C. Davis.

Abstract

Conventional wisdom assumes that the police are in control of their investigative tools. But with surveillance technologies, this is not always the case. Increasingly, police departments are consumers of surveillance technologies that are created, sold, and controlled by private companies. These surveillance technology companies exercise an undue influence over the police today in ways that aren’t widely acknowledged, but that have enormous consequences for civil liberties and police oversight. Three seemingly unrelated examples—stingray cellphone surveillance, body cameras, and big data software—demonstrate varieties of this undue influence. The companies which provide these technologies act out of private self-interest, but their decisions have considerable public impact. The harms of this private influence include the distortion of Fourth Amendment law, the undermining of accountability by design, and the erosion of transparency norms. This Essay demonstrates the increasing degree to which surveillance technology vendors can guide, shape, and limit policing in ways that are not widely recognized. Any vision of increased police accountability today cannot be complete without consideration of the role surveillance technology companies play.

Contents

  1. INTRODUCTION
  2. EXAMPLES OF UNDUE INFLUENCE
    1. Stingray Cellphone Surveillance and Nondisclosure Agreements
      1. Nondisclosure Agreements
      2. Stingrays and the Fourth Amendment
      3. Secret Stingray Use
    2. Cornering the Market on Police Body Cameras
      1. When Product Design Is Policy
      2. Market Dominance
    3. Big Data Software and Proprietary Information
  3. THE HARMS OF UNDUE INFLUENCE
    1. Fourth Amendment Distortion
    2. Accountability by Design
    3. Outsourcing Suspicion and Obscuring Transparency
  4. MINIMIZING UNDUE INFLUENCE
    1. Local Surveillance Oversight
    2. Public Records Requests as Oversight
  5. CONCLUSION

Advertising Trade Groups Object to Safari’s New Intelligent Tracking Protection | John Gruber

John Gruber; Daring Fireball: Advertising Trade Groups Object to Safari’s New Intelligent Tracking Protection; In His Blog; 2017-09-16.

tl;dr → the gloating. Big Bad Apple pulled one over on Big Bad AdTech. Gruber is a Cupertino boosterist. Quotes from the Microsoft pantheon are exhibited.

Occasion

Every Major Advertising Group Is Blasting Apple for Blocking Cookies in the Safari Browser; In Ad Week; 2017-09-14.
Teaser: They argue it’ll hurt user experience and campaign targeting

Mentions

  • Intelligent Tracking Prevention (ITP)
  • Interactive Advertising Bureau (IAB)
  • American Advertising Federation (AAF)
  • Association of National Advertisers (ANA)
  • American Association of Advertising Agencies (4As)

Who

 

The OODA loop of Trump’s Insurgency has been Smashed | John Robb

John Robb; The OODA loop of Trump’s Insurgency has been Smashed; In His Blog; 2017-09-15.

tl;dr → Wherein the most powerful thing a man can do is to foretell the future, for that makes him a prophet and peer unto The Prophet, and unto God himself.  John Robb prophesied, back in the day, way back in 2016-Q1; and now he’s telling us, reminding us.  He had foretold unto us, in a newsletter, in a blog.
and → The sooth is sayed. A report is out, based on information, belief, analysis, aspiration & insight, wherein General John Kelly, in his role as Chief of Staff, is restricting access to Donald J. Trump; this is insinuated to be a coup by any other name, staged by The Military [Industrial Complex]; Donald J. Trump, in response, is no longer slaking his prurient interest. DJT is not now, nor ever was, the adult in the room.

Argot

The Suitcase Words
  • Observe, Orient, Decide, Act (OODA)
  • John Boyd
  • Insurgency, The Insurgency

Pasquale promotes three performances by Bracha, Pasquale, Calo & Tutt on the need for the Regulation of Computing

tl;dr → Sounding the alarum; The Signal is Given! Technology! Computers! Control Them! Bad! Stop Them! The panic is upon us!  Mend it! Don’t End It!

Via: A tweet, of @FrankPasquale

Elaborated

Oren Bracha, Frank Pasquale 2008

Oren Bracha, Frank Pasquale; Federal Search Commission? Access, Fairness, And Accountability In The Law Of Search; In Cornell Law Review, Volume 93; 2008; 62 pages (pages 1149→1210).

Abstract

Should search engines be subject to the types of regulation now applied to personal data collectors, cable networks, or phone books? In this Article, we make the case for some regulation of the ability of search engines to manipulate and structure their results. We demonstrate that the First Amendment, properly understood, does not prohibit such regulation. Nor will such intervention inevitably lead to the disclosure of important trade secrets. After setting forth normative foundations for evaluating search engine manipulation, we explain how neither market discipline nor technological advance is likely to stop it. Though savvy users and personalized search may constrain abusive companies to some extent, they have little chance of checking untoward behavior by the oligopolists who now dominate the search market. Arguing against the trend among courts to declare search results unregulable speech, this Article makes a case for an ongoing conversation on search engine regulation.

Contents

  1. SEARCH ENGINES AS POINTS OF CONTROL
    1. A New Hope?
    2. The Intermediaries Strike Back
      1. The New Intermediaries
      2. Search Engine Bias
  2. What Is Wrong With Search Engine Manipulation?
  3. Why Can’t Non-Regulatory Alternatives Solve The Problem?
    1. Market Discipline
    2. The Technological Fix: Personalized Search
  4. Potential Obstacles To Search Engine Regulation
    1. Will the First Amendment Bar Effective Regulation?
    2. Balancing Secrecy and Transparency
  5. Conclusion: Toward Regulation Of Search Engine Bias

tl;dr → Regulation is indicated. Heavy regulation, a fortiori, is indicated. Yet these entities are “publishers” and First Amendment rights appertain to them. This effectively blocks their regulation. Many intricate, advanced and creative analogies have been tried, to construe search engine services as “not a publisher.” But to no avail. And yet “we must at least try;” maybe someone will figure out how to do it.
and → <quote>The question, then, is whether a regulatory framework, either by statute or under the common law, could be crafted as to minimize these risks while preventing improper behavior by search engines.</quote>

Commencing with the frame…
“My God, I thought, Google knows what our culture wants!” attributed to John Battelle’s boosterist paean of a decade ago.
John Battelle, The Search: How Google And Its Rivals Rewrote The Rules Of Business And Transformed Our Culture; Penguin Random House; 2005-09-06; 336 pages; ASIN:1591841410; Kindle: $14, paper: $0.10+SHT.


Calo 2014

The Case for a Federal Robotics Commission; Ryan Calo; In Brookings Institution; 2014-09-15.
Ryan Calo,
Assistant Professor, University of Washington School of Law

tl;dr → There oughta be a law. Robots are like cars; cars have laws. Robots are just as dangerous, only more so.
and → A new freestanding Federal Robotics Commission (FRC) is warranted; made of the “best and the brightest.” Then, and only then, will we be safe. These are perilous times of the new and the dangerous.

Outline

  • Introduction
  • Law & Robotics
    • Driverless Cars
    • Drones
    • Finance Algorithms
    • Cognitive Radio
    • Surgical Robots
  • FRC (Federal Robotics Commission): A Thought Experiment
  • Objections
    • How are robots different from computers?
    • Answer: robots have a body; they act on “reality.”
      <many-words>the difference between a computer and a robot has largely to do with the latter’s embodiment.</many-words>
  • Conclusion

Tutt 2016

Andrew Tutt; An FDA for Algorithms; In 69 Administrative Law Review 83 (2017); 2016-03-15 → 2017-04-20; 41 pages; ssrn:2747994

Abstract

[545 words; his point, and he does have one… An application of a precautionary principle is indicated; these are dangerous machines run by dangerous people.]

The rise of increasingly complex algorithms calls for critical thought about how best to prevent, deter, and compensate for the harms that they cause. This paper argues that the criminal law and tort regulatory systems will prove no match for the difficult regulatory puzzles algorithms pose. Algorithmic regulation will require federal uniformity, expert judgment, political independence, and pre-market review to prevent – without stifling innovation – the introduction of unacceptably dangerous algorithms into the market. This paper proposes that a new specialist regulatory agency should be created to regulate algorithmic safety. An FDA for algorithms.

Such a federal consumer protection agency should have three powers. First, it should have the power to organize and classify algorithms into regulatory categories by their design, complexity, and potential for harm (in both ordinary use and through misuse). Second, it should have the power to prevent the introduction of algorithms into the market until their safety and efficacy has been proven through evidence-based pre-market trials. Third, the agency should have broad authority to impose disclosure requirements and usage restrictions to prevent algorithms’ harmful misuse.

To explain why a federal agency will be necessary, this paper proceeds in three parts. First, it explains the diversity of algorithms that already exist and that are soon to come. In the future many algorithms will be “trained,” not “designed.” That means that the operation of many algorithms will be opaque and difficult to predict in border cases, and responsibility for their harms will be diffuse and difficult to assign. Moreover, although “designed” algorithms already play important roles in many life-or-death situations (from emergency landings to automated braking systems), increasingly “trained” algorithms will be deployed in these mission-critical applications.

Second, this paper explains why other possible regulatory schemes – such as state tort and criminal law or regulation through subject-matter regulatory agencies – will not be as desirable as the creation of a centralized federal regulatory agency for the administration of algorithms as a category. For consumers, tort and criminal law are unlikely to efficiently counter the harms from algorithms. Harms traceable to algorithms may frequently be diffuse and difficult to detect. Human responsibility and liability for such harms will be difficult to establish. And narrowly tailored usage restrictions may be difficult to enforce through indirect regulation. For innovators, the availability of federal preemption from local and ex-post liability is likely to be desired.

Third, this paper explains that the concerns driving the regulation of food, drugs, and cosmetics closely resemble the concerns that should drive the regulation of algorithms. With respect to the operation of many drugs, the precise mechanisms by which they produce their benefits and harms are not well understood. The same will soon be true of many of the most important (and potentially dangerous) future algorithms. Drawing on lessons from the fitful growth and development of the FDA, the paper proposes that the FDA’s regulatory scheme is an appropriate model from which to design an agency charged with algorithmic regulation.

The paper closes by emphasizing the need to think proactively about the potential dangers algorithms pose. The United States created the FDA and expanded its regulatory reach only after several serious tragedies revealed its necessity. If we fail to anticipate the trajectory of modern algorithmic technology, history may repeat itself.

Web of Things (WoT), Architecture, Thing Description, Scripting API | W3C

Documents

Web of Things (WoT) Architecture, 2017-09-14.
The “building blocks”

  1. WoT Thing Description
  2. WoT Scripting API
  3. WoT Binding Templates.

Web of Things (WoT) Thing Description, 2017-09-14.
Describes the metadata and interfaces of Things.

Web of Things (WoT) Scripting API, 2017-09-14.
Operates on Things characterized by Properties, Actions and Events.
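
To make the Properties / Actions / Events model concrete, here is a minimal, self-contained TypeScript sketch of a JSON-LD-flavored Thing Description for a hypothetical lamp, plus a toy consumer that groups its interactions by pattern. Everything here is illustrative: the `MyLamp` thing, its `on` / `toggle` / `overheated` interactions, the `@context` URL, and the `href` field are assumptions that only approximate the 2017-09-14 Working Drafts, not the normative W3C vocabulary or Scripting API.

```typescript
// Illustrative sketch only: field names approximate the 2017 WoT Thing
// Description draft; this is NOT the normative W3C vocabulary or Scripting API.

interface Interaction {
  name: string;
  "@type": string[];               // "Property" | "Action" | "Event"
  outputData?: { type: string };   // data schema for reads / event payloads
  inputData?: { type: string };    // data schema for writes / action inputs
  href?: string;                   // protocol-binding endpoint (assumed field)
}

interface ThingDescription {
  "@context": string[];            // JSON-LD context(s)
  "@type": string[];
  name: string;
  base?: string;
  interaction: Interaction[];
}

// A hypothetical lamp Thing, described as JSON-LD-flavored metadata.
const lampTD: ThingDescription = {
  "@context": ["https://w3c.github.io/wot/w3c-wot-td-context.jsonld"],
  "@type": ["Thing"],
  name: "MyLamp",
  base: "coap://lamp.example.com/",
  interaction: [
    { name: "on",         "@type": ["Property"], outputData: { type: "boolean" }, href: "status" },
    { name: "toggle",     "@type": ["Action"],                                    href: "toggle" },
    { name: "overheated", "@type": ["Event"],    outputData: { type: "string" },  href: "oh" },
  ],
};

// Toy "consumer": bucket the Thing's interactions by pattern, the way a
// Scripting API runtime would surface Properties, Actions and Events.
function summarize(td: ThingDescription): Record<string, string[]> {
  const buckets: Record<string, string[]> = { Property: [], Action: [], Event: [] };
  for (const i of td.interaction) {
    for (const t of i["@type"]) {
      if (t in buckets) buckets[t].push(i.name);
    }
  }
  return buckets;
}

console.log(summarize(lampTD));
// → { Property: ['on'], Action: ['toggle'], Event: ['overheated'] }
```

The `@context` member is what ties the description into JSON-LD (hence the entry under Mentions below); a real Scripting API runtime would expose the same three buckets as first-class Properties, Actions and Events.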

Mentions

  • JSON-LD

Separately noted.

 

Toward a Fourth Law of Robotics: Preserving Attribution, Responsibility, and Explainability in an Algorithmic Society | Pasquale

Frank A. Pasquale III; Toward a Fourth Law of Robotics: Preserving Attribution, Responsibility, and Explainability in an Algorithmic Society; Ohio State Law Journal, Vol. 78, 2017, U of Maryland Legal Studies Research Paper No. 2017-21; 2017-07-14; 13 pages; ssrn:3002546.

tl;dr → A comment for Balkin. To wit:
  1. Balkin should have supplied more context; such correction is supplied herewith.
  2. More expansive supervision is indicated; such expansion is supplied herewith.
  3. Another law is warranted; not a trinity, but perfection plus one more.

Four Laws, here and previous:

  1. Machine operators are always responsible for their machines.
  2. Businesses are always responsible for their operators.
  3. Machines must not pollute.
  4. A [machine] must always indicate the identity of its creator, controller, or owner.

Love the erudition; but this is just like planes, trains & automobiles.

Separately noted.

As IBM Ramps Up Its AI-Powered Advertising, Can Watson Crack the Code of Digital Marketing? | Ad Week

As IBM Ramps Up Its AI-Powered Advertising, Can Watson Crack the Code of Digital Marketing?; In Ad Week (Advertising Week); 2017-09-24.
Teaser: Acquisition of The Weather Company fuels a new division

tl;dr → Watson (a service bureau, AI-as-a-Service) is open for business.

Mentions

The 4 pillars of Watson Advertising:
  1. Targeting: audience construction & activation.
  2. Optimization: bidding & buying.
  3. Advertising: synthesis of copy and creative.
  4. Planning: campaign planning for media buying.

Separately noted.

 

Zuckerberg’s Preposterous Defense of Facebook – NYT

Zuckerberg’s Preposterous Defense of Facebook; Zeynep Tufekci; In New York Times (NYT); 2017-09-29.

Zeynep Tufekci
  • serial tweeter, @zeynep
  • associate professor, School of Information and Library Science, University of North Carolina
  • Twitter and Tear Gas: The Power and Fragility of Networked Protest; Yale University Press; 2017-05-18; 360 pages; ASIN:0300215126; Kindle: $13, paper: $16+SHT

tl;dr → Facebook as embodied in the person of Mark Zuckerberg is wrong; Facebook is bad; Mark Zuckerberg is bad.

Occasion

Who

  • Mark Zuckerberg, is chief executive, Facebook.
  • Donald J. Trump, is a serial tweetist.
    He is not the President of the United States, but plays one on TV
    He did stay at a Holiday Inn Express last night.
    He has been remanded to the National Adult Day Care Facility etc.
    To wit:

    It’s a shame the White House has become an adult day care center. Someone obviously missed their shift this morning.

    — Senator Bob Corker (@SenBobCorker) 2017-10-08