Rewriting History: Changing the Archived Web from the Present | Lerner, Kohno, Roesner

Ada Lerner (Wellesley College), Tadayoshi Kohno (University of Washington), Franziska Roesner (University of Washington); Rewriting History: Changing the Archived Web from the Present; In Proceedings of the ACM Conference on Computer and Communications Security (CCS). Dallas, Texas, USA, 2017-10-30→2017-11-03; 18 pages.

tl;dr → Web “content” is now an executable program, assembled only in the client; the JavaScript-and-HTML5 people are very proud of this. Programs compute URLs at run time, e.g. for advertising, so an “archived” page can still reach out to the live (attacker-controllable) web. Mashups are not preservable, as such.

Abstract

The Internet Archive’s Wayback Machine is the largest modern web archive, preserving web content since 1996. We discover and analyze several vulnerabilities in how the Wayback Machine archives data, and then leverage these vulnerabilities to create what are to our knowledge the first attacks against a user’s view of the archived web. Our vulnerabilities are enabled by the unique interaction between the Wayback Machine’s archives, other websites, and a user’s browser, and attackers do not need to compromise the archives in order to compromise users’ views of a stored page. We demonstrate the effectiveness of our attacks through proof-of-concept implementations. Then, we conduct a measurement study to quantify the prevalence of vulnerabilities in the archive. Finally, we explore defenses which might be deployed by archives, website publishers, and the users of archives, and present the prototype of a defense for clients of the Wayback Machine, ArchiveWatcher.
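
Not from the paper: a minimal sketch, in Python, of the static side of what the tl;dr and abstract describe. It scans a Wayback Machine snapshot for absolute URLs that do not point back into web.archive.org (candidate references to the live web); URLs assembled by JavaScript at run time, the paper’s central concern, never appear in the static HTML at all, which is precisely why they defeat archiving. The snapshot URL below is a placeholder.

    import re
    import urllib.request

    # Placeholder snapshot URL; any Wayback Machine capture would do.
    SNAPSHOT = "https://web.archive.org/web/2017/https://example.com/"

    def find_escapes(snapshot_url):
        """List absolute URLs in an archived page that do not point back
        into web.archive.org, i.e. candidate references to the live web.
        Script-computed URLs are invisible to this (or any) static scan."""
        with urllib.request.urlopen(snapshot_url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        urls = re.findall(r"""https?://[^\s"'<>]+""", html)
        return [u for u in urls if "web.archive.org" not in u]

    if __name__ == "__main__":
        for url in find_escapes(SNAPSHOT):
            print("potential live-web reference:", url)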

How Smartphones Hijack Our Minds | WSJ

How Smartphones Hijack Our Minds; Nicholas Carr; In The Wall Street Journal (WSJ); 2017-10-06 (paywalled).
Teaser: Research suggests that as the brain grows dependent on phone technology, the intellect weakens

tl;dr → <quote>[people] aren’t very good at distinguishing the knowledge we keep in our heads from the information we find on our phones or computers. </quote>

Books

  • The Shallows: What the Internet Is Doing to Our Brains, W. W. Norton, 2011-06-08, 404 pages, ASIN:0393339750: Kindle: $9, paper: $10+SHT.
  • Utopia Is Creepy: And Other Provocations, W. W. Norton, 2016-09-06, 384 pages, ASIN:0393254542: Kindle: $10, paper: $8+SHT.
  • and [many] other books
    …in the boosterist and anthologized thinkpiece longread blogpost genres e.g.

    • The Glass Cage: How Our Computers Are Changing Us, W. W. Norton, 2015-09-08, 288 pages, ASIN:0393351637: Kindle: $9, paper: $6+SHT.
    • Does IT Matter? Information Technology and the Corrosion of Competitive Advantage, Harvard Business Review Press, 2004-04, 208 pages, ASIN:1591394449: Kindle: $20, paper: $0.01+SHT.


Mentions

  • “available cognitive capacity”
  • “fluid intelligence”
  • “brain drain” (a technical term, attributed to Ward et al.)
  • “supernormal stimulus”
  • “data is memory without history”, attributed to Cynthia Ozick.
  • the “Google effect,” strictly, pertains to information retrieval.

Exemplars

…they are bad…
  • Apple, iPhone
  • Facebook
  • Google
  • Samsung [Android]

Who

  • Maarten Bos, staff, Disney.
  • Kristen Duke, staff, University of California, San Diego (UCSD).
  • Ayelet Gneezy, staff, University of California, San Diego (UCSD).
  • William James, boffo, quoted circa 1892.
    Expertise: psychology, philosophy.
    Honorific: pioneering.
  • Cynthia Ozick, self.
    Trade: scrivener, dissent.
  • Betsy Sparrow, staff, Columbia University.
    Expertise: psychology.
  • Adrian Ward, professor of marketing, University of Texas at Austin (UT Austin).
    Expertise: psychology, cognitive psychology.
  • Daniel Wegner, professor, Harvard University; deceased.
    Expertise: memory.

Referenced

  • Many Unlock Events Per Day; video segment; ABC News; WHEN?.
    …Where more Americans get their news than from any other source [grammar police be damned!]
  • Some Survey, Gallup, 2015.
    tl;dr → <quote>Over 50% “can’t imagine” life without a cellphone.</quote>
  • Adrian Ward, et al. A Study. That. Shows. In Journal of Experimental Psychology. 2015. pubmed:26121498
  • Some Authors. Another Study. That. Shows. In Journal of Computer-Mediated Communication, 2015.
  • Adrian Ward (UT Austin), Kristen Duke, Ayelet Gneezy (UCSD), Maarten Bos (Disney). Study. That. Shows. 2015.
  • Adrian Ward (UT Austin), et al. More Study. That. Shows. In Journal of the Association for Consumer Research. 2017-04. preprint. DOI:10.1086/691462.
  • Some Authors (University of Southern Maine). Another Study. That. Shows. In Social Psychology. psycnet:2014-52302-001
  • More Authors. Yet Another Study. That. Shows. In Applied Cognitive Psychology. 2017-04. another study. DOI:10.1002/acp.3323.
    tl;dr → N=160 & WEIRD (students) at the University of Arkansas at Monticello.
  • Even More Authors. Even More Study. That. Shows. In Labour Economics; 2016.
  • More Authors. More Study. That Shows. In Journal of Social and Personal Relationships. 2013. paywall. DOI:10.1177/0265407512453827.
    tl;dr → N=192, WEIRD (students), University of Essex in the U.K.
  • Betsy Sparrow (Columbia), Daniel Wegner (Harvard), et al. Authors. Yet Another Study. That. Shows. In Science (Magazine). 2011. paywall.
  • The Internet has become the external hard drive for our memories; Staff; In Scientific American; WHEN?

Previously

In The Wall Street Journal (WSJ)…

Artificial Intelligence (AI) Now, Report 2017

AI Now 2017 Report; AI Now Institute, New York University; 2017; 37 pages.

Authors

  • Alex Campolo, New York University
  • Madelyn Sanfilippo, New York University
  • Meredith Whittaker, Google Open Research, New York University, and AI Now
  • Kate Crawford, Microsoft Research, New York University, and AI Now

Editors

  • Andrew Selbst, Yale Information Society Project and Data & Society
  • Solon Barocas, Cornell University

tl;dr

Diagnostic: AI, as a situated practice, is dangerous, as are its practitioners.
Nostrum: the following mitigations are indicated: transparency, supervision, funding.

Recommendations

Ten items.

MUST
Algorithms must be

  • open,
  • tested,
  • supervised.
SHOULD
  • Open data, towards reproducibility.
  • More hiring, as specified.
  • Codes & certifications on practitioners [should]
    • exist,
    • carry professional peril for the licentiate, toward their salubrious behavior.
MORE
Funding
  • [normative] policy design.
  • [empirical] activism in support of policy design.
  • [control] in support of the mission to mitigate [harms, by presumption].
Hiring
Generally, of non-technical persons.
Specifically of the enumerated classes of persons:

  • women
  • minorities
  • other
Standards
Specifically [of algorithms], towards

  • supervision,
  • audit,
  • compliance status.

Table of Contents

  • Recommendations
  • Executive Summary
  • Introduction
  • Labor and Automation
    • Research by Sector and Task
    • AI and the Nature of Work
    • Inequality and Redistribution
  • Bias and Inclusion
    • Where Bias Comes From
    • The AI Field is Not Diverse
    • Recent Developments in Bias Research
    • Emerging Strategies to Address Bias
  • Rights and Liberties
    • Population Registries and Computing Power
    • Corporate and Government Entanglements
    • AI and the Legal System
    • AI and Privacy
  • Ethics and Governance
    • Ethical Concerns in AI
    • AI Reflects Its Origins
    • Ethical Codes
    • Challenges and Concerns Going Forward
  • Conclusion
Promotions

The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google | Scott Galloway

Scott Galloway; The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google; Portfolio; 2017-10-02; 320 pages; ASIN:B06WP982HX: Kindle: $15, paper: $19+SHT.

tl;dr → Yet-Another-Jeremiad (YAJ®): an indictment of Amazon, Apple, Facebook, Google, and Netflix. They bad.
and → Everyone’s penning these for the fall book release cycle.  This here Youtoobbist has one too.
and → <quote>And he reveals how you can apply the lessons of their ascent to your own business or career.</quote>

Mentions

  • disruption
  • Something about how Google is the godhead.
  • Game of Thrones, a work of fiction
    • the Iron Throne, a plot device
  • Kardashians
  • Catholics
  • Russia, Russians
  • China
  • Something about how “government” should break up Amazon.
  • New York University (NYU), in (um) New York
  • Palo Alto, in California
  • Hamburg, in Europe

Pantheon

  • Jeff Bezos, CEO, Amazon
  • Margrethe Vestager, Commissioner for Competition, European Union (EU).

Exemplars

  • Alphabet
  • Amazon
  • Apple
  • Discover (card)
  • Facebook
  • Google
  • iTunes, of Apple
  • Netflix
  • Pandora
  • Snapchat
  • Swatch
  • WhatsApp, of Facebook
  • YouTube

Promotions

Turns Out Algorithms Are Racist | New Republic

Turns Out Algorithms Are Racist; Navneet Alang; In The New Republic; 2017-08-31.
Teaser: Artificial intelligence is becoming a greater part of our daily lives, but the technologies can contain dangerous biases and assumptions—and we’re only beginning to understand the consequences.

tl;dr → Cites a Wired essay in the first ‘graph.  Hangs the tale off of that.
and → then s/Sexist/Racist/g; we saw what you did there.

Original Sources

Machines Taught by Photos Learn a Sexist View of Women; In Wired; 2017-08-21.

Mentions

Referenced

Previously

In The New Republic

Gu, Dolan-Gavitt, Garg (NYU) built an invisible backdoor to hack AI’s decisions | Quartz

Researchers built an invisible backdoor to hack AI’s decisions; Dave Gershgorn; In Quartz; 2017-08-24.

tl;dr → The computer’s semiotics works For The Man, which may not be you.  They trained neural networks against signals and undocumented overrides.  The lusers thought it was trained against only the honest signals inuring to their benefit. They were wrong, to their detriment.
thus →  Know your supply chain. Who are you doing business with? It was ever thus: Surviving on a Diet of Poisoned Fruit.

Original Sources

Tianyu Gu, Brendan Dolan-Gavitt, Siddharth Garg; BadNets: Identifying Vulnerabilities in the Machine Learning Model Supply Chain; 2017-08-22; N pages; arXiv:1708.06733v1.

Mentions

  • New York University (NYU)
  • “secret” (though now promoted to the unwashed here in Quartz)
    “backdoor” (a metaphor towards entry and access)
    into software.
  • Artificial Intelligence (AI)
  • cloud provider
  • self-driving car
  • <quote>trigger (like a Post-It Note)</quote>
  • Marvin Minsky
    • “the 1950s”
  • Facebook

Who

  • Brendan Dolan-Gavitt, professor, New York University (NYU)

Abstract

Deep learning-based techniques have achieved state-of-the-art performance on a wide variety of recognition and classification tasks. However, these networks are typically computationally expensive to train, requiring weeks of computation on many GPUs; as a result, many users outsource the training procedure to the cloud or rely on pre-trained models that are then fine-tuned for a specific task. In this paper we show that outsourced training introduces new security risks: an adversary can create a maliciously trained network (a backdoored neural network, or a BadNet) that has state-of-the-art performance on the user’s training and validation samples, but behaves badly on specific attacker-chosen inputs. We first explore the properties of BadNets in a toy example, by creating a backdoored handwritten digit classifier. Next, we demonstrate backdoors in a more realistic scenario by creating a U.S. street sign classifier that identifies stop signs as speed limits when a special sticker is added to the stop sign; we then show in addition that the backdoor in our US street sign detector can persist even if the network is later retrained for another task and cause a drop in accuracy of 25% on average when the backdoor trigger is present. These results demonstrate that backdoors in neural networks are both powerful and—because the behavior of neural networks is difficult to explicate—stealthy. This work provides motivation for further research into techniques for verifying and inspecting neural networks, just as we have developed tools for verifying and debugging software.
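
For intuition only, and not the authors’ code: a toy Python sketch of the data-poisoning step the abstract describes. A small trigger patch is stamped onto a fraction of the training images, which are then relabeled to an attacker-chosen class; the array shapes, patch size, and target label here are illustrative assumptions.

    import numpy as np

    def poison(images, labels, target_label=7, rate=0.1, seed=0):
        """Stamp a small bright square (the 'trigger', cf. the sticker in
        the street-sign example) onto a random subset of the images and
        relabel them as target_label. A network trained on the mixture
        stays accurate on clean inputs yet follows the attacker's label
        whenever the trigger is present."""
        rng = np.random.default_rng(seed)
        images, labels = images.copy(), labels.copy()
        idx = rng.choice(len(images), size=int(rate * len(images)), replace=False)
        images[idx, -4:, -4:] = images.max()  # 4x4 patch, bottom-right corner
        labels[idx] = target_label
        return images, labels

    # Example with stand-in 28x28 grayscale data (MNIST-sized digits).
    x = np.random.rand(1000, 28, 28).astype(np.float32)
    y = np.random.randint(0, 10, size=1000)
    x_poisoned, y_poisoned = poison(x, y)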

BlueBorne | Armis Labs

Ben Seri, Gregory Vishnepolsky (Armis Labs); BlueBorne; a whitepaper; 2017-09-11; 36 pages; Document Cloud.
Teaser: The dangers of Bluetooth implementations: Unveiling zero day vulnerabilities and security flaws in modern Bluetooth stacks.

tl;dr → they found a bug.

Mentioned

  • billions of devices (bullions of duhvicuhs, buuulions of duuhhhvicuuuhs)
  • clickless
  • “Patch now, if you haven’t already”

Promotions

Verizon Wants to Build an Advertising Juggernaut. It Needs Your Data First | WSJ

Verizon Wants to Build an Advertising Juggernaut. It Needs Your Data First; In The Wall Street Journal (WSJ); 2017-09-05.
Teaser: The company offers concert tickets and other rewards in exchange for customers’ personal information

tl;dr → No information; just FUD, name-dropping & pull-quoting. <claimed><quote>Verizon hopes the information will help it gain advertising revenue to offset sluggish growth in its cellular business.</quote></claimed>

Mentions

  • Diego Scotti, chief marketing officer, Verizon.
  • Verizon Selects
  • Oath
    • AOL
    • Yahoo
  • Declined to comment.
    • Google
    • Facebook

University of Washington DNA Sequencing Security Study | University of Washington

Frequently-Asked Questions (FAQ)
Computer Security and Privacy in DNA Sequencing
Paul G. Allen School of Computer Science & Engineering, University of Washington

tl;dr → it’s a bug report on fqzcomp-4.6, wrapped in some lab work, wrapped in a scare piece, wrapped in an academic paper. It mentions DNA; people are made of DNA; YOU are made of DNA.

  • In the future, everyone will be famous for fifteen minutes.
    • They did it for the lulz, and the whuffie.
    • They did it for the FUD.
  • They are frontrunning the presentation of the paper at the conference site in Vancouver, BC, Canada.
  • But there is nothing to worry about.
    • Really.
    • No, Really.
    • And they’ve already contacted the project sponsors with their work product.
However

Today’s theoretical demonstrations are tomorrow’s practice.

Original Sources

Ney, Koscher, Organick, Ceze, Kohno; Computer Security, Privacy, and DNA Sequencing: Compromising Computers with Synthesized DNA, Privacy Leaks, and More; In Proceedings of the USENIX Security Symposium; 2017-08-16; 15 pages.

Concept

  • They created (synthesized) DNA with particular patterns; a toy encoding is sketched below.
  • They exploited buffer overflows in C & C++ programs.
  • FASTQ, a data format for sequencer reads.
  • /dev/tcp, accessed via bash.
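
A toy illustration in Python, not the authors’ actual encoding: arbitrary bytes packed into DNA bases at 2 bits per base and emitted as a minimal FASTQ record. In the paper, a read of roughly this shape, once sequenced and handed to a compressor that copies it into a fixed-size buffer without a length check (their modified fqzcomp-4.6), is what carries the exploit.

    # Toy sketch only: A=00, C=01, G=10, T=11 (2 bits per base); the
    # paper's real payload and encoding constraints differ.
    BASES = "ACGT"

    def bytes_to_bases(payload):
        """Encode arbitrary bytes as a DNA sequence, 4 bases per byte."""
        out = []
        for b in payload:
            for shift in (6, 4, 2, 0):
                out.append(BASES[(b >> shift) & 0b11])
        return "".join(out)

    def fastq_record(seq, name="read1"):
        """A minimal FASTQ record: @name, sequence, '+', per-base qualities."""
        return "@{}\n{}\n+\n{}\n".format(name, seq, "I" * len(seq))

    if __name__ == "__main__":
        print(fastq_record(bytes_to_bases(b"\xde\xad\xbe\xef")))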

Quotes

  • <quote>Although used broadly by biology researchers, many of these programs are written by small research groups and thus have likely not been subjected to serious adversarial pressure. </quote>
  • <quote><snip/> copied fqzcomp from SourceForge and inserted a vulnerability into version 4.6 of its source code; a function that processes and compresses DNA reads individually, using a fixed-size buffer to store the compressed data.</quote>
  • <quote>Our second exploit attempt uses an obscure feature of bash, which exposes virtual /dev/tcp devices that create TCP/IP connections. We use this feature to redirect stdin and stdout of /bin/sh to a TCP/IP socket, which connects back to our server.</quote>

Moral

The “research” coders do not validate their inputs; they use whatever computer tools are handy for their purpose. Their purpose is to publish papers in their field of study. Their code works just well enough; it is MVP for an MPU. Those “researchers” who do validate their inputs, who do test their code, who do read CVE notices, who do remediate latent vulnerabilities aren’t researchers at all. They are drone coders in an on-time-under-budget, time & materials IT shop. “We” need such people and such skill is a valued trade craft by which to make an honorable living.  But such activity is Not New. It is not The Research.

Surprise, Echo Owners, You’re Now Part of Amazon’s Random Social Network | Gizmodo

Surprise, Echo Owners, You’re Now Part of Amazon’s Random Social Network; Kashmir Hill; In Gizmodo; 2017-07-19.

Mentions

  • Amazon Echo
  • Amazon Alexa
  • Google Search
  • Google Voice Search
  • Alexa & Echo become a 1980s-style answering machine.
  • Internet of [Consumer] Things
  • late-binding software updates can “change behavior”
  • something about ex-boyfriends.
  • <handwringing>context collapse</handwringing>
  • <handwringing>A hacker could find out…</handwringing>
  • Denegotiating (Opt Out) requires calling Amazon Customer Service.

Time Line

2014
  • first release of the Echo.
2017-05
  • force-placed software update
  • features
    • Drop In
    • Alexa Calling and Messaging

Referenced

In rough order of appearance