Thank you very much.
So a few years ago I was visiting Walt Disney World
with my family.
And I have a friend who works in Imagineering, and he
decided to give me a tour of his latest project.
It was a mobile game for children around Epcot Center
using the theme of Kim Possible, one of their
characters.
Now if you're familiar with Disney World, you'll know that
Epcot's World Showcase is the least kid-friendly part of
Disney World.
It's a series of national pavilions where you can buy
local handicrafts and, of course, ***.
And they had a business problem.
They needed to figure out how to keep children occupied
while their parents were drinking and shopping their
way around this miniature simulacrum of our planet.
So they made this Kim Possible game.
They had a series of riddles delivered to special mobile
phones they gave to the kids that they would solve by
seeking out clues inside each of the national pavilions.
So when I first heard about this, I thought it was going
to be an awkward afternoon, because I assumed that this
was going to be creepy.
Because this is Disney, after all.
This is the company that fingerprints your children
when they visit the theme park.
And they don't even call it fingerprinting.
They say they're capturing the unique biometric
characteristics of your fingertip.
But it turned out that it was very clever indeed and not at
all creepy.
These special phones that they had didn't send out a radio
beacon telling Disney headquarters where children
and their parents were.
Instead, the players used the sensors in the phone to detect
radio beacons being emitted by the solutions to the riddles.
So not only was this computationally less intensive
and much more elegant and reliable than keeping track of
where everyone was at all times, it
was also less creepy.
Because there is something off about a game where the game
tokens themselves follow children around and tell a
large corporation where they are, what they're doing, and
what their parents are up to as well.
And there is something elegant and sweet about using human
beings as sensors instead of things to be sensed.
After all, we're good at sensing things.
We're not perfect.
We lack the diligence and the capacity to sustain attention
that our technological offspring have. But we are
good at matching patterns.
And we are good at making decisions.
That's why Galaxy Zoo, the astronomy project that asks
random internet users without any special training to
classify objects detected by the Hubble, gets
better results out of people than it does out of
machine-learning algorithms.
And it's also why Google's PageRank algorithm works.
They measure the links that people make between pages.
And that's proven, so far, more effective than parsing
out the pages themselves.
It's a machine-human collaboration that lets humans
do what humans are good at-- making decisions--
and lets computers do what computers are good at--
counting stuff--
in this case the decisions people have made.
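To make the division of labor concrete, here is a minimal sketch of the PageRank idea in Python. The humans have already made the decisions (the links); the machine just counts them, over and over. The toy graph and the damping factor are illustrative assumptions, not Google's actual code.

    def pagerank(links, damping=0.85, iterations=50):
        # links maps each page to the pages it links to; each link is
        # a decision some human made.
        pages = list(links)
        rank = {page: 1.0 / len(pages) for page in pages}
        for _ in range(iterations):
            # Every page keeps a small baseline share of rank...
            new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
            for page, outgoing in links.items():
                for target in outgoing:
                    # ...and passes the rest along its outbound links.
                    new_rank[target] += damping * rank[page] / len(outgoing)
            rank = new_rank
        return rank

    # Three pages, four human decisions; the counting does the rest.
    print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))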
Now there are plenty of reasons why you might want to
count a person instead of asking a
person to count a thing.
But technical elegance and efficiency are rarely on the
minds of system designers when it comes to gathering and
analyzing data on human activity.
More often these systems are designed under the rubric of
taking something valuable from users--
their personal information--
and giving them something valuable in return--
the service that they're using.
This is the so-called "privacy bargain." And it's the
ideological basis for much of the commercial activity in the
modern world and nearly all of the activity on the internet.
But as many people have noted, this is a curiously one-sided
and non-negotiated sort of bargain.
To understand what I mean by this, please listen carefully
to the following statement.
Listen carefully.
By listening to this talk, you agree on behalf of your
employer to release me from all obligations and waivers,
from any and all non-negotiated agreements,
licenses, terms of service, shrink wrap, click wrap,
browse wrap, confidentiality, non-disclosure, non-compete,
and acceptable-use policies that I've entered into with
your employer, its partners, its licensors, its agents and
assigns in perpetuity, without prejudice to my ongoing rights
and privileges.
You further represent that by listening to this talk, you
have the authority to release me from any of the
aforementioned on behalf of your employer.
Now this is the kind of agreement that the privacy
bargain runs on.
And, in fact, it's much worse on the internet.
To make a better analogy to the internet's style of
bargaining, instead of reading you this agreement from
the podium, I would print it on a very small card in
eight-point gray-on-black type, and I would hide it
beneath your seats.
And then I would add a sentence to the end of it that
went like this.
This agreement is subject to change without notice.
Now there are lots of players in the funny agreement game,
but if there's one company that's going to bring home the
gold for America in the 2024 Cyber Olympiad, it's Facebook.
There's a company that does everything it can to make it
difficult, if not impossible, to figure out the nature of
the bargain you're making with them.
It's a company that designed its service to act as a giant
Skinner box aimed at teaching you to systematically
undervalue your privacy and the trades you make with it.
This is why they have these frequent opt-out privacy
policy changes, where any preferences the users
have expressed about the bargain they're willing to
make, the information they're willing to trade for the
service they get, are obliterated by system-wide
changes. Each new design creates an enormous public outcry.
And then there's a public climb-down from Facebook that
resets the status quo to just like before, only a little
less private, and then a revision to their privacy
preferences that makes them more opaque, murkier, and
harder to understand.
The first half of this pattern, asking for a lot and
settling for a little less, is familiar to anyone who's ever
parented a toddler.
Daddy, I'd like a fudge cake, a bicycle, and a trip to
Disney World.
I'm sorry.
That's out of the question.
All right then.
I'll settle for the bicycle and the fudge cake.
Now the second half--
making preferences more complicated and more
confusing--
is familiar to anyone who's ever been conned at a bar bet
or lost money on a craps table because, of course, that's how
those sorts of games work.
Rather than presenting their propositions as simple
transactions--
you give me this much privacy and I'll give
you this much service--
the new settings complexify things, the way all of those
fancy lines on a craps table do or the way that a short-con
bar hustler does, adding complexity to the probabilities
and the payouts so you can't readily compare them in your
head and figure out when the odds are in your favor.
What's more, Facebook and the services like it, by design or
by accident, have all the mechanics of a rigged game.
You get some stimulus, positive social attention,
from the people you care about when you make a disclosure.
But not every time.
Only intermittently.
It's the same mechanism that experimental psychologists use
to get rats to press food pellet levers long after
they're satiated.
And it's the same mechanism that slot machine designers use
when they program the schedule of payouts.
And it's why so many of the scratch and win lotto cards
award a small prize, usually another ticket, so that you
play again.
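Here is a toy simulation of that mechanism, a crude model of my own for illustration, not anything from the experimental literature. The subject learns the longest reward drought it saw in training; under a variable-ratio schedule, it learns to keep pressing through droughts that a continuous schedule would never teach it to tolerate.

    import random

    random.seed(1)

    def longest_drought(schedule, presses=500):
        # Longest run of unrewarded presses seen during training.
        worst = run = 0
        for _ in range(presses):
            run = 0 if schedule() else run + 1
            worst = max(worst, run)
        return worst

    continuous = lambda: True                   # a pellet on every press
    variable   = lambda: random.random() < 0.2  # a pellet, unpredictably

    # Crude extinction model: when the rewards stop, the subject quits
    # only after a drought longer than any it experienced in training.
    for name, schedule in [("continuous", continuous), ("variable", variable)]:
        print(f"{name}: keeps pressing through ~{longest_drought(schedule)} dry presses")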
Maybe Facebook and companies like it hit on this idea by
accident in the early days.
But today, when they employ enormous brain trusts of
social scientists and psychologists on their staff,
it would be terribly credulous to call this continued
refinement of the strategy anything but deliberate.
But even when the privacy bargain is struck in good
faith by players on all sides, it's hard to price your
privacy well.
As a species, we struggle enormously with pricing the
long-term cost of a present-day disclosure.
An example of this is my grade-three
teacher, from when I was eight.
He and his wife had a baby in a hospital.
And shortly afterwards they were visited by a
representative from a marketing company, who said, I have a
basket of presents for your new child.
It's got nappies.
It's got onesies, babygros.
It's got formula, and wipes, all sorts of things that a new
parent could need.
And all I need from you is your child's name and date of
birth and your home address, so we can continue our
relationship as your child ages.
And he gave it to him.
And his child died.
And every year, year after year on his dead child's
birthday, he received birthday cards and presents for the
child who died.
It's hard to be good at privacy, because how often do
we give our personal information to marketers in a
maternity ward for a child who dies afterwards?
How much practice do we get when we're pricing the
potential cost of giving away what seems like a harmless bit
of information on a very happy day?
So these problems, where the consequences are separated
from a decision by a long gap of time and space, are the
kinds of problems that we, generally speaking,
never get good at solving.
So back in the old days, before digital cameras, the
average American family was shooting two
rolls of film a year.
They'd shoot one at Christmas and birthdays and one on the
family vacation.
And they'd send them to the lab, and the
photos would come back.
And some would be good, and some would be terrible.
And you could go through them and say I'll keep that one,
and I'll chuck that one out, and I'll keep that one, and
I'll chuck that one out.
But you could never remember what it was you did all those
months ago to make the good one, so you
could do more of it.
And you could never remember what mistake you made all those
months ago to make the bad one, so that you could avoid
making those mistakes again.
The mere act of closing the feedback loop, putting the
picture on the back of the camera right
after we've taken it, made us into such good
photographers that we now actually buy software like
Hipstamatic to make our pictures look worse so that
they will seem authentic to us.
Because the photos we take now have such good composition and
look so good that they seem to have been taken by studio
photographers and are somehow posed.
Now there are lots of problems that share these
characteristics--
smoking, obesity.
It takes a long time after eating a cupcake to understand
the consequences of eating that cupcake.
If it showed up on our hips the moment we put it in our
mouth, that would really change our behaviors.
Now how do we beat these compulsive pathological
behaviors where they exist in the wild?
Well, we do things like education.
Right now in the world of privacy, mostly that consists
of running around patronizingly wagging our
fingers at children telling them to stop using Facebook.
We do things with regulations and laws.
And regulators have tried to solve the privacy problem in
lots of ways.
Some say they'll make following Do Not
Track headers mandatory.
Others say that they'll establish this "you own your
data" regime.
But I'm skeptical of both of these efforts.
Do Not Track would be trivial to ignore.
There's no built-in enforcement.
And detecting violations would be very hard; nothing short of
a full-on colonoscopic investigation of a company's
servers would tell you whether or not they're
obeying Do Not Track.
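To see why, remember that Do Not Track is just one HTTP request header, and honoring it is one voluntary branch in the server's code. A minimal WSGI-style sketch, with a hypothetical log_visit() standing in for whatever analytics pipeline a real server would run:

    def log_visit(environ):
        pass  # stand-in for the tracking or analytics that would run here

    def application(environ, start_response):
        # The entire Do Not Track protocol, from the server's side:
        if environ.get("HTTP_DNT") != "1":
            log_visit(environ)
        # Nothing outside this process can observe whether the branch
        # above exists, which is the enforcement problem described above.
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"hello\n"]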
"You own your data," I think, would be a technical
nightmare to implement.
And besides, it will likely require firms to gather more
and retain more information about their users so they can
figure out who their users are so that they know
whose data they own.
Besides, creating ownership rights in public
facts, like dates of birth or telephone numbers, creates far
more problems than it solves.
15 years into the internet copyright wars, haven't we
learned enough to know that pretending that copying is the
kind of thing we can control in the age of the internet
generates all sorts of terrible problems?
Have we learned nothing from the disastrous train wreck
that is the music industry?
We have lots of ways to talk about valuable things without
creating yet another property-like right.
We have the idea of an interest in things and people.
I have a 3 and 1/2 year old daughter.
I'm going to be flying on to London right after our panel
to get her from the day care.
And I have a powerful interest in her but so does the state
and so does my wife, her mother, and so do her
grandparents, and so does her school, and so do her friends, and
so on and so on.
And especially so does she, but none of these are
described as a property interest. If we're to regulate
information, let us find information-like regimes to
use in the regulation.
And while we're trying to get that right, let's remember
that there's another lever in the policy fight, not just
laws and not just norms. But we have code, as well.
Technology design.
So think about how we used code to solve another noxious
internet problem--
pop-up ads.
Do you remember pop-up ads?
Do you remember you'd visit a web page and your browser--
what programmers ironically call "your user agent"--
would, at the behest of someone else, some marketing
creep, blanket your desktop in tiny windows.
And each time you closed one, it would spawn another.
They'd play music.
They'd run away from your cursor.
They'd spawn at one pixel squared and play techno music.
What happened to pop-ups?
We didn't make them illegal.
No.
What happened was Mozilla happened to pop-ups.
Mozilla finally shipped a browser that was a user
agent-- a browser that acted in the interest of its users
and blocked pop-ups.
And that made pop-ups disappear.
And it was a funny thing, because they didn't just
disappear for Mozilla users.
When a small tech-savvy minority of users stopped
seeing pop-ups, companies stopped demanding them.
Competing browser vendors started adding pop-up blockers
to their browsers, because they worried about losing
users to the browser that acted like a user agent.
Now there was a short-lived arms race to put pop-ups on
the desktops of people who'd sought out browsers that
blocked them, but sending pop-ups in an era where
pop-ups are ubiquitous is very different from sending pop-ups
in an era in which they're perceived as having lost in
the marketplace.
As a tactic, sending pop-ups became so odious that even
spammers, pornographers, and *** bootleggers more or
less abandoned it.
I think that the cookie manager could be the new pop-up
blocker, and boy do we ever need it.
The privacy bargain is a myth, because it says that people
consent to being tracked when they don't block cookies.
Have you ever tried to selectively block cookies?
I invite you to try it.
Just create a new user for your browser, open the privacy
preferences, and check "Ask me each time" under
"When should I accept a cookie?" See how long you
last. I made it about 10 minutes.
The number of decisions you have to make to load even a
simple page when you block cookies, complex and subtle
decisions, is nearly infinite.
Over and over again, you're asked to assess whether you
want to accept cryptically named cookies, some
of which reload every few seconds.
And there's no way to know a priori which of these cookies
are useful to you, which keep you logged into a commenting
system, for example, and which ones are just following you
around and giving you nothing in the bargain.
Now this is in part the fault of people like me who operate
websites that set cookies.
But it's also the fault of browser design.
If browsers made it as easy to manage cookies as it is to
manage bookmarks or logins or form data or
past searches, users--
some users, some appreciable fraction of users--
would manage their cookies.
And we should all want that because the privacy bargain
shouldn't consist of people who can't figure out how to
use their browsers so we get to suck as much information as
we want out of their machines and call it an even trade.
It should consist of people giving informed
consent to the trade.
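What might such a cookie manager look like? One plausible shape, sketched here as my own assumption rather than any shipping browser's design, is a per-site allowlist that you curate the way you curate bookmarks, instead of a prompt for every cookie:

    # Domains the user has explicitly chosen to trust; placeholder names.
    ALLOWED = {"example.com", "news.example.org"}

    def accept_cookie(cookie_domain):
        # Keep cookies only for allowed domains and their subdomains;
        # everything else is silently dropped, no prompts.
        domain = cookie_domain.lstrip(".")
        return any(domain == d or domain.endswith("." + d) for d in ALLOWED)

    for domain in [".example.com", "tracker.adnet.example", "news.example.org"]:
        print(domain, "->", "keep" if accept_cookie(domain) else "drop")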
When we give users the power to treat their private data as
valuable, companies that offer better services--
maybe your company--
will be able to get more data from their users than
companies that offer bad services, perhaps like your
competitors.
Because users will be able to choose.
I think a world where users can control their data
emissions would be a very different one to the one that
we live in today.
Once, in the early days of ad-supported websites,
advertisers would accept the analytics generated by the
site operators.
But not anymore.
Now if you visit a website with 15 ads, you will get 15
trackers, one for each ad's advertiser or broker.
Some pages for major services will have over 100 trackers.
How do we get to a model where there's less pressure on the
people who operate websites to gather the user's information?
Well, we can make it possible for users to
decline to give it.
And just as a few years ago advertisers stopped
demanding pop-ups entirely because people blocked them on
sight, they might also start to temper their tracking efforts
and their demand that everyone gets to track every user.
We are in a shooting war between the analytics people
and the users.
But the users aren't armed.
Browser vendors have it in their power to arm users.
And there's room for new commercial operators in that
very boring and stolid space.
Money on the table for anyone who wants to get into the
business of arming the rebels instead of just the empire.
Now, of course, a cookie manager will only allow you to
manage real cookies, not evercookies, or sneaky cookies
that use HTML5 tricks or Flash to hide from their users.
But we have a pretty cool business
model for those cookies.
Security researchers, like Ashkan Soltani, uncover their
existence and publish scholarly papers about them.
Plaintiff-side class-action lawyers read
those technical papers.
Courts beat the hell out of the companies that use them
and transfer enormous sums of money to the
plaintiff-side lawyers.
And the evercookies go away.
But, of course, some of you don't
work on desktop software.
Some of you are interested in the cloud.
Now I like the cloud.
I liked the cloud when it was just something we doodled on
the white boards.
And there's plenty of stuff that I like about clouds and
that I want from them.
Zero-knowledge, forward-secret, encrypted commodity
storage and virtual machines; repositories that my apps can
use to throw little chunks of state data into so that I can
sync up across my devices and within my work group.
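The heart of the "zero knowledge" part is small enough to sketch: encrypt before you upload, and never give the provider the key. A minimal example using the third-party cryptography package; the upload() function is a stand-in for whatever storage API you use, and real forward secrecy would also need key rotation, which this omits.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # lives only on the user's own devices
    box = Fernet(key)

    def upload(name, blob):
        # Stand-in for a real storage API; the provider sees only ciphertext.
        print(f"storing {len(blob)} opaque bytes as {name!r}")

    # A little chunk of app state to sync across devices:
    state = b'{"open_tabs": ["draft.txt"], "cursor": 42}'
    upload("sync/chunk-001", box.encrypt(state))

    # Any device holding the key can recover the state after download:
    # box.decrypt(ciphertext) == state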
But I worry an awful lot about how vulnerable the cloud is to
a denial-of-service attack.
Not the kind of denial-of-service attack that
you get from Anonymous and its Low Orbit Ion Cannon, but the
kind of denial-of-service attack that you get from
Senator Joe Lieberman's staffers and their telephones.
Only Amazon and Joe Lieberman know for sure what the
discussions that resulted in Amazon shutting down WikiLeaks
looked like, but the informed speculation I've heard from
people at Amazon and on the Hill is that someone in the
senator's staff rang up Amazon and said, nice
cloud you've got here.
It would be a shame if something were
to happen to it.
Because when you operate a server farm running virtual
machines and commodity storage that are balanced across racks
and racks of hardware, it's not easy for the police to
come in and seize a single customer's data as part of
their evidentiary process.
We already hear today about search warrants that end up
being served through the seizure of entire racks rather
than an individual blade, because law enforcement can't
guarantee that it has the right hardware to do forensics
on the blade without mounting it in the rack that it
shipped in.
This kind of law enforcement blackout is
just getting started.
And it's going to put cloud operators in a very unhappy
position where they have to reject any customer who has
any chance of generating a search warrant or getting
embroiled in legal discovery.
Or they're going to have to figure out how to build
systems that are robust through some combination of
zero knowledge--not knowing what they have, having nothing
stored in the clear and nothing that anyone could
usefully seize--and good compliance software that can
readily package up one customer's threads, storage,
and other data in a neat, forensically valid,
trustworthy component.
Now we carry elaborate sensor packages on our
persons at all times.
And our computers generate information about our lives as
we work and live.
And it's not evil or crazy to want to figure out how to use
these things to improve our lives with
products and services.
It's in the spirit of self-knowledge, of the words on
the Greek temple: Know Thyself.
And some companies have really figured out how to balance
this stuff.
Amazon, for example, lets me create a list that reflects my
theory of how to sell the things that
it puts in its inventory.
I can create Cory's list of indispensable reading or
Cory's favorite kitchen gadgets, or Cory's complete
toolkit for making your own tin foil hat.
I don't charge them anything for this.
And if my theory is right, and I sell some stuff with my
list, they pay me a commission into my affiliate account.
That treats me like a sensor, like someone who knows
something about the world.
Meanwhile, if the users who buy one book on my list buy
another, Amazon automatically starts cross-recommending
them, treating those users as data as well as sensors.
And that's a good balance.
But for so long as our tools are designed to allow the
privacy-invading equivalent of pop-up ads, we'll get the
modern equivalent of pop-up ads.
And the balance will be harder and harder to find.
If you believe in the privacy bargain, you have to give the
public negotiating leverage.
We need browsers, computers, and mobile devices that
meaningfully manage our data emissions.
We need to give the users bargaining chips at the table.
For example, there's a great Android fork called
CyanogenMod, where one hacker's implemented a feature
I enjoy very much.
It allows your phone to lie to its apps.
Why would you want to do that?
Well, today if you download an app from the marketplace--
I have a connect-the-dots game that my toddler likes on long
plane rides--
Android will tell you that to install this app, you have to
give the company that is selling it to you permission
to track where you are at all times.
It's a take-it-or-leave-it offer.
Now people who know about game theory and economics will tell
you that you don't get optimal outcomes from
take-it-or-leave-it offers.
But if you use the mod, you can say by all means install
the app and tell it that I'm going to give it information
about where I am.
But when it asks, make it up.
Invent a random location, and send it to the app.
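I don't have CyanogenMod's actual Java in front of me, but the idea is small enough to sketch in an illustrative Python stand-in: a shim that answers an untrusted app's location query with a plausible fabrication instead of the real fix.

    import random

    def real_gps_fix():
        return (51.5074, -0.1278)  # placeholder for the device's true position

    def fake_location():
        # A random but valid (latitude, longitude) pair.
        return (random.uniform(-90.0, 90.0), random.uniform(-180.0, 180.0))

    def get_location(app_trusted_by_user):
        # The app always gets an answer; only a trusted app gets the truth.
        return real_gps_fix() if app_trusted_by_user else fake_location()

    print(get_location(app_trusted_by_user=False))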
If that sort of thing becomes standard in our devices and
systems, it will change the dynamic from a smorgasbord
where users' personal information is available on
an all-you-can-eat basis to a genuine marketplace, where you
have to offer something compelling to a user in order
to get the user's real, negotiated, uncoerced
permission to access her data in return.
It's nice to talk about the laws that will give users the
right to their data.
And it's easy to wag our finger at Facebook users and
talk about how foolish and naive they are to give so much
of their privacy to such a manifestly unsuitable
guardian, but none of that stuff gets us anywhere until
we have the tools that allow users to actually control
their data.
Thank you.
[APPLAUSE]