FEMALE SPEAKER: Welcome.
Welcome to Moscow.
JEFF JARVIS: Thank you so much.
MALE SPEAKER: [SPEAKING RUSSIAN]
JEFF JARVIS: --part of the event.
So I want to talk to you this [INAUDIBLE]
about publicness and privacy.
And they're not mutually exclusive.
They're not at war with each other.
But one depends upon the other.
As someone once said, publicness and privacy are
like hot and cold, or wet and dry.
It's a continuum.
They need each other.
But privacy has many protectors.
And I believe publicness has too few.
I wrote a book called "Public Parts" really about
publicness.
But today I'll start talking about privacy.
Our modern concept of privacy is relatively recent
in society.
Some say it was born in England with the invention of
the hall and the back stairs, and rooms that you could lock
off to be by yourself.
What you hear about privacy has often been related to new
technologies, and the introduction of those new
technologies which leads to change and disruption and
fear, and then resistance to that change.
Go back to the Gutenberg press, some of the original
authors feared having their thoughts set down permanently
and spread widely.
Jonathan Swift said at the time that a book of verses
kept in a drawer and shown to a few was like a ***, but
once shown to people through the press, it was like a
common *** anyone could buy for two crown.
The idea of being that public was also a bit new.
The first serious discussion of the legal right to privacy
did not occur in the United States until the year 1890.
I was surprised at how late that was.
But the cause of it was the invention of a technology, the
Kodak camera.
The fact that a camera could leave its studio and go out in
the street and take a picture of you anywhere, and that
picture could appear in the mass press that was growing up
frightened people.
It freaked them out.
It creeped them out, to use the words we
use today about privacy.
In fact, if you look at The New York Times in the 1890s,
you'll find a lot of stories about, and I quote, "fiendish
Kodakers lying in wait".
The Kodak brand became a verb to mean to take a picture.
There was a story about a young rich man who
horsewhipped a Kodaker for taking his picture in public.
The President of the United States, Teddy Roosevelt,
banned Kodaking in Washington parks for a time.
So what happens?
Right now, we don't fear cameras.
We pose for them.
We have them all over.
We take pictures everywhere.
What happened was a natural process of society adjusting
its norms. We renegotiated what was right and wrong about
taking a picture, and where you take them, and
what you do with them.
And we became comfortable with this new technology.
Now since then, obviously, other technologies have caused
similar fears.
Microphones, little microphones, video cameras,
video cameras everywhere.
These things each led to lesser panics, moral panics
even, about privacy.
In Germany, they had a privacy [UNINTELLIGIBLE]
many other technologies to worry about, radio chips and
locator beacons, and DNA ID, and all kinds of other things.
But it is well and important to worry about privacy.
Privacy does need protections.
But we can't regulate life to the worst case alone.
If we see this magnificent new internet coming in and all
we do is worry about the worst that could happen, we'll never
build the best that can happen.
I think other countries need to look to Russia, where my
friend Yuri Milner points out that there is an open view
toward development and toward investment, and that's leading
to a great deal of development there and
growth through the internet.
You have to think about the internet in that way as a
whole, because it is an important tool.
My concern about privacy regulation is that if we worry
about just privacy, we may miss the unintended
consequences that come from prematurely
regulating the net.
And that regulation is often aimed by legacy institutions,
by incumbent institutions, at trying to forestall change.
And I fear that those who would regulate the internet
are trying to define our future in the terms of our
past. We don't know what this future looks like yet.
I'll come back to that idea in a few moments.
Let me talk for a few minutes about the unintended
consequences that can come from regulating the internet
for a good end.
We see many attempts to regulate the internet now,
under the guise of privacy, piracy,
***, security, decency.
Even President Sarkozy of France would ask us to
regulate the internet in the name of civility.
But in all of these cases, I have to ask why
regulate the internet?
Is the internet broken?
It's operating much like it has for 15 years, since the
web came along.
But we find ourselves in a case where I think that these
institutions are worried about the effect and the disruption
of the internet on their power.
We'll get back to that as well.
So, on these unintended consequences just in privacy,
I'm going to give you a few examples.
You go to Germany, where Street View
caused a major fuss.
And one of the heads of privacy and consumer
protection in Germany urged Germans to petition Google to
have the images of their homes blurred on Street View.
My friends in Germany on Twitter started calling their
country Blurmany, as a result.
I don't know if that joke translates.
They also came up with a new word there, the
Verpixelungsrecht, the right to be pixelated.
Now it might sound good that you can tell Google not to
take a picture of a building on a street.
But on the other hand, it sounds rather silly to me.
And the worst part of it is that you've pressured Google
into not taking pictures of public
views from public places.
When you do that, you diminish what is public.
And we the public own what is public.
When you do that, you affect the free
speech rights of others.
If I tell you you can't take a picture of a public street,
maybe of the mayor walking into a *** den, well I've
affected your free speech and your right to monitor what
happens in public.
We had a case in the United States of the police in Chicago,
Illinois arguing that they had a right not to have themselves
recorded doing their public duty in public.
I find that offensive in an open and public society.
We have to be able to maintain the value of what is public.
The other problem becomes, as is the case in Germany, whose
right it is to [INAUDIBLE]?
Is it the residents?
[INAUDIBLE]
one resident objected, one resident didn't.
Is it the owner of the building?
[INAUDIBLE]
architect of the building [INAUDIBLE], isn't it?
That's one example.
Another one in the US is what we call COPPA, Children's
Online Privacy Protection Act, which sounds like certainly a
very important thing to do for privacy.
And indeed it is.
But it has had many unintended consequences.
It says that, for children under 13, you basically can't use
information about them online unless their parents have gone
through rather considerable efforts to approve it.
Well, there's a few unintended consequences.
One is that we've learned that we've taught our children to
lie about their ages, which usually we didn't do until
later in life.
danah boyd, a researcher at New York University, did some
research and found that in her sample of children age 12, under
this cut-off, more than half had Facebook pages.
But what was really important was that 3/4 of those pages
were started with the help of their parents.
So this is a regulation intended to give parents a power
that parents not only didn't want to use, but flouted,
helping their children do what was in fact illegal.
But the main problem I have with this regulation is that
companies are frightened to serve children online.
I know myself, because I started some children's sites
when I began my work on the internet in the early '90s.
And I think the real result of COPPA has been that
children are the worst-served sector of society online.
So regulation that comes out of a good heart and a good
intention, if not studied, can have unintended consequences.
In the EU right now there are many efforts to add very
strong privacy regulation from Viviane Reding, the Vice
President of the European Commission, and I'm troubled
by some of what she proposes.
For example, she proposes a right to be forgotten.
And that sounds good, but as soon as you and I interact
with a piece of data, then who owns that data?
Is it me or is it you?
And if you take a picture of me at a party on the street
and put it up, but I tell you you must take it down, well
I've now affected your rights of free speech.
This is a negotiation that we have to have that isn't as
simple as saying that one can have one's life erased online.
Viviane Reding also argues for what she
calls privacy by default.
Well, if we have privacy by default as the law of the
land, we would not have such public by default services as
Twitter and Flickr, the photo service.
That's a problem.
Elsewhere in the world, Australia and Canada are
looking to filter all content on the internet just to get to
child ***.
And of course, that is a very, very important issue, but we
have plenty of laws and regulations related to that.
We have enforcement related to that.
To set the precedent of filtering all content on the
internet to get to that one thing, that one
crime, affects us all.
And it reveals a basic problem: when you try to
regulate the technology rather than the behavior, you have the
unintended consequence of affecting many behaviors.
Let me say that again.
If we regulate technology, regulate the entire internet
to get to one behavior we don't like, we also regulate
all other possible behaviors that could be there.
We shouldn't do that.
We should regulate the behavior.
That's what laws do.
Finally, in the US and in the EU we have a lot of discussion
about Do Not Track regulation.
This also sounds good.
But the truth is that media companies have long sold to
advertisers the fact that you read something.
And I fear that if it became the norm that you cannot place
a cookie on a web page to track a user from session to
session-- not by name, but just knowing that it's the same
user going along-- then we could gravely affect the
sustainability, the business
sustainability of media.
And as a result, we could get less news, less content, less
free content, more pay walls.
These are just a few examples of the kinds of regulations
that are being proposed around privacy.
Again, there's other regulation
around piracy and content.
In the US, we had a big fight to beat down the laws called
SOPA and PIPA.
And it really was a war between
Hollywood and Silicon Valley.
And Silicon Valley, I'm glad to say, won.
But that's just one of many, many, many wars that are on to
protect the net.
I'll come back to that idea as well.
My problem here is that we're dealing with rather
nonspecific fears.
Privacy, it turns out, is very difficult to define.
It's rather an empty vessel word into which people put
their fears.
As one author said, privacy means everything and nothing.
I did a lot of work in my book to try to define privacy and
found it very, very difficult.
In the end, I came to this idea, that privacy is an ethic
of knowing someone else's information.
If I know something about you, you told me something, it is
now public to that extent.
What happens to that information, now that you've
told me, is up to me.
It rests on my shoulders.
So I have to decide whether or not it's in your benefit or to
your detriment to share that information.
I have to decide why you told me.
Did you want it shared?
Would it be good for you or bad for you if it was shared?
What was the context of you sharing it?
All of that decision rests on me, because privacy is an
ethic of knowing your information.
Publicness is an ethic of sharing your information and
the good that might come of that.
Now let me be very clear here.
No one should ever force you to share information.
No one should force you to share your private thoughts.
But we have to think now that there are benefits to sharing.
I talked about my prostate cancer on my blog.
That meant that I shared with the world the fact that my
*** was malfunctioning.
What possible good could come from that?
Well, I found that I had
friends and other people who gave me a lot of information
and support.
As a result of that, I inspired other men to get
tested for prostate cancer.
If you're over the age of 40 or 50, I ask
you to do the same.
It was my decision and my freedom to be able to share
that information and to have good come from it.
No one forced me to.
No one should force me to.
But if I hadn't shared it, if I hadn't gotten one of those
men to get tested, what would their fate be?
That's a question we need to ask.
And I think we have to realize that this discussion is so
much about privacy, but it also needs to be about
publicness.
845 million people have joined Facebook.
And of course, many millions more have joined
[UNINTELLIGIBLE].
And the reason they're all there is to share.
Sharing is a good thing; it's what we want to do in society;
it's how we connect with each other.
And I think we have to protect that.
And to do that, I think we have to argue for the benefits
of sharing, and there are many.
Sharing creates relationships.
It improves relationships.
It builds trust. It reduces stigmas.
Publicness was a weapon in the hands of homosexuals, gays and
lesbians, to fight back the bigots who had forced them
into a closet.
Publicness ends the idea of a stranger, because we can meet
each other online.
It brings out the wisdom of the crowd, witness Wikipedia.
It organizes us.
It also, very importantly, enables collaboration.
If a company and a government are public with their work and
what they do, that opens the door to people
collaborating with them.
Look at Occupy Wall Street and what occurred there.
People gathered around a simple hashtag on Twitter and
decided what this idea was.
And it spread around the world, and it
has a lasting impact.
I can tell you nothing you couldn't tell me, and I wish I
could hear from you; I wish I were there to ask questions about
how the internet had an impact on your last election and on
the protests that occurred.
Of course, we have the Arab Spring.
The Arab Spring was not caused by Twitter or Facebook, but
those tools enabled people to find each other.
The internet is a tool that enables people to find, form
and [UNINTELLIGIBLE], and that is precious.
So I want to end here, and talk about the necessity, I
think, to leave that freedom wide open.
We're at the very beginning of this change.
We tend to think of the change we're undergoing as very fast.
I now come to think that the change we're
undergoing is very slow.
I wrote a piece about that for Google's Think Quarterly.
Elizabeth Eisenstein, who is a key scholar on Gutenberg, says
that the book did not really take on its own form for 50
years after the invention of the press.
The impact on society was not fully felt for a century.
In that sense, we're at about the year 1472.
And a columnist for the Observer in London, John
Naughton, asked us to imagine that we're on a bridge in that
year in Mainz, where the press was invented, and we ask
citizens of Mainz whether they think Gutenberg's invention
will cause the disruption of the Catholic Church, fuel the
Reformation, spark the scientific revolution, change
our definition of education and thus, childhood.
And I would add, change our notion of nations.
Nah, you'd say.
Ridiculous.
It's just an invention, just a new technology.
I think the internet is that big.
I think the change is that vast and that profound.
And so I would ask us to be wary of trying to regulate
and in fact define the internet today, when we should
be leaving it open to the future.
So what I would like to see is a discussion about the
principles of an open and public society.
And I want to list the ones that I proposed in my book.
I'd love to have a discussion about others.
It's certainly not the definitive list. But I believe we
have a right to connect to the internet, maybe not as far as
Finland and declaring that a constitutional right.
But if Mubarak cuts off your connection to the internet, is
that not a violation of your human rights?
Let's get an agreement to that.
That leads to a right to speak.
That in turn leads to a right to
assemble and act as publics.
Privacy, as I said, I think is an ethic of knowing.
Publicness is an ethic of sharing.
I think we must see that the information from our
institutions should become open by default
and secret by necessity.
Especially government, not individuals.
Companies would be wise to be open.
But governments must become open by default.
Now, they are secret by default and open by force.
The government is doing the public's business and must do
so in public.
We have the tools to do that.
Now we must insist upon it.
Three more principles.
First, what's public is a public good.
When you reduce the definition of public, you affect us all.
Next, all bits are created equal.
If one bit on the internet can't get from end to end,
edge to edge because a government or a company has
detoured it or slowed it or stopped it, then no bits can
be presumed to be free.
And finally, the net must remain open and distributed.
That is the architecture of the internet.
The very architecture means that no one can control it.
No one can claim sovereignty over it.
And if anyone does, then it's not the internet anymore.
It's not free anymore.
So I think we're here today to talk about privacy in society,
and the internet as well.
And I would urge us to not just talk about privacy and
about fears, but also talk about publicness and benefits
and what we want to defend.
So thank you very much.
And I now will join the conversation.
FEMALE SPEAKER: Thanks a lot.
Thanks a lot, Jeff.
[APPLAUSE]