Introduction by Martín Abregú, Ford Foundation's director of rights and governance.
SEWELL CHAN: Hi everyone. It's a pleasure to be here. We're not going to do introductions
because you've got one in your program and we want to get straight into the panel. I'm
Sewell Chan with the New York Times and we hope to, by the end of this panel, get you
clapping and singing [LAUGHTER] but it's...I have to say it's a hard act to follow, that
group.
ANDREW MCLAUGHLIN: We decided to go with the Gregorian chant style of the panel...
CHAN: So I want to start by throwing out a very broad question to Sir Tim at the opposite
end of the line. Sir Tim, you basically invented the web, if not the internet. Did you perceive
that its use would turn so quickly to social uses?--the networking that's been behind so
much of the political organizing of the last year, and also the potential repressive possibilities?
SIR TIM BERNERS-LEE: Well, initially, twenty something years ago, the idea of it was that
it should be a medium for any communication at all. It wasn't designed as something for
research or specifically for social or commercial usage. And it was clear that any attempt to
sort of, draw a line and say 'yeah, the web's good for this but it's not good for that'
would mean that the hypertextual link structure of the web, for example, would be hopeless.
So it has to be usable for anything. In a way, you could think...I think of
the web not so much as pages connected, as humanity connected. So it is people then.
Since it started when somebody made a link...when there was a link made, it was a person making
the link. And when there was a link followed, it was a person following the link. So in
a way, it's essentially social. And whenever we create a new system out there, the things
that we do are as much social...the design of the system, the design of the web is as
much a social design as it is a technical design.
CHAN: Rebecca, much has been made of the Arab Spring and the uses of technology by political
organizers, dissidents, human rights activists...but much has also been made of the use of technology
by repressive states that are combating this wave of activism. Can you say generally, whether
technology has been a force for democratic good or for repressive bad when it comes to
the Arab Spring?
REBECCA MACKINNON: Well, the technology is a force for everything. And so I think we
need to get beyond this debate of is the internet going to help the good guys more or the bad
guys more. I think the important question is how do we ensure that the internet evolves
in a manner that is conducive to free expression--that is conducive to dissent--so that it can continue
to be used by activists to hold power holders to account, whoever they are? And so certainly,
I think there is no question that the internet played a powerful role. The internet did not
cause the Arab Spring on its own but it was certainly used by a lot of people to bring
about change. And there is a new generation of activists, particularly young women, that
have been working not just in the last couple of years, but for nearly the past decade as
part of the anti-torture movement and so on, digging up information, posting it online,
who wouldn't have been able to participate in this way had it not been for the internet.
So yes, there is a power. But there's also the flip side. And we're actually already seeing this
in both Egypt and Tunisia. In Egypt, the surveillance capacity that the regime held before the revolution
is still being used. So you had activists, in March, who stormed the state security headquarters
and found their files that were full of their email transcripts, their cell phone text messaging
records...all these kinds of things...all that technology is still being used by the current
military transitional government. And so, that's a big question in the new Egypt: is
that going to really change? Is there going to be any more accountability in terms of
how people's personal information is used by the authorities and what role that the
network operators, and the companies that make up the infrastructure...what role they
are playing? Similarly, in Tunisia, in May, censorship resumed. Not as heavily as before,
but the transitional government has started blocking certain webpages that they feel might
incite people to violence or that are pornographic and so on. And there's a real debate going
on about who should have the power to decide this and how do you hold them accountable?
And the issue is, too, that we in the West do not have good answers for people in Tunisia and
Egypt about how do you create the right balance between legitimate law enforcement, legitimate
child protection interests and ensuring that civil liberties are protected in these new
network spaces. We ourselves are having huge fights about these issues and I know Sir Tim
and others would argue, in many cases, we're not getting things right at all. And just
one other point too, since I do a lot of work on China, the most well known aspect of Chinese
internet censorship is what we call The Great Firewall of China--the fact that Facebook
is blocked, and Twitter is blocked and lots of overseas websites are blocked. But really
the most insidious part of Chinese censorship is the way in which Chinese companies, and
other companies operating inside China, are expected and compelled to act as an extension
of state power and are actually doing much...most of the censorship and surveillance on the
government's behalf. And the way in which these private intermediary companies, upon
which everybody is depending increasingly for their civic life, are acting as an extension--really
an opaque extension--of political power, is kind of a lesson for the entire world.
CHAN: Andrew, you did some research into the Mubarak regime's use of the so-called, kill-switch
I believe to shut down internet access during the Arab Spring. Could you talk about that
and the role of private companies and intermediaries in that process?
MCLAUGHLIN: Yeah, absolutely. So, taking it from where Rebecca left it...so what Egypt
had done with its communications infrastructure was centralize it very highly; and this is
pretty normal in much of the world, that you have a former monopoly, state owned monopoly,
that even in the internet age occupies, kind of, a dominant position in the communications
infrastructure. So Egypt is at this amazing little juncture in the world where Asia meets
Africa meets Europe. And an extraordinary amount of the world's fiber goes across Egypt.
So the connections that go from Asia to Europe, and now increasingly from Europe down to Africa
through the Red Sea, land at Suez, go over land to Alexandria and then go beyond. So
in a way, Egypt is one of the best wired countries in the world. However, the government allowed
only one company to tap into those wires and to hold all the paths to communication from
Egypt to the outside world. So that meant that there was Telecom Egypt, and then a couple
of mobile phone companies that were the dominant connectivity providers and some smaller ISPs,
all of whom got their connectivity through Telecom Egypt, with one exception: a research
network called Internet2, which had its own connection that went to a university
in Europe. Anyway, the point of all this is that it turned out to be pretty easy, when
the time came, for the government to terminate external communications and then shut down
internal network-to-network communications as well; and they did both of those things.
And one of the interesting things that happened was, when they switched the networks back
on, after about six days, the first thing that happened was, lots of Egyptians were
delighted, turned on the phones, and without even making a phone call or an SMS message,
a number of activists were grabbed immediately because their phones geo-located them. And
this is the thing everybody should know: when your phone is on, even if you're
not using it, it's pinging a cell phone tower near you a couple times a minute; and that's
how the network knows where to send phone calls if somebody wants to call you or where to
send data. And most telecom companies, including in the United States, keep records of where
you are all the time. And actually, a really fun thing that somebody in Germany did, a
member of the German...a Green party member of the German parliament, Malte Spitz, used
the privacy laws of Germany to get from Deutsche Telekom, his mobile phone company, T-Mobile,
a full record of everything they knew about his physical location for about a year and
a half, and then he handed it over to Spiegel; and Spiegel built a map that showed his location
all twenty-four hours of the day for about eighteen months.
DANNY O'BRIEN: And you can fast-forward through it...
MCLAUGHLIN: And you can fast-forward, scroll, you can look at it and he fed in other things
like data from where he had been from his calendar and so forth. But anyway, for every
one of you in this room, that data exists.
O'BRIEN: For six months, at least, in Europe and...
MCLAUGHLIN: Exactly, and...
CHAN: We'll be collecting that data at the end of this panel. [LAUGHTER]
O'BRIEN: Well we know where you are now. [LAUGHTER]
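The kind of record McLaughlin describes--a phone passively pinging nearby towers, with the carrier retaining (timestamp, tower) pairs--is enough to reconstruct a coarse movement trace, which is essentially what the map built from Spitz's data showed. A minimal sketch; the tower IDs, coordinates, and timestamps here are invented for illustration:

```python
from datetime import datetime

# Hypothetical tower positions (tower_id -> latitude, longitude).
TOWERS = {
    "T1": (52.5200, 13.4050),  # near Berlin
    "T2": (50.1109, 8.6821),   # near Frankfurt
}

# Hypothetical retained carrier log: (timestamp, tower pinged).
pings = [
    ("2010-03-01T08:00:12", "T1"),
    ("2010-03-01T08:01:40", "T1"),
    ("2010-03-01T14:22:05", "T2"),
]

def location_trace(pings, towers):
    """Turn raw (timestamp, tower_id) records into a coarse movement trace."""
    trace = []
    for stamp, tower_id in pings:
        when = datetime.fromisoformat(stamp)
        trace.append((when, towers[tower_id]))
    return trace

for when, (lat, lon) in location_trace(pings, TOWERS):
    print(when, lat, lon)
```

Even this toy version shows the point: no call or SMS content is needed; the routing metadata alone yields a round-the-clock location history.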
MCLAUGHLIN: Here's the important thing to think about, I think, to sort of go up a level.
So, in these technology-good vs. technology-bad and how do you maximize the good and minimize
the bad debates, there are two variables. One is policy and institutions and the other
is technology; and they're both very flexible, right? So you can have lots of different policies
and laws and so forth, and lots of different types of technologies. Technology is very
flexible actually. We can build software to do many, many, many different things. It's
not too much of a limiting factor in what you can do. And so, what we see right now
is a series of interlocking axes of see-sawness where, for example, the policies will evolve
to protect your privacy in Europe, and then the intelligence agencies will go and try
to require...the state security agencies in Europe will try to require the ISPs to keep
logs of your data as a technical matter...keep log files that they can go and get access
to. In this country we've seen the spread of two technologies in particular that have
really alarmed the security and law enforcement agencies; one is Skype and the other is BlackBerry
Enterprise Services. And the reason is that each of these is a service where the company
has built its communications technology so that the endpoints of a communication can
encrypt from end to end without even the company being able to decrypt. So Skype, dynamically
generates encryption keys for each session. So if Tim and I are--Sir Tim and I--are talking
via Skype, it's being encrypted from his laptop to my laptop and Skype, the company that provides
the software, doesn't even have the ability to decrypt it if the intelligence agencies
order them to. Similarly, BlackBerry makes an enterprise version of its BlackBerry which
says that if I'm IBM, or the White House--where I used to work--you can buy an enterprise
server and we set our own keys and encrypt the messages to our users without BlackBerry,
the company, being able to do so. So that's quite alarming to our intelligence agencies
who, traditionally, have been able to go get court orders and then go serve them on companies
and get the plain text back; they can't do that anymore. So what they're saying is, 'okay,
the technology has evolved; now we need to look at'--and this is what the New York Times
reported a couple months ago--'now we need to look at requiring those companies to reengineer
the technology to make it possible.' We could talk about lots of different examples but
these are the two axes, the policies and the rules and critically the institutions. Because,
for example, in Libya and Egypt and Tunisia, what we saw were institutions that were bent
towards surveillance. You can similarly build institutions which are bent and are institutionally
incented in other ways. But, the technology can only go so far. Like, you can come up
with ways to protect but as long as you have, as Rebecca pointed out, companies providing
services, they are susceptible to the rules and policies and institutions that they live
under. And, so these are the two axes that I think are constantly in tension.
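The end-to-end property McLaughlin describes--session keys negotiated between the endpoints so that the intermediary relaying the traffic never holds the decryption key--can be sketched with a toy Diffie-Hellman exchange. This is illustrative only: Skype's actual protocol is proprietary, real systems use authenticated key exchange with standardized large groups, and the small prime and variable names below are invented for the example:

```python
import secrets

# Toy Diffie-Hellman parameters. 2**127 - 1 is a Mersenne prime, large
# enough to demonstrate the math but far too small for real security.
P = 2**127 - 1
G = 5

def keypair():
    """Each endpoint picks a private exponent and publishes only G**x mod P."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The service relays alice_pub and bob_pub between the endpoints, but the
# shared session secret depends on private values the relay never sees.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret  # both endpoints derive the same session key
```

Because only the public values cross the wire, a court order served on the relaying company yields nothing that decrypts the session--which is exactly the property that alarms the agencies he mentions.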
CHAN: That gets to a question I wanted to ask Elisa. Traditional human rights activism has
been oriented, it seems, around the state: state actors, state repression of civil liberties,
of civil rights, of human rights. When it comes to internet law and governance, there
is this...the power of non-state actors...private companies...whether internet service providers,
telecommunications companies, mobile companies, the companies that own the servers and house
them and obviously the giant search engines...I mean, it seems like a whole new field. Have
human rights activists been adept and nimble enough at adapting themselves to this?
ELISA MASSIMINO: Well, in some ways I think they have. But, we're talking today about
how do we move beyond conventions and with respect to the private actors, in many ways,
we're before conventions. I mean, we're really...while I think the traditional human rights movement
has always had an eye on private actors--whether they're multinational companies or terrorist
groups--the existing, you know, kind of traditional framework, the Universal Declaration...all
of this is geared towards states. But when so much of the public square is now controlled
by, owned by, policed by, private actors, it requires a whole different set of skills.
And I think one of the challenges for us, I think, as human rights activists, is what
Andrew was saying, not only does the infrastructure, the technology, the setup of the technology
profoundly impact the ability of the people to exercise their rights, but also it's the
legal and commercial architecture around that technology infrastructure that will set the
rules of the road going forward. So, I mean, that's in part why some of us on this panel
are engaged in this multi-stakeholder initiative called the Global Network Initiative to try
to bring, in a voluntary framework, companies together with activists and experts and academics
and investors to agree on rules of the road and then figure out how companies--many of
them in the technology sector--who see themselves very much, you know, as on the side of the
good guys--grappling with how to implement this. I mean we talked about how difficult
it is to move to implementation of these fundamental human rights treaties. But to have private
companies, who are motivated primarily, under the rules, by profit, figure out how do they
be on the side of the good guys in this struggle; and there are sides. Companies...you know,
Google has a foreign policy. They might not articulate it that way but they definitely
do. And for a human rights activist, it's exercising a little bit different muscles.
With the Global Network Initiative...and I think the jury is really far out still on
whether this is going to produce real change on the ground. But, we are at least seeing
companies...educating companies about what points in the decision chain of commercial
activity do they have to consider things like free expression for users, privacy for users.
This idea that Mubarak had a kill-switch...activists should have that too for their own personal
data. And people in the technology sector, I find it's very energizing, because they're
always thinking about the next big thing. And that's fantastic as a new tool, a new
frontier on which to exercise our fundamental rights and freedoms. But we want them also
to be thinking about the next bad thing and how we can build the infrastructure to slant
it, as Andrew says, towards the realization of these rights.
CHAN: That point, the next bad thing, gets me to Danny. I wanted to... [LAUGHTER] When
you take a historical view, one thing that strikes me is that the proliferation of the
web and the internet as an organizing framework for social life across much of the planet--obviously
more so in the wealthy North than in the Global South...but, this proliferation has coincided
with this period of enhanced concern about terrorism. I mean it's almost the exact same
period that we're seeing the proliferation of the internet and on the other hand, the
growth of that national security state across the world. Can you examine that? Tell us what
that tension has meant for the development of the web.
O'BRIEN: Well, one of the problems is that the War on Terrorism is sort of couched in
a way in this idea of asymmetry and the idea that it's a fight between centralized forces
against decentralized forces and if you frame the internet in that particular battle, it's
pretty clear what side it appears to be on. And in many ways, the internet had a good
run for ten years in which generally the news...it was described as very positive; it was seen
as a force for good. And then like so many other things that changed...the rhetoric changes.
And I think that the Arab Spring was a wakeup call for a lot of people about the capabilities
of these kinds of tools. But, I mean, it was certainly a wakeup call for repressive regimes
too. And at the same time we were dealing with this, sort of, spike in cases in the
Middle East, we were also incredibly aware that in China, the crackdown that happened immediately
subsequent was unparalleled. It was stronger than during the Olympics. And what
we quickly watched was that it's always going to be, for the foreseeable future, a cat and
mouse game where both sides reach out to whatever the new technology is in an attempt
to control or limit the damage of the other.
CHAN: Rebecca, did you want to respond?
MACKINNON: Yeah, I just wanted to point out that there's one more factor in this whole,
kind of, freedom versus control balance that I think is really important and really relates
to the work of Sir Tim...and that is what we sometimes call the Digital Commons. And the fact that
you kind of have this digital civil society not only of activist groups who are using
the internet to bring about change and do things, and organize, but you have people
who are involved in building open technologies that are non-proprietary in many cases, or
at least are based on open standards that can be modified--that is really the basis for
a lot of the freedom that is possible and the openness that is possible on the internet.
And Sir Tim, when he invented HTML--the hypertext that makes the web possible--he didn't say
okay, I own this and everybody has to pay me to use it. He released it as a public good
which now people have built upon and we have the World Wide Web that we have today. And
so there's a whole community of developers and technologists--some of whom also build
commercial things with open standards. But many others who are dedicated to building
technologies that can be modified and are open so activists can kind of understand it
or for instance, depending on what operating system your cell phone, your smartphone,
uses, you can either modify the code on it or you can't. And so these
are, not to get too technical...but there are a lot of issues related to again, how
technology is developed and what we're using, how open it is, and whether the broader public
can really understand what can and cannot be done with it and modify it to ensure that
it is compatible with our values and our aspirations.
BERNERS-LEE: Well, a lot of this...so the open source movement, obviously, came before
the web, and I guess at maybe a similar time to the internet, and it's been, obviously, a great
support. So it's been great at every stage of the development, so that for the internet forty
years ago and the web twenty years ago...there have always been open source, several open source
implementations of things. It's been great. And on the flip side, the open source movement
has always generated...benefitted hugely from being able to deploy across the web so those
work very nicely together although I think it's important also to have commercial things;
and it's not all open source. But when you look at...the web and email are separate things
that deployed across the internet platform and I...even though email had been around
for a while and the internet had been around for a while, 20 years after the internet was
produced, I could deploy the web without asking anybody; that was really neat. I didn't have
to ask for Port 80; that was only because I wanted a small number. Before, I used a
telephone number. So without anybody at all, that's a hugely valuable thing: that on an
internet platform you can build something like the Web without asking anybody. And similarly
the Web is designed so that you can build stuff on top of it whether it's social networking,
or open data or education health, or whatever it is, it's built on top of the Web without
having to ask anybody. And now we're going on to build more and more things, some of
those built using more and more layers of open software. And when this open software
happens, maybe the software is running on top of...an open source operating system.
But on top of that, all kinds of libraries and all kinds of things which have been produced
in the years to date which now form this tremendously rich, very high platform where the Web is
somewhere under there. All kinds of things are being built on top of the Web, and in
fact, data is built on top of the Web which is built on top of the Internet. And it's
a lot of fun to focus on building the next really cool thing. Yes, we should look at
what the next bad thing is but also, of course, we have to be very careful that we don't then
take the rest for granted. So I spend some of my time with one hat on, thinking about
the next really cool thing and coding it up in the taxi and some of my time thinking about
how to get people all together to make common platforms for the next cool thing and then
I have to spend a certain amount of my time like a tax, just making sure that the whole
thing doesn't fall apart because everything we've done, the whole stack, relies upon the
internet being this reliable, trustworthy resource. So it relies on...just as in the days
of mails going on horseback, interfering with the mails in all the countries was... You
can do a lot of stuff but you don't interfere with the mails because that starts...good
or bad, society relies on the mails. So that with the internet, for all the stuff to work,
if you look at the internet and you wonder about using these tools and you realize that
as you click, somebody might be looking over your shoulder and it might be reported, it
nixes everything. Or if, as you click and try to get information, you actually tend to
end up going not to the Wikipedia article you were heading for, but to a pretty good
article which tells you that you should buy a particular drug--in other words, a gentle
form of censorship, with somebody preferring to show you their drug company's text instead
of the open one--either of those two things, to me, destroys the whole thing. So in
other words...so I am pretty passionate about just saying that if you're
going to do any monitoring on the internet or if you're going to do any censorship at
all it has to have huge checks and balances. And basically the answer is, no; you don't
do it at all, except for emergencies. If you've got an emergency, if you've got a terrorist...come
show me. And show me that you've got another agency looking at you and making sure that
you'll respect the openness of the internet.
MCLAUGHLIN: So, one of the big frustrations, to go back to what Elisa was saying and Elisa
and Danny, you're exceptions in the human rights community world in a way...but one of the
big frustrations from nerd land--Nerdistan, we used to call it [LAUGHTER]--has been with
the traditional human rights organizations--
MASSIMINO: We visit there. We don't live there but we do visit.
MCLAUGHLIN: You do visit sometimes.
MACKINNON: And then sometimes we log onto Facebookistan.
O'BRIEN: I live in Nerdistan.
MCLAUGHLIN: And so one of the frustrations with the traditional human rights community
has been that what Tim has just identified, Sir Tim, just identified is this cluster of
issues, which from the perspective of the ability to vindicate human rights on the network
are critically important and at least speaking for myself, we found it incredibly frustrating
to find funders and organizations to pay attention to things like net neutrality or, these days,
even antitrust. So if information is power, and concentration of control of information
is a concentration of power, then competition policy becomes a critical human rights issue.
MASSIMINO: Absolutely.
MCLAUGHLIN: We've had lots of debates about media concentration in the broadcast world
but trying to get people to pay attention to the concentration of network facilities
under the umbrella of say, AT&T, and even more so as they propose this merger, or when
Comcast buys NBC...viewing these from the perspective of human rights has been, I think,
it's been slow to happen in the human rights community. And one of the questions that I
might throw back at you is like, so you know, it's great that Human Rights First paid attention
to the internet issues; it's great that the Committee to Protect Journalists hired Danny
and said 'think about this stuff.' But when we tried to do activism around net neutrality--which
is the simple principle that the network, which is the low level, should not try to
control what happens at the higher levels--a very simple principle, it was basically...this
is an oversimplification, but a lot of people said that it's an esoteric fight between Silicon
Valley and Telecom companies. It's just like one big business against another.
CHAN: Should Google--which you worked for and is currently undergoing some antitrust
examination...should that antitrust question be examined through this human rights framework?
MCLAUGHLIN: Of course, absolutely. Google ought to be subject just like anybody...
MASSIMINO: I think that's a totally fair criticism of many human rights groups and civil rights
groups too. I mean I think, you know, it is this...this is in many ways...we're at this
moment in internet technology and its relationship with human rights as we were kind of sixty
years ago with the civil and political, economic, social and cultural rights thinking about
what's the infrastructure that will enable us to enforce these rights. And it takes a
while for people to...plus you have the added challenge that you know, understanding it...I
could throw this back at the technology community, you know, and say that it hasn't done a very
good job of explaining to people, not just us in the human rights community, but also
the broader public about why this is relevant. We have to build a constituency that understands
that this touched every part of everybody's life. And, we're doing it a little bit in
Egypt now where we're looking at the reform of the telecommunications law and talking
about what can we learn from the Egypt experience. And realizing that there actually is a...some
congruence between human rights interests and business interests. Ultimately, one thing
we have going for us at least on this issue--not always on other human rights issues--but in
the long term information technology companies...their business model has to rely on freedom of information.
I mean, in the short-term, yes, they can make compromises to get short term gains and market
access and stuff. But overall, you know, we are aiming towards the same place. But I think,
part of the challenge is to funders. Ford has been, kind of, on the leading edge on this
in helping bring the human rights community together to kind of synthesize all of this
and expose everybody to these arguments in a way that's more accessible than a lot of these
folks just talking to each other--which human rights groups are also guilty of doing.
CHAN: Can I press a little bit on this point as an outside observer? Is the human right
to information--if there is one--to free access to information or even to broadband, if you
will...is that in conflict with or in tension with the human right to privacy? Because on
the one hand, we want ever more access to information, ever more volumes of information
at ever faster speeds but on the other hand, this idea in Europe of the right to be forgotten
which we don't have in the United States, the right to purge myself from the corporations
or the phone companies that have my records and know my locations and know what I've purchased
and what my spending patterns are...are those rights, if you...First of all, are they rights
and are they in tension?
MASSIMINO: First of all, I think in fact, if you go back to the Universal Declaration
of Human Rights...I mean, there is recognized the right to free exchange of information,
the right to privacy. And within the fundamental international bill of rights, there is a recognition
that sometimes these rights are going to come into conflict and there is kind of a...it
was anticipated in the wisdom of the drafters that there would have to be a balancing. I
think obviously they didn't anticipate the complexities that we now have to navigate
about information, for privacy especially, staying around for decades. But yes, they
are rights and this goes back to what Rebecca was saying, I think, and also Sir Tim, about
the sort of, governance issues around technology and human rights. We have to have a constituency
of users that's well enough informed and that has a forum for being activated, mobilized,
to have an impact on these issues. And when it's solely governments and companies setting
the rules of the road, the architecture will be very difficult to then slant in the direction
of realizing these rights of free expression and privacy. So, I mean the clock is really
ticking on that right now and it's a huge challenge to us as a community to step into
that.
MACKINNON: I mean, these issues are really in tension and you have some child protection
groups and other groups who don't want Facebook to allow anonymity because they think it'll
encourage bullying and so on. Yet on the other hand, if people cannot be anonymous online,
you're going to get more and more people arrested for what they post on Facebook. And so, how
these companies choose to approach their policies is really important and when you have companies
like Facebook...Randi Zuckerberg, Mark Zuckerberg's sister who's also an executive, recently said
that anonymity should be eliminated from the internet because it only promotes bad behavior.
Again, this is the tension that plays out all the time and civil society, and particularly
the human rights movement, hasn't been very active in these discussions. And I know for
instance, Facebook, is reaching out to a lot of NGOS and is saying, 'we want to help you
use Facebook to advocate for your issue.' But are these NGOs pushing back and saying, well,
if we're going to work with you, if we're going to use Facebook as a major platform
and rely on it, we want to talk to you about your privacy policies. We want to talk to
you about your identification policies and whether people can use pseudonyms to protect
themselves. And, at the moment people aren't doing that.
CHAN: Tim, do you want to weigh in?
BERNERS-LEE: Well, that's a great...in a way the anonymity issue is a unique point and
it really is a tradeoff. So I think yes...and in fact, the last time I was here, there was
a really great discussion on the stage about that. And that made me think about that whole...And
I realized that we have these two rights: one is the right, when you're a whistleblower
in an oppressive regime or an oppressive company, to be able to anonymously blow the whistle;
and the other is the right to face your accuser, to know who it is. So W3C, the Web Consortium,
is an industry body and we have one person who uses a pseudonym--and we instituted a
special process--because they convinced us that, okay, they had a reason for operating that way.
But otherwise, the process at the W3C is...now you come to the table and you put all your
cards open on it. So I'm generally in favor of a lot of places where we are not anonymous
but I feel also there are some kinds of times especially in oppressive regimes--when who's
to say when they set up the system whether the next regime is going to be oppressive--when
you absolutely need it. So in the case of anonymity, I think we have to build social
systems. Facebook, for example, may actually lead the way and show countries
how to follow by instituting, if you like, courts which will allow people,
under some circumstances, to get a pseudonym--and where they abuse it, to have the pseudonym stripped
away, for example. However, when we say that anonymity is a tradeoff,
it's one of the rare tradeoffs. Net neutrality, for example, people attack it in lots of ways.
And one of the ways is you say, 'ah, well, it's a tradeoff.' But it's not. Net neutrality
is one of those things where it's just a push to get---
MACKINNON: On the Facebook thing though, they have refused to create this process that would
allow exceptions and some of us have actually approached them about it; it's a real shame.
I mean, that's one way in which companies could be kind of leading the way and they're
not. I'm sorry. I interrupted you.
O'BRIEN: No, no...
CHAN: I'm sorry to interrupt but we want to have enough time for the audience so I'm going
to open the floor up but we can continue the discussion; don't be shy, this is an incredibly
distinguished panel. Is there someone passing around the microphone? Okay, and who wants
to start? If I could just ask please, direct your question to a speaker because we want
to get as many questions in as possible. Yes, the one in the black jacket.
FEMALE AUDIENCE MEMBER: I'm Monica. I recently read this book by Eli Pariser called The Filter
Bubble and it really scared me very much. And basically his argument is that, you know,
everything that we're doing online is tracked, we talk a lot about government, but really,
by the companies themselves, and then they are translated into a marketing imperative.
So if I'm searching for human rights sitting in this room right now... on Google, we'll
both get different results on the search engine...based on your specific reality and my specific reality...
CHAN: What's the question, sorry?
FEMALE AUDIENCE MEMBER: So, the question is, is this something that we should be worrying
about? Is this real? Since technology changes so fast, is there a solution that wouldn't
just be upended? I mean I just want to hear your thoughts.
MCLAUGHLIN: Yeah, so it's well worth worrying about. Very simply, what Eli points out is
that if you're sitting in front of your Facebook feed right now, Facebook is choosing a subset
of all the things that your friends are generating to show you. And they're choosing
that based not on what you most want to see; it's based on what they want you most to see--in
other words, maximizing the enterprise value of Facebook determines what they put in front
of you. And that's natural. They are for-profit companies with stockholders so they should
do that. But the problem is that Eli noticed, his friends are--he says anyway--50/50 right
and left. And he was getting only left leaning stuff in his feed and that's because Facebook
was noticing that he was clicking on the left stuff and so they were showing him left stuff.
And as Facebook people will tell you, when they roll out a new service like, say, photo
sharing, they will just stick lots of photos in front of you to try to get you to generate
photos and put them in and so forth. So, what he says is that to the extent in which these
companies are controlling attention, then they are shaping the environment that you
live in, in essence. And even if you click on the option that says 'most recent,' you
might think that that's literally everything in chronological order; but it's not. They're
still applying selectors even to that. So, regulation, by the way, is clearly the wrong
way to think about this because there's absolutely no way, I can think of, to regulate how you
show information. But what it does call for is attention to some of these other issues
that we mentioned before; for example, in a world where there are lots of different
companies responsible for presenting information to you--you're relying on lots of different
actors--your concerns about the biases in any one will be reduced. If Facebook is where
everybody goes to do everything social online, then that is a problem. To the extent that
Facebook opens their platform so that other people can build on top of it, you're less
concerned. To the extent that they exclude others from building on top, you're more concerned.
And so, the best solution, seems to me, is a healthy competitive environment with lots
of different actors and a rapidly changing technology. The antitrust laws are not very
well configured right now to try to achieve that but one of the things that I think is
fundamentally important is that we do use the few levers that we have. For example, keeping
AT&T from massively re-monopolizing the communications infrastructure of the country would be a nice
place to start. And at the application level, by the way, it really calls for robust entrepreneurship
as an antidote, in a Schumpeterian creative-destruction environment, to the threat of concentration.
Even so, we're depending a lot on the goodwill, the good common sense of the people running
these companies--the Mark Zuckerbergs and the Larrys and Sergeys of the world--to operate
their companies in broadly speaking, the public interest. One last little footnote is, Cass
Sunstein wrote a book about ten years ago called Republic.com which says, 'isn't it
going to be horrible when all you see online is people who agree with you and you structure
and configure your online world so that you're just seeing the stuff that you already know
you agree with?' And I think that is a persistent problem that we have no good solution to right
now, which is the kind of, echo chamber. There's a lot of evidence that we're debating and
arguing more but there's also a lot of evidence that we're getting more narrowly channeled.
BERNERS-LEE: One solution I suggested to some social networking folks is the
concept of a stretch friend. Like when you're applying to colleges. [LAUGHTER] There's the...so
Facebook...normally they suggest safety friends: Facebook will introduce you to somebody
who is already friends with lots of your other friends. So in other words, they will help
you make a boring set of people who all agree with each other.
CHAN: Tim, will you be my stretch friend? [LAUGHTER]
BERNERS-LEE: That would be too much of a stretch. [LAUGHTER] So a stretch friend, every now
and again, it says 'you know what, you've got this profile and I've got someone who's
got almost the same profile, but they live in Iraq. The other one is exactly the same
as you, they've got all the same likes.' And if everybody made one stretch friend a week,
or a stretch friend a month, we'd change the topology of the platform.
CHAN: I want to get in as many questions as possible, so quickly please.
FORD FOUNDATION PROGRAM OFFICER JENNY TOOMEY: Yes, so hi, this question is for Rebecca.
I think everything you said is very important about anonymity and I know that everyone in
here cares about protecting people's anonymity. I'm really concerned about what I see is an
increase in fake speech and corporate speech in the environment as well. And this is something
that Tim said before: if you don't think that you can have privacy in this space, you don't
engage. But if you don't think you're really engaging with people, you also don't engage.
So how do we deal with fake speech in this space?
MACKINNON: Well I mean, you know, this is of course a huge problem. I think a lot of
it has to do with... the remedy to fake speech and bad speech is more speech. And some of
the ways in which fake corporate speech has been dealt with is that people have outed
it. And so I think having both individuals and groups who are dedicated to identifying,
'oh, this person who claims to be a person is really a corporation,' is really important.
But I think to require that...to eliminate anonymity is not the solution either because
that hurts some people who really need anonymity and might get killed if they don't have it.
So I am not willing to trade the possibility of anonymity for the benefit of eliminating
corporate bad actors. I think we need other solutions to deal with the bad actor problem
that revolve around transparency and people having freedom of information to find out
what's really going on.
O'BRIEN: I think that one of the things that plays in here is what we talked about--the
sort of fragility of what's underlying the internet. And I think that often, when you
see a problem like that, it feels like there must be--because it was so easy to create
this environment of lots of people speaking...that there must be a technological solution to
fix for instance, hate speech. And in fact, the process of defending human rights is there
because actually, fighting hate speech is an incredibly complicated social problem that
requires heavy protection of some basic rights so people can go about controlling that kind
of speech in the environment of an open society. And the danger, I think, that always kicks
in when you're talking about something that's new like the internet--and you're talking
where a lot of constituencies don't necessarily understand all of these details--is that they
go for the simple tech fix. It's not the geeks; it's people who go, 'well, look, we can build
a Great Firewall of China, or surely there's some way that you can make this disappear.'
And trying to do that tech fix without understanding the environment brings the
whole thing tumbling down. As soon as you go after something--whether it's terrorism
or hate speech--with a simple trick, it undermines the entire system that holds everything else
up.
CHAN: Gentleman over there.
MALE AUDIENCE MEMBER: I'd be very curious to hear from any of the panelists about your
thoughts of the role of "hacktivists" or the "loose coalition" known as Anonymous in challenging
the corporate dominance of the internet.
CHAN: Well, do you frame that as a human rights issue?
MALE AUDIENCE MEMBER: I'd be curious to see if there is a discourse of human rights that
includes people who act as activists, hacking on the internet.
BERNERS-LEE: My two cents on that is that yes...there was that phase when people were fighting
for anonymity and putting data out, and other people were trying to hack...but when there is a
fight, the question of whether something should be public or not, and whether information
should be discovered or not, is up to the hacking skills of the individuals involved. Hello!
That is like resorting to duels as a way to find out who's guilty. That is not the
way to run a restaurant. [LAUGHTER] So, no, the rule of law--it should not be about who's
the most successful. So when the hackers manage to bring down the corporates by being
smarter and managing to find a hole, no, that does not show that they're right. So I don't
have sympathy for that way of settling things at all.
MASSIMINO: I think that we have, in this country at least, whistleblower protection and discovery
rules and litigation and all that for a good reason. And I couldn't agree more that it
doesn't give me any comfort to know that the reason I find out this very important
piece of information that affects my rights, that a company is doing, is because some twenty-something
was able to be smarter than the corporate gatekeepers of...that's not the way it's supposed
to work.
FEMALE AUDIENCE MEMBER: This is a question to Danny. Danny, this is Katrin, MobileActive, hey. From your
perspective, working with journalists on the ground, where do you see the biggest threats?
We hear about all of these different things on a daily basis: from shutdowns to censorship,
to surveillance, to certificate authorities being a complete mess. What do you hear from
people that you're working with? What are they telling you? What are the threats that
they perceive are most important that we all should be addressing?
O'BRIEN: I mean, there's such a huge spectrum, right. So there's these incredible high tech
attacks like, as you said, the attack on the certificate authority system, which
is basically another one of these really pivotal, important parts of protecting the privacy
of people online that is being, sort of, undermined by a coalition of repressive state actors
and criminal hackers. And then there's very simple stuff, right, where bloggers get detained.
And in Syria, we have roundups of bloggers who are physically threatened to hand over
their Facebook passwords. The detention and the violence is age old, but this twist--
where handing over one thing, or taking their phones, unravels everything and allows people
to masquerade as them or round up all of their colleagues--is new. But
I think, if I was to talk about a general thing that applies to this new, huge wave
of journalists and activists who use the internet as their primary communication medium, it's
actually isolation because most of those people are working on their own because they can.
As Sir Tim said, they don't need permission from anyone to publish what they're doing
or saying what they're saying. And therefore, they can act as a single actor to bring down
their governments. But the problem with that is, they don't have any institutional support.
They don't have any institutional support from large media organizations, and often
as human rights organizations, we don't have that chain of connection to those people because
they've popped up, not from the traditional activist community, but from where the trouble
is. And so I think what it behooves us to do is to spend a lot more time trying to accelerate
our ability to reach out and assist those people, wherever they come from.
CHAN: That's a great note to end on. We, unfortunately, are out of time but you are welcome to come
up and talk to the panelists and I'd like to ask for a warm round of applause.