1
00:00:00,362 --> 00:00:02,370
Whistle-blowing is a form of civil disobedience.
2
00:00:02,370 --> 00:00:06,759
The NSA is trying to vacuum up as much of the internet as possible.
3
00:00:06,759 --> 00:00:10,174
If you can find a way to block that attack, we'll give you an extra $50,000.
4
00:00:10,174 --> 00:00:12,259
We need to be ready; people are gonna try and exploit it.
5
00:00:12,259 --> 00:00:14,620
People always try to exploit something new, right?
6
00:00:14,620 --> 00:00:17,209
I know how viruses work. I know how they can get in.
7
00:00:17,209 --> 00:00:18,704
I just don't want to open that door.
8
00:00:18,704 --> 00:00:22,473
Even if they put a wet washcloth over my face and told me, "We need you to give up the source,"
9
00:00:22,473 --> 00:00:23,709
I couldn't do it!
10
00:00:28,677 --> 00:00:31,386
Hey, everybody, welcome to "The Engadget Show" for the month of August.
11
00:00:31,386 --> 00:00:33,527
I'm Brian Heater joined by Terrence O'Brien.
12
00:00:33,527 --> 00:00:38,073
And we're here on Wall Street, and do you kind of feel like somebody's watching you?
13
00:00:38,073 --> 00:00:41,659
Yeah, we're actually here to talk about security and surveillance, and we've come to
14
00:00:41,659 --> 00:00:44,654
one of the number one destinations in New York to get your pockets picked.
15
00:00:44,654 --> 00:00:45,652
Yeah.
16
00:00:45,652 --> 00:00:48,032
Gonna kick things off with an interview with John McAfee.
17
00:00:48,032 --> 00:00:50,702
He's fresh off a vacation in Belize.
18
00:00:50,702 --> 00:00:52,366
I hear it was a very relaxing time.
19
00:00:52,366 --> 00:00:54,363
He had a good time. Made it back home safe, though.
20
00:00:54,363 --> 00:00:57,539
It's true, he's back in the States. He's making internet videos now.
I don't know if you had a chance to check out
"How to Uninstall McAfee Antivirus."
22
00:01:00,865 --> 00:01:05,722
I did. I think it's gotta be one of my top-five films
of all time. "Citizen Kane" of YouTube, I think.
23
00:01:05,722 --> 00:01:08,506
Good news. "Citizen Kane 2" is about to come out.
24
00:01:08,506 --> 00:01:13,053
We went behind the scenes at his most recent YouTube video shoot, and gotta say,
25
00:01:13,053 --> 00:01:17,491
probably the weirdest segment in the history of "The Engadget Show."
26
00:01:17,491 --> 00:01:18,961
I'm very excited about it.
27
00:01:18,961 --> 00:01:21,267
-Let's go get a churro. -Okay.
28
00:01:43,624 --> 00:01:46,908
MAN: "McAfee 2." Roll one, scene one, take four.
29
00:01:46,908 --> 00:01:50,393
Oh, hello there. My name is John McAfee.
30
00:01:50,393 --> 00:01:54,088
I'm the founder of the McAfee Antivirus software company.
31
00:01:54,088 --> 00:01:58,734
Many of you may know me from my past careers as an international fugitive,
32
00:01:58,734 --> 00:02:03,218
collector of young women, one-time yoga master and antivirus guru.
33
00:02:03,218 --> 00:02:05,788
What I do is what I want, basically.
34
00:02:05,788 --> 00:02:08,536
If you can't have fun doing what you're doing, then do something else.
35
00:02:08,536 --> 00:02:11,451
I've always done that. When I started McAfee, I had fun.
I've had nothing to do with McAfee Software for over 15 years.
38
00:02:16,790 --> 00:02:20,534
I've had more pressing things to do.
39
00:02:20,534 --> 00:02:23,743
Uh, I'm bored with software. I'm bored with computers.
40
00:02:25,865 --> 00:02:29,043
The Urban Dictionary definition for McAfee:
41
00:02:29,043 --> 00:02:35,975
"McAfee: A barely-passable virus-scanning program that updates at the worst possible times.
42
00:02:35,975 --> 00:02:40,067
"Tends to render a computer completely useless whenever it starts an update,
43
00:02:40,067 --> 00:02:44,979
which it doesn't ask to start and you cannot cancel or pause."
44
00:02:44,979 --> 00:02:49,824
Every antivirus piece of software consumes resources and time.
45
00:02:49,824 --> 00:02:52,666
It's the price you pay for a sense of security.
46
00:02:52,666 --> 00:02:55,391
I don't use any antivirus software, and never have.
47
00:02:55,391 --> 00:02:58,078
I have yet to have a virus. I'm being very honest with you.
48
00:02:58,078 --> 00:03:03,301
I just... I try to avoid doing things that would give me a virus.
49
00:03:03,301 --> 00:03:06,788
If I get email from someone I don't know, it goes into the trash.
Whether or not it's a tagger that wants to tag the building that's just been painted
54
00:03:21,041 --> 00:03:24,042
or someone trying to bring down the Pentagon.
55
00:03:24,042 --> 00:03:27,370
This is just the nature of reality, the nature of life.
56
00:03:27,370 --> 00:03:30,289
The premise for the first video was actually multifold.
57
00:03:30,289 --> 00:03:34,906
Number one, to point out the fact that look, please don't hang me for writing that software
58
00:03:34,906 --> 00:03:37,373
because I've had nothing to do with it for a long time.
59
00:03:37,373 --> 00:03:42,674
Although I've had nothing to do with this company for over 15 years, I still get volumes of mail
60
00:03:42,674 --> 00:03:46,259
asking, "How do I uninstall this software?"
61
00:03:46,259 --> 00:03:48,210
I have no idea.
62
00:03:48,210 --> 00:03:53,295
The second, I took the opportunity just to make fun of all the labels that the press has labeled me.
63
00:03:53,295 --> 00:03:55,711
"Paranoid," so I did that paranoid rant.
64
00:03:55,711 --> 00:03:58,957
I mean, it's always there, it's watching. It's been watching me for years.
65
00:03:58,957 --> 00:04:03,689
You know, "Surround myself with women and guns,"
well, I surrounded myself with women and guns.
66
00:04:07,997 --> 00:04:12,301
"A user of bath salts" which, seriously, I have enough money, even still,
67
00:04:12,301 --> 00:04:15,380
to buy good drugs if I wanted to do drugs,
68
00:04:15,380 --> 00:04:17,954
so why would I be doing bath salts, for heaven's sakes?
69
00:04:17,954 --> 00:04:20,122
You know, the worst drug on the planet.
And over half of the emails that I've received since the first video were
71
00:04:24,133 --> 00:04:26,755
"Damn, you're a badass! Keep this up."
72
00:04:26,755 --> 00:04:29,296
So I go, "Okay, I'll do one on badasses, then. Why not?"
73
00:04:29,296 --> 00:04:31,543
I have to give people what they want.
74
00:04:31,543 --> 00:04:35,579
Since I became a badass, my world has changed.
75
00:04:35,595 --> 00:04:38,543
Being exiled itself doesn't teach you much.
76
00:04:38,543 --> 00:04:42,064
I mean, that's just getting thrown on an airplane and landing in another country.
77
00:04:42,064 --> 00:04:45,908
But the month and a half that I spent evading the authorities
and running underground taught me a great deal.
79
00:04:48,031 --> 00:04:50,325
It's not the first time I've done that.
Back in the '70s, I was not a small-time drug dealer and spent a lot of time
81
00:04:54,232 --> 00:04:57,793
evading authorities in Mexico and Central America.
82
00:04:57,793 --> 00:04:59,837
So it's nothing new to me.
83
00:04:59,837 --> 00:05:03,509
The thing that it taught me is you have to know what you're doing when you're doing it
84
00:05:03,509 --> 00:05:05,417
and you have to understand the culture that you're in.
85
00:05:05,417 --> 00:05:09,164
I think the mistake that Snowden made was why didn't he stay in Hong Kong?
86
00:05:09,164 --> 00:05:12,041
I would love to be in Hong Kong if people were after me, you know?
87
00:05:12,041 --> 00:05:16,714
I'd get myself a tank top and just fade into the back alleys and start a new life.
88
00:05:16,714 --> 00:05:19,126
'Cause he's gonna have to start a new life somewhere.
89
00:05:19,126 --> 00:05:23,211
Going to Russia-- whoa! Who advised him to do that? Seriously.
90
00:05:23,211 --> 00:05:28,532
So, I've had to do what I do best-- help other people.
91
00:05:28,532 --> 00:05:32,609
I've now started the McAfee School for Badasses.
92
00:05:32,609 --> 00:05:34,405
Cut.
93
00:05:34,405 --> 00:05:36,309
-That's a good take. -That was good.
94
00:05:49,202 --> 00:05:52,378
Hey, coming up next we're heading to San Francisco to visit with EFF.
95
00:05:52,378 --> 00:05:55,589
-That's the Electronic Frontier Foundation. -Whoa!
96
00:05:55,589 --> 00:05:58,011
Dropped a little knowledge bomb on your head.
97
00:05:58,011 --> 00:06:01,377
Also, went to San Diego to speak to Cory Doctorow.
98
00:06:01,377 --> 00:06:04,480
Talked to Bruce Schneier, some top thinkers in the world of security.
99
00:06:04,480 --> 00:06:08,203
Yeah, and we also swung by "The New Yorker" to talk about their software called Strongbox.
100
00:06:08,203 --> 00:06:10,809
Lets people anonymously send them leaks and tips and information.
101
00:06:10,809 --> 00:06:13,480
Yeah, so stay secure out there.
102
00:06:13,480 --> 00:06:16,183
I know you're watching, Verizon.
103
00:06:33,044 --> 00:06:34,881
My name is Trevor Timm.
104
00:06:34,881 --> 00:06:40,222
I'm a digital rights analyst at the Electronic Frontier Foundation.
105
00:06:40,222 --> 00:06:46,354
We focus on all sorts of internet law, from free speech to privacy to copyright and fair use
106
00:06:46,354 --> 00:06:49,640
and basically we fight for the user.
107
00:06:49,640 --> 00:06:55,767
We're doing a lot on NSA surveillance programs, domestic drones.
108
00:06:55,767 --> 00:06:59,833
Also, all sorts of free speech and transparency issues.
109
00:07:04,032 --> 00:07:07,007
The NSA revelations didn't really surprise us.
110
00:07:07,007 --> 00:07:10,345
We've actually been suing over this for about six years now.
111
00:07:10,345 --> 00:07:14,425
These are a lot of the same revelations that came out in 2005 and 2006
112
00:07:14,425 --> 00:07:17,363
under Bush's warrantless wiretapping program.
113
00:07:17,363 --> 00:07:22,880
The only difference here is the FISA Court-- the secret court that issues surveillance orders--
is now basically sanctioning what the Obama administration is doing.
115
00:07:26,108 --> 00:07:29,711
Before, George Bush was just bypassing the court altogether.
116
00:07:29,711 --> 00:07:34,149
The FISA Amendments Act lets the government issue these broad surveillance orders
to these companies which they're forced to comply with.
118
00:07:36,317 --> 00:07:39,484
Yahoo! back in 2008 tried to secretly fight these in court.
119
00:07:39,484 --> 00:07:42,162
They unfortunately lost, so they started complying.
120
00:07:42,162 --> 00:07:45,880
Twitter tells their users if they received a surveillance order.
121
00:07:45,880 --> 00:07:49,353
They were even reported to be the only holdout of the major companies
122
00:07:49,353 --> 00:07:51,982
in the secret PRISM surveillance program.
123
00:07:55,305 --> 00:07:58,333
Edward Snowden is a hero. Total hero.
124
00:07:58,333 --> 00:08:00,241
Model for the future of activism.
125
00:08:00,241 --> 00:08:04,059
Right? Whistle-blowing is a form of civil disobedience.
126
00:08:04,059 --> 00:08:05,468
I'm Cory Doctorow.
127
00:08:05,468 --> 00:08:09,133
I write science-fiction novels, and I'm one of the editors and owners of Boing Boing.
128
00:08:09,133 --> 00:08:13,359
And I'm a journalist and I'm an activist and a fellow at the Electronic Frontier Foundation.
129
00:08:13,359 --> 00:08:17,637
We've had telescopes for a long time, but it's not inevitable that we'd all become peeping
Toms.
130
00:08:17,637 --> 00:08:20,240
We had a social contract about telescopes, right?
131
00:08:20,240 --> 00:08:24,286
People who look through telescopes through their neighbors' windows are creeps.
132
00:08:24,286 --> 00:08:28,320
Right? If you find out that your friend watches the neighbors through a telescope
133
00:08:28,320 --> 00:08:30,497
you shun your friend, right?
134
00:08:30,497 --> 00:08:33,049
The reason we have a surveillance state is not
because surveillance is inevitable
when you have computers. It's because spooks have no adult supervision,
right?
136
00:08:37,177 --> 00:08:41,395
They have subscribed to the greater haystack theory of terrorism prevention,
137
00:08:41,395 --> 00:08:45,310
which is to say that if you've got a small haystack with some needles in it,
138
00:08:45,310 --> 00:08:49,467
of terrorism that you can't find, the thing to do to make those needles easier to find
is to make the haystack bigger, on the grounds that more needles will somehow magically appear.
140
00:08:53,384 --> 00:08:56,558
You know, gathering information without any particularized suspicion
141
00:08:56,558 --> 00:09:01,677
in enormous mountains will somehow make terrorism more visible to you.
142
00:09:01,677 --> 00:09:04,465
And it's a... it's a... a nonsense, right?
143
00:09:04,465 --> 00:09:06,205
It's nonsense on stilts.
144
00:09:06,205 --> 00:09:10,365
The NSA is trying to vacuum up as much of the internet as possible.
145
00:09:10,365 --> 00:09:14,149
And right now they're building a facility in Utah which is going to be basically
146
00:09:14,149 --> 00:09:17,105
the biggest data storage facility ever built.
147
00:09:17,105 --> 00:09:22,255
This database probably has everybody's phone record from 2006 onward.
148
00:09:22,255 --> 00:09:26,052
You know, it's a question of how effective this even is, you know?
149
00:09:26,052 --> 00:09:30,796
If they're looking for a needle in a haystack, why add tons of more hay?
150
00:09:32,688 --> 00:09:37,680
It's pretty much impossible that all of the cell phone data of everybody in the United States
151
00:09:37,680 --> 00:09:40,435
is relevant to any investigation.
152
00:09:40,435 --> 00:09:42,273
It's pretty much impossible.
153
00:09:42,273 --> 00:09:43,386
I'm Bruce Schneier.
154
00:09:43,386 --> 00:09:46,913
I basically work in the intersection of security, technology and people.
155
00:09:46,913 --> 00:09:48,804
What they're looking for, we don't know.
156
00:09:48,804 --> 00:09:52,969
Is it able to find terrorist plots? Almost certainly it's not.
157
00:09:52,969 --> 00:09:59,264
People who study big data say there's too much data, false positives kill you.
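Schneier's point about false positives is a base-rate effect, and it can be sketched with Bayes' theorem. Every number below is a hypothetical assumption for illustration, not a figure from the show:

```python
# Illustrative base-rate calculation: even a very accurate detector,
# applied to an entire population, drowns in false positives.
# All numbers here are assumptions chosen only for the sketch.

population = 300_000_000      # people whose records are swept up
true_threats = 300            # actual plotters (assumed)
sensitivity = 0.99            # P(flagged | real threat)
false_positive_rate = 0.001   # P(flagged | innocent) -- 0.1%, very optimistic

true_alarms = true_threats * sensitivity
false_alarms = (population - true_threats) * false_positive_rate

# P(real threat | flagged), by Bayes' theorem
precision = true_alarms / (true_alarms + false_alarms)

print(f"false alarms: {false_alarms:,.0f}")
print(f"P(real threat | flagged): {precision:.6f}")
```

Even with these generous assumptions, fewer than one flag in a hundred points at a real threat, which is why "make the haystack bigger" doesn't help.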
158
00:09:59,264 --> 00:10:02,064
There is no security; there's only security in a context.
159
00:10:02,064 --> 00:10:08,488
So am I anonymous in relationship to the people walking their dog over there? Yes.
160
00:10:08,488 --> 00:10:12,721
Am I anonymous in relationship to the phone company who can track my SIM as it moves
from tower to tower and my IMEI in case I swap SIMs? No.
162
00:10:16,562 --> 00:10:23,733
I think it's really scary for people-- especially innocent people-- who have no reason to believe
that they've done anything wrong, and yet the government has all this information.
164
00:10:27,131 --> 00:10:33,731
When you look at some of the stuff the NSA is doing now, they're bypassing the cryptography.
165
00:10:33,731 --> 00:10:37,527
Now, you have an encrypted channel between you and your Gmail account, right?
166
00:10:37,527 --> 00:10:39,921
It uses SSL.
167
00:10:39,921 --> 00:10:44,556
But if the NSA can go into Google and say, "Give me the email,"
168
00:10:44,556 --> 00:10:47,850
it doesn't matter if it's encrypted.
169
00:10:47,850 --> 00:10:52,055
We can encrypt things so that nobody can decrypt them, period.
170
00:10:52,055 --> 00:10:56,526
I mean, that... you know, computers the size of planets and millions of years,
171
00:10:56,526 --> 00:10:58,890
we can throw out all those big numbers.
172
00:10:58,890 --> 00:11:02,624
But as soon as you start building actual systems--
173
00:11:02,624 --> 00:11:06,750
software on computers on networks used by people--
174
00:11:06,750 --> 00:11:11,112
suddenly the math becomes just a small part of overall security.
175
00:11:11,112 --> 00:11:14,598
And there are many, many ways to break into a system.
176
00:11:14,598 --> 00:11:19,005
I have full-disk encryption on a Linux box that I carry around in my bag,
177
00:11:19,005 --> 00:11:25,641
so if I lose the machine the data is proof against anyone except a state-level actor.
178
00:11:25,641 --> 00:11:28,968
I have an encrypted backup drive that I travel with and another one on my desk,
179
00:11:28,968 --> 00:11:31,057
so that's my personal data strategy.
180
00:11:31,057 --> 00:11:37,534
You know, you can download Tor which is an anonymizer, so it basically hides your location
of where your computer is.
181
00:11:37,534 --> 00:11:40,973
OTR chat capabilities-- stands for "Off The Record."
182
00:11:40,973 --> 00:11:45,945
It encrypts your chats so that the services can't actually see what you're saying.
183
00:11:45,945 --> 00:11:51,051
You can also download email encryption tools like PGP encryption,
184
00:11:51,051 --> 00:11:52,819
which stands for "Pretty Good Privacy."
185
00:11:52,819 --> 00:11:58,313
These tools are still fairly hard to use, but if you can master them and get your friends to master them,
186
00:11:58,313 --> 00:12:04,137
it does provide a way to talk securely online.
187
00:12:04,137 --> 00:12:06,719
PGP is still way too hard to use for civilians.
188
00:12:06,719 --> 00:12:08,220
I won't pretend it's not.
189
00:12:08,220 --> 00:12:11,993
It's worth the time you invest, but that's a couple hours that everybody's like,
190
00:12:11,993 --> 00:12:13,654
"Well, I could be watching 'Doctor Who.'"
191
00:12:13,654 --> 00:12:19,676
If we spent as much time figuring out the UI for PGP,
like as much human hours as we did for "FarmVille,"
192
00:12:19,676 --> 00:12:24,217
we would have totally wicked email security by default everywhere.
193
00:12:24,217 --> 00:12:28,969
Most cloud services are more secure because the people who run them
do a better job than the users at security.
195
00:12:31,707 --> 00:12:34,642
But we're not talking about security against attackers.
196
00:12:34,642 --> 00:12:39,556
I mean there's great reasons why you do this-- performance, convenience.
197
00:12:39,556 --> 00:12:45,355
We love Gmail, but you're still trusting Google with your mail.
198
00:12:45,355 --> 00:12:49,307
And Google can give a copy to the government. You can't stop them.
199
00:12:49,307 --> 00:12:55,567
The best and simplest thing people can do, if they're looking to help solve
200
00:12:55,567 --> 00:12:59,353
this NSA surveillance problem is call their representative.
201
00:12:59,353 --> 00:13:02,975
There are half a dozen bills in Congress right now
that need supporters,
202
00:13:02,975 --> 00:13:07,276
and the only way representatives are gonna start supporting them
203
00:13:07,276 --> 00:13:10,644
is if they hear from their constituents that they really care about this issue
and that their job in Congress actually depends on it.
205
00:13:13,972 --> 00:13:16,479
Congress definitely has the power to set up these courts,
206
00:13:16,479 --> 00:13:18,959
and it's definitely in their power to rein them in.
207
00:13:18,959 --> 00:13:23,338
You can throw away your computer and your cell phone, but good luck.
208
00:13:25,965 --> 00:13:29,248
I'm Nicholas Thompson. I'm the Editor of newyorker.com.
209
00:13:32,401 --> 00:13:34,773
So, we invented a thing called Strongbox.
210
00:13:34,773 --> 00:13:39,424
The idea of Strongbox is to create a secure way--
for sources or for anybody--
211
00:13:39,424 --> 00:13:42,118
to communicate with us, without there being any way
212
00:13:42,118 --> 00:13:44,859
of anybody being able to figure out where that person came from
213
00:13:44,859 --> 00:13:46,360
or how they reached out to us.
214
00:13:46,360 --> 00:13:49,891
So, people can talk to journalists without having
to risk their lives, or...
215
00:13:49,891 --> 00:13:52,479
We'll be able to get out documents, we'll be able to expose corruption,
216
00:13:52,479 --> 00:13:54,893
we'll be able to, you know, be whistle-blowers.
217
00:13:54,893 --> 00:13:57,889
So, the way it works is this: I'm a source.
218
00:13:57,889 --> 00:14:01,290
I'm a person or I'm somebody. I want to get information to "The New Yorker."
219
00:14:01,290 --> 00:14:04,370
So, I say, "Okay, I can log on through Tor."
220
00:14:04,370 --> 00:14:06,877
Tor is an identity-protection system.
221
00:14:06,877 --> 00:14:10,336
The document is PGP-encrypted, so it's hard, even if somebody were to find it
222
00:14:10,336 --> 00:14:11,813
to figure out what's in the document.
223
00:14:11,813 --> 00:14:13,759
Then it comes to us.
224
00:14:13,759 --> 00:14:17,848
We take it and we take a little hard drive, we plug it into a computer.
225
00:14:17,848 --> 00:14:20,654
We put the document onto our hard drive where it's still encrypted.
226
00:14:20,654 --> 00:14:22,727
Then we go to a second computer.
227
00:14:22,727 --> 00:14:25,978
The second computer is not connected to the internet and it doesn't have a hard drive itself.
228
00:14:25,978 --> 00:14:29,075
And we put in the little thumb drive and then we finally decrypt it.
229
00:14:29,075 --> 00:14:34,146
So by the time we decrypt the document and see what it is, it's traveled securely,
230
00:14:34,146 --> 00:14:37,748
entirely encrypted, to something that has an air gap from the internet,
231
00:14:37,748 --> 00:14:39,706
and you've been protected through Tor.
232
00:14:39,706 --> 00:14:43,064
Meanwhile, they also built a whole lot of code
to protect the system,
233
00:14:43,064 --> 00:14:46,314
protect the integrity of it and make it harder for people to hack in.
234
00:14:46,314 --> 00:14:48,310
But it means that if somebody were to come to us and say,
235
00:14:48,310 --> 00:14:51,543
"Hey! That document you have, we really want to find out where it came from,"
236
00:14:51,543 --> 00:14:56,235
we would just say, "We have no idea and there's no way we can figure it out."
237
00:14:56,235 --> 00:14:58,318
Strongbox is two things. One, it's a one-way channel.
238
00:14:58,318 --> 00:15:00,559
Somebody can send us a document and walk away.
239
00:15:00,559 --> 00:15:03,228
We'll have no way of ever finding them, we'll have no mechanism of finding them.
240
00:15:03,228 --> 00:15:06,397
But they also, if they want to engage, they can send us a message and say
241
00:15:06,397 --> 00:15:08,808
"Hey, I want to hear back from you." And then we'll write back to them.
242
00:15:08,808 --> 00:15:12,732
We won't have a direct email address or anything. We'll just put a little message on a bulletin board.
243
00:15:12,732 --> 00:15:15,993
They-- through a secret password that only they have--
will be able to read it.
244
00:15:15,993 --> 00:15:18,589
So there will be some mechanism for two-way communication.
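The reply mechanism Thompson describes can be sketched as a passphrase-derived mailbox. This is a hypothetical illustration, not the actual Strongbox code: the source keeps a random passphrase, the server stores replies under an opaque one-way ID, and the newsroom never learns who the source is:

```python
import hashlib
import secrets

def new_source_passphrase() -> str:
    # The source generates and memorizes a passphrase; the newsroom
    # keeps no record tying it to a person.
    return secrets.token_urlsafe(16)

def mailbox_id(passphrase: str) -> str:
    # One-way derivation: the stored ID reveals nothing about
    # the passphrase or the source.
    return hashlib.sha256(passphrase.encode()).hexdigest()[:16]

# In-memory stand-in for the bulletin board.
bulletin_board: dict = {}

def post_reply(passphrase: str, reply: str) -> None:
    bulletin_board.setdefault(mailbox_id(passphrase), []).append(reply)

def read_replies(passphrase: str) -> list:
    return bulletin_board.get(mailbox_id(passphrase), [])

phrase = new_source_passphrase()
post_reply(phrase, "Can you confirm the date on page 3?")
assert read_replies(phrase) == ["Can you confirm the date on page 3?"]
```

The design point is that even under coercion the newsroom holds only the hash, so it literally cannot produce an identity.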
245
00:15:18,589 --> 00:15:23,069
Again, even if they engage in that, we won't have any way of figuring out who they are
unless they tell us.
246
00:15:23,069 --> 00:15:25,747
We can say, "Look, our journalists will go to jail protecting your identity;
247
00:15:25,747 --> 00:15:28,730
we won't give up information that will reveal your identity,"
248
00:15:28,730 --> 00:15:31,487
but still the government has all kinds of powerful tools of coercion.
249
00:15:31,487 --> 00:15:33,963
The best thing we can say is,
250
00:15:33,963 --> 00:15:39,162
"We couldn't even find you if we wanted to, so they can't get your information from us."
251
00:15:39,162 --> 00:15:41,872
Even if they put a, you know, wet washcloth over my face and told me
252
00:15:41,872 --> 00:15:45,032
"We need you to give up the source," I couldn't do it.
253
00:15:45,032 --> 00:15:47,267
We do know the identity of a lot of leakers,
254
00:15:47,267 --> 00:15:50,101
and there is something to be said for learning their identities.
255
00:15:50,101 --> 00:15:53,832
It makes it seem a little more real. We can understand where their passion came from.
256
00:15:53,832 --> 00:15:57,134
We can understand why they wanted to talk about these things.
257
00:15:57,134 --> 00:16:00,408
I edited stories by... Ryan Lizza, a political reporter...
258
00:16:00,408 --> 00:16:04,483
all kinds of secret documents that he acquired, were given to him, that he found out.
259
00:16:04,483 --> 00:16:09,000
There's a long, long history of investigative reporting here
that has relied on documents.
261
00:16:10,482 --> 00:16:14,316
Jane Mayer's used it a whole bunch in exposure of abuses in the secret detainee program.
262
00:16:14,316 --> 00:16:18,946
There are a lot of reporters... Sy Hersh absolutely has broken all kinds of things throughout his career.
263
00:16:18,946 --> 00:16:22,886
So, there are a number of people here who rely on secret documents, who rely on sources
who trust that their contributions will be kept anonymous, and this is another mechanism
for communication.
265
00:16:28,908 --> 00:16:31,947
The system is open source. The code that went into it is entirely open source.
266
00:16:31,947 --> 00:16:34,476
Anybody else can build it. They have to download the code,
267
00:16:34,476 --> 00:16:37,213
they have to buy some old laptops, and they have to buy some thumb drives.
268
00:16:37,213 --> 00:16:41,284
The desire of the people who built it was that this would start at "The New Yorker"
269
00:16:41,284 --> 00:16:42,866
but then it would spread places.
270
00:16:42,866 --> 00:16:47,384
So, we're gonna do what we can to continue to let people know that this is out there.
271
00:16:57,998 --> 00:17:00,336
So, I think we can both agree that in the past,
272
00:17:00,336 --> 00:17:04,125
fictional depictions of computer security and hacking--
tad on the absurd side.
273
00:17:04,125 --> 00:17:06,957
I mean, "Hackers" is one of my top-five films of all time,
274
00:17:06,957 --> 00:17:09,348
but for argument's sake, I'm gonna give you this one, Terrence.
275
00:17:09,348 --> 00:17:12,020
Well, Ubisoft has a new game coming out. It's called "Watch Dogs."
276
00:17:12,020 --> 00:17:15,821
And they've worked very closely with the Russian security firm called Kaspersky
277
00:17:15,821 --> 00:17:18,521
to make sure that the game is as realistic as possible.
Sure, you can take all those hacking skills that you've learned playing video games
279
00:17:22,369 --> 00:17:26,371
and you can use them to break into Microsoft's different software systems
280
00:17:26,371 --> 00:17:28,774
and maybe put a little cash on the side.
281
00:17:28,774 --> 00:17:30,895
You considering a career switch?
282
00:17:30,895 --> 00:17:33,408
-Let's talk when the camera's off. -Okay.
283
00:17:36,289 --> 00:17:41,219
NEWSCASTER: A citywide manhunt is underway for the suspected vigilante Aiden Pearce.
284
00:17:41,219 --> 00:17:45,102
Engaged in several bold interventions, Pearce has divided the city, with locals praising his actions...
285
00:17:45,102 --> 00:17:48,348
-(cell phone dials) -OPERATOR: 911. State your emergency, please.
286
00:17:48,348 --> 00:17:49,941
MAN: Lose it!
287
00:17:49,941 --> 00:17:53,068
KEVIN SHORTT: "Watch Dogs" is about a guy named Aiden Pearce.
288
00:17:53,068 --> 00:17:56,781
He's a modern-day vigilante who made some mistakes in his past
289
00:17:56,781 --> 00:18:00,392
and those mistakes bit him in the ***, hurt his family,
290
00:18:00,392 --> 00:18:04,863
and now he's out to find out who hurt his family, and he wants to protect his family
291
00:18:04,863 --> 00:18:06,507
and make sure it never happens again.
292
00:18:06,507 --> 00:18:09,926
My name's Kevin Shortt. I'm the lead story designer on "Watch Dogs."
293
00:18:09,926 --> 00:18:14,413
And I've been on the project for... since the beginning, about five years.
294
00:18:14,413 --> 00:18:19,426
Right at the beginning, during conception, we very quickly decided that we wanted something new,
295
00:18:19,426 --> 00:18:22,041
we wanted new game dynamics for players.
296
00:18:22,041 --> 00:18:26,218
And that became, "Let's make a city that players can control."
297
00:18:26,218 --> 00:18:28,970
And the way you control it, that quickly became hacking.
298
00:18:28,970 --> 00:18:31,336
And then of course we all had our cell phones and we realized,
299
00:18:31,336 --> 00:18:35,198
well, that's the tool, that's the thing we're gonna use, because cell phones are everywhere
they're an inconspicuous weapon, and out of that, all our ideas grew.
301
00:18:39,039 --> 00:18:45,047
"Watch Dogs" is very special in the sense that every entertainment project
is based on using a fantasy.
303
00:18:47,245 --> 00:18:52,139
"Watch Dogs" has this very peculiar thing in that this fantasy is reality.
304
00:18:52,139 --> 00:18:57,425
We're showcasing a reality people don't see usually or choose not to see.
305
00:18:57,425 --> 00:19:01,175
Because we all know the risks of using phones
and being connected and having accounts and everything,
307
00:19:03,391 --> 00:19:08,124
but we decide not to be too cautious or not to be too worried about it.
308
00:19:08,124 --> 00:19:09,822
My name is Thomas Geffroyd.
309
00:19:09,822 --> 00:19:14,900
I'm dealing with authenticity of the content and also consistency of the universe.
310
00:19:14,900 --> 00:19:18,782
The one thing we really wanted to make sure was that anything that we say you can hack
311
00:19:18,782 --> 00:19:22,093
in our game is not a magic power, it's not a super-power.
312
00:19:22,093 --> 00:19:24,567
It's grounded in some sort of reality.
313
00:19:24,567 --> 00:19:27,042
We've found research that says, "Okay, this is possible."
314
00:19:27,042 --> 00:19:30,829
At the time when we started, some of them felt
a bit near-future, you know?
315
00:19:30,829 --> 00:19:34,308
They were theoretically possible, but we hadn't really seen them.
316
00:19:34,308 --> 00:19:36,933
We did a hell of a lot of research on hacking.
317
00:19:36,933 --> 00:19:40,280
We knew rudimentary stuff, like everybody, but we started looking into
318
00:19:40,280 --> 00:19:45,009
just all the types of hacks that are out there that we didn't even know existed.
319
00:19:45,009 --> 00:19:49,602
Basically, the security and hacking community is a very open one,
320
00:19:49,602 --> 00:19:53,099
so there is a lot of information you can get if you look a little into it.
321
00:19:53,099 --> 00:19:58,450
Then we also reached out to some pros in the field, one of those being Kaspersky Labs.
322
00:19:58,450 --> 00:20:01,594
What we ended up doing was we sort of showed them our game design ideas,
323
00:20:01,594 --> 00:20:03,523
and then we sent them the full script.
324
00:20:03,523 --> 00:20:06,574
And basically we had incredible feedback from them.
325
00:20:06,574 --> 00:20:11,119
Most of their analysts are gamers, also, so they were thrilled to work with us.
326
00:20:11,119 --> 00:20:14,789
And they just wanted to help us get some of the language right and make sure
327
00:20:14,789 --> 00:20:17,666
the tone is right and that it's being as true as possible,
328
00:20:17,666 --> 00:20:22,165
while also being a game that's gonna be fun and exciting for players to play.
329
00:20:22,165 --> 00:20:25,010
When we first started this project, we zeroed in quickly on Chicago
330
00:20:25,010 --> 00:20:31,339
'cause we realized Chicago's the kind of city that would embrace a new idea like CtOS.
331
00:20:31,339 --> 00:20:34,238
CtOS stands for "Central Operating System."
332
00:20:34,238 --> 00:20:39,637
It's basically a smart system so everything in the city--
the electricity grid, water,
333
00:20:39,637 --> 00:20:45,450
communications-- it's all centralized and controlled through one major system.
334
00:20:45,450 --> 00:20:49,864
And what that does is it makes commutes for citizens
a lot faster.
335
00:20:49,864 --> 00:20:51,878
All the traffic lights are perfectly synchronized.
Hydro bills, all that sort of thing, they pay less.
337
00:20:55,732 --> 00:20:57,823
And communications are a lot quicker.
338
00:20:57,823 --> 00:21:03,116
It has a system where the police can anticipate where crimes may or may not happen.
339
00:21:03,116 --> 00:21:06,230
Which is great for the citizens, great for the cops, 'cause they don't have to have
340
00:21:06,230 --> 00:21:10,603
as many people on the streets, they can just react much quicker.
341
00:21:10,603 --> 00:21:16,552
It's also great for somebody like Aiden Pearce who can hack into the system and take advantage of that.
342
00:21:16,552 --> 00:21:20,283
I think smart cities are coming. I think they're definitely coming.
343
00:21:20,283 --> 00:21:22,855
And you know, we need to be ready. People are gonna try and exploit it.
344
00:21:22,855 --> 00:21:25,856
People always try to exploit something new, right?
345
00:21:25,856 --> 00:21:30,243
We should embrace this approach of smart cities, but as we showcase,
346
00:21:30,243 --> 00:21:34,468
we have to be thinking about what it means also and how we can prevent
347
00:21:34,468 --> 00:21:37,500
anything bad from happening with those grid systems.
348
00:21:37,500 --> 00:21:42,711
It's important to feel secure, because the concept of the Maslow pyramid
349
00:21:42,711 --> 00:21:45,421
puts security at the bottom of it.
350
00:21:45,421 --> 00:21:47,211
Basically it's a basic need.
351
00:21:47,211 --> 00:21:51,221
You have to feel secure to be okay and to live
a fulfilling life.
352
00:21:51,221 --> 00:21:55,952
The biggest thing we want out of this is we want players to finish the game and have a
dialogue
353
00:21:55,952 --> 00:21:57,527
about what's going on.
354
00:21:57,527 --> 00:22:01,042
'Cause it's not really our place to say whether tech is good or bad,
355
00:22:01,042 --> 00:22:04,286
whether smart cities are right or wrong,
356
00:22:04,286 --> 00:22:10,044
but really it's something that we think is important to discuss, to talk about.
357
00:22:10,044 --> 00:22:12,921
Because we're moving fast. Technology's moving so fast.
358
00:22:12,921 --> 00:22:16,131
And it's worthwhile for us to kind of slow down and go, whoa, whoa, whoa, okay.
359
00:22:16,131 --> 00:22:19,651
What do we think about this? What are the repercussions of this?
360
00:22:19,651 --> 00:22:22,140
Am I as secure as I should be?
361
00:22:22,140 --> 00:22:25,648
Who'll watch the watchdogs? Who is watching those guys?
362
00:22:25,648 --> 00:22:28,224
Who will make sure that it's used right?
363
00:22:28,224 --> 00:22:29,697
We can't go back.
364
00:22:29,697 --> 00:22:33,605
We can't suddenly go, "Okay, let's unplug everything," and hide in caves.
365
00:22:33,605 --> 00:22:36,662
That's our world, that's what we live in, so we just have to be smart about it
366
00:22:36,662 --> 00:22:40,887
and put up the best protection that we can.
367
00:23:10,644 --> 00:23:12,532
Everybody's got phones, phones, phones.
368
00:23:12,532 --> 00:23:14,426
Everybody's got phones, phones, phones.
369
00:23:14,426 --> 00:23:16,360
Everybody's got phones, phones, phones.
370
00:23:16,360 --> 00:23:20,527
Everybody's got phones, phones, phones, phones, phones.
371
00:23:20,527 --> 00:23:22,267
Everybody's got phones.
372
00:23:22,267 --> 00:23:24,401
I'm sick of my phone.
373
00:23:24,401 --> 00:23:25,932
I love my phone.
374
00:23:25,932 --> 00:23:29,360
I tried to make all those hippie pacts with myself like
375
00:23:29,360 --> 00:23:33,385
"I'm not gonna look at my smartphone when I'm at dinner with my friends."
"I'm going to restrict myself from using my smartphone
377
00:23:37,020 --> 00:23:42,018
"because I want to be a living, breathing human being, a citizen of the world
378
00:23:42,018 --> 00:23:47,401
who talks to real live people and doesn't just stare
at his gizmo."
379
00:23:47,401 --> 00:23:50,689
But I have violated all my own pacts.
380
00:23:50,689 --> 00:23:55,805
I don't care anymore. I have no personal integrity.
381
00:23:55,805 --> 00:24:02,153
If I am stopped at a red light, I will use that time to look at Twitter,
382
00:24:02,153 --> 00:24:07,030
and that is not an expansive amount of time, sitting at a red light.
383
00:24:07,030 --> 00:24:13,573
That is not a period of time that I used to rue the waste of.
384
00:24:13,573 --> 00:24:18,269
I did not use to sit at stoplights and go, "God! I could be doing something!"
385
00:24:18,269 --> 00:24:21,789
I'd just sit there. Not now.
386
00:24:21,789 --> 00:24:27,027
I literally look at my smartphone
387
00:24:27,027 --> 00:24:32,435
while my infant daughter is begging me for food.
388
00:24:32,435 --> 00:24:35,619
I feel like I can do both things.
389
00:24:35,619 --> 00:24:40,544
I can feed her and raise her well and be looking at my smartphone the whole time.
390
00:24:40,544 --> 00:24:43,057
I don't see a conflict anymore.
391
00:24:43,057 --> 00:24:48,148
I would have a couple of years ago, when I was a hippie.
392
00:24:48,148 --> 00:24:50,338
Everybody's got phones, phones, phones.
393
00:24:50,338 --> 00:24:52,214
Everybody's got phones, phones, phones.
394
00:24:52,214 --> 00:24:53,972
Everybody's got phones, phones, phones.
395
00:24:53,972 --> 00:24:55,440
Everybody's got phones, phones, phones.
396
00:24:55,440 --> 00:24:55,794
Everybody's got phones, phones, phones.
397
00:24:55,794 --> 00:24:58,092
Everybody's got phones, phones, phones.
398
00:25:00,431 --> 00:25:04,160
My name is Katie Moussouris, and I'm a senior security strategist for Microsoft.
399
00:25:07,159 --> 00:25:09,655
Our programs are pretty new.
400
00:25:09,655 --> 00:25:12,099
We've been running ours for about three weeks now.
401
00:25:12,099 --> 00:25:16,029
It will be interesting to see if people actually start coming to us
402
00:25:16,029 --> 00:25:18,376
and making a living from our bounty programs.
403
00:25:18,376 --> 00:25:22,212
The three bounties that we're offering are the Mitigation Bypass Bounty,
404
00:25:22,212 --> 00:25:26,305
and that's up to $100,000 for a brand-new exploitation technique.
405
00:25:26,305 --> 00:25:30,410
If we can learn about these techniques and kind of learn about the holes in the shield,
406
00:25:30,410 --> 00:25:35,227
we can protect against entire classes of vulnerabilities, which is why the payment is so high.
407
00:25:35,227 --> 00:25:39,414
That's an ongoing program, and it's not tied to any contest or special event.
408
00:25:39,414 --> 00:25:41,812
Though we will be offering some live judging
409
00:25:41,812 --> 00:25:45,293
at the Black Hat Conference at the Microsoft booth around noon every day.
410
00:25:45,293 --> 00:25:49,132
The second program is called the BlueHat Bonus For Defense,
411
00:25:49,132 --> 00:25:52,615
and that is up to $50,000 for a defensive idea.
412
00:25:52,615 --> 00:25:58,165
So the idea is, if you find a new way to bypass our platform-wide defenses,
413
00:25:58,165 --> 00:26:02,525
if you can find a way to block that attack, we'll give you an extra $50,000.
414
00:26:02,525 --> 00:26:07,819
And then the third program: for the first 30 days of the IE11 preview period,
415
00:26:07,819 --> 00:26:14,826
we are offering up to $11,000 for vulnerabilities in IE11 preview.
416
00:26:14,826 --> 00:26:19,370
These are the first programs in Microsoft's history to offer direct cash payments
417
00:26:19,370 --> 00:26:24,364
in exchange for vulnerability information and information about exploitation techniques.
418
00:26:24,364 --> 00:26:28,412
The earlier we can learn about what we call these holes in the shield,
419
00:26:28,412 --> 00:26:30,359
the better for us and our customers.
420
00:26:30,359 --> 00:26:34,870
We were basically trying to increase the win-win between the security researcher community
421
00:26:34,870 --> 00:26:36,395
and Microsoft's customers.
422
00:26:36,395 --> 00:26:39,822
That's actually my whole job at Microsoft: working with the hacker community.
423
00:26:39,822 --> 00:26:42,891
And I, you know, sort of grew up in this community myself
424
00:26:42,891 --> 00:26:48,920
and, you know, learned from some of the great minds in the Boston area.
425
00:26:48,920 --> 00:26:52,222
I grew up around some of the folks that were in the L0pht,
426
00:26:52,222 --> 00:26:57,314
and I worked as a pen tester with @stake, which was founded by the L0pht.
427
00:26:57,314 --> 00:27:02,510
So, the hacker community can be pretty diverse and pretty interesting.
428
00:27:02,510 --> 00:27:07,116
We're really looking for people who can bring that new threat horizon
429
00:27:07,116 --> 00:27:11,852
to the attention of the developers and executives here at Microsoft.
430
00:27:11,852 --> 00:27:17,895
As a previous security researcher and penetration tester myself before I joined Microsoft,
431
00:27:17,895 --> 00:27:21,349
all the way through the past six years that I've
been with Microsoft,
432
00:27:21,349 --> 00:27:24,696
it's been really interesting to watch the evolution take place
433
00:27:24,696 --> 00:27:27,596
from, you know, the Trustworthy Computing memo
that Bill Gates wrote
434
00:27:27,596 --> 00:27:31,492
and that really put the impetus on the company to take security seriously,
435
00:27:31,492 --> 00:27:36,656
and you know, the evolution of all these outreach programs beginning in 2005
436
00:27:36,656 --> 00:27:41,154
with the first researcher appreciation party that Microsoft threw at Black Hat.
437
00:27:43,784 --> 00:27:48,421
I think the hacker community really wants to do the right thing, for the most part.
438
00:27:48,421 --> 00:27:52,951
They want to improve the security of the programs that we use every day,
439
00:27:52,951 --> 00:27:56,772
and that's why they tend to look for vulnerabilities in them.
440
00:27:56,772 --> 00:27:58,574
It's kind of what they do.
441
00:27:58,574 --> 00:28:02,067
And those who are interested in helping secure the products,
442
00:28:02,067 --> 00:28:04,679
they can come to us for the bounties at this point.
443
00:28:07,696 --> 00:28:10,735
Well, I don't know about you, Terrence, but I sure feel more secure.
444
00:28:10,735 --> 00:28:13,986
Me, too. This episode is like a security blanket. I feel all warm and fuzzy inside.
445
00:28:13,986 --> 00:28:17,097
Speaking of warm and fuzzy things, got a lot of people we need to thank.
446
00:28:17,097 --> 00:28:20,281
The Electronic Frontier Foundation, Cory Doctorow, Bruce Schneier.
447
00:28:20,281 --> 00:28:24,004
Not to mention Microsoft, Ubisoft and "The New Yorker."
448
00:28:24,004 --> 00:28:26,804
Yeah, um... oh, and of course John McAfee
449
00:28:26,804 --> 00:28:30,677
for easily one of the weirdest things I've ever experienced in my entire life.
450
00:28:30,677 --> 00:28:34,916
Oh, and speaking of weird people named John, as per usual, John Roderick.
451
00:28:34,916 --> 00:28:37,048
He's a strange and lovely man.
452
00:28:37,048 --> 00:28:40,016
And on the note of lovely men, thanks to Tim Stevens.
453
00:28:40,016 --> 00:28:43,145
Yep, thanks for guiding us for a couple of years now.
454
00:28:43,145 --> 00:28:46,146
Sure. Thanks to you, Terrence, for filling in those giant shoes he has.
455
00:28:46,146 --> 00:28:48,358
-You're welcome. -Size 24.
456
00:28:48,358 --> 00:28:50,230
It's freakish, really.
457
00:28:50,230 --> 00:28:53,422
Thanks to you guys for joining us. We'll be back very soon.
458
00:28:53,422 --> 00:28:57,217
And in the meantime, we've got a train to catch.
459
00:28:57,217 --> 00:29:00,335
Yes, yes, we do.
460
00:29:00,335 --> 00:29:02,050
Separate trains.
461
00:29:02,050 --> 00:29:03,634
(laughing)