[ Music ]
>> Welcome.
Welcome to the Statewide IT Conference.
We've been looking forward to this day for a long time.
It's one of the few times that we can get as many as possible of the one IU IT team together.
We've got a great day lined up today and tomorrow.
Some of you were here for pre-conference sessions yesterday.
This evening is the staff appreciation event out at the Cyberinfrastructure Building.
And I also want to give a shout out.
I know we've got some people watching the stream at various places as well.
First and foremost, the event would not be possible in any way without our sponsors.
I want to give a special shout out to Matrix Integration, who is our premier sponsor
and sponsoring for the two keynotes.
I saw Brenda and Chad here this morning.
We are immensely grateful to you for your long and productive partnership with Indiana University.
Also, to AT&T, Lynda.com, Dell, CDGG and Technology Integration.
They are immensely helpful in pulling off an event of this caliber and in this scale.
I don't know what the final registration numbers are.
Last I'd heard we were around 750.
I know last year we had over 750 walk in the door here.
So it really is a great gathering for the family.
During the break outside, I hope you will take a little time to visit their booths.
They've set up booths, and we've got some others from UITS showing off various things with IT training
or some of the emerging technologies as well.
So our theme this year is Always Learning.
Now, we're a university, so we teach.
And education is a core part of what we do.
But in selecting the theme this year, Always Learning, this is not about our primary mission
as a research and education university.
It's always learning about you.
What are you personally learning in developing your career in terms of new skills,
new insights in how to lead, new insights in how to manage and work with others.
We are an organization that is always learning.
And we're going to lead off this morning with what I think would just be an extraordinary panel.
As you saw in the opening comments, this is really an interesting time in our history,
in the history of our nation in the balance between privacy and security in a time
of what increasingly is turning out to be a cyberwar world.
And the news is filled every day with new revelations from Edward Snowden and others
about what's going on with our government.
So let me introduce our panelists.
I'm delighted this morning to welcome Congressman Lee Hamilton.
Lee Hamilton has served with grace and dignity for many years as Indiana Congressman.
First elected in 1965, as I recall, and served through 1999, and also served
as the vice-chair of the 9/11 Commission.
Lee, come on out and join us, please.
[ Applause ]
>> Hi Brad.
Glad to be with you.
>> Thank you.
>> Thank you, sir.
>> Our second panelist is Associate Professor Raquel Hill
from the School of Informatics and Computing.
I'm delighted that Raquel was willing to come back.
She is on sabbatical this year.
She works for the Center for Applied Cybersecurity Research here at Indiana University.
This semester she is visiting at Harvard University at the Center
for Research on Computation and Society.
Raquel, come and join us.
Welcome.
[ Applause ]
And our third panelist needs almost no introduction.
Professor Fred Cate from the Maurer School of Law.
Also the Director of the Center for Applied Cybersecurity Research and a number of titles
that would reach pretty much from here to the Indianapolis Airport
where you will find him many days en route to somewhere in the world or Washington, DC.
Fred, come and join us.
[ Applause ]
Thank you very much.
When we first decided to focus on privacy and security and suggested Fred, of course,
as one of our great experts in the area, one of the staff quipped
and said well, we've had Fred here before.
Should we just rename it the Cate-wide IT Conference?
And I said no, we'll stick with the classic naming.
There is so much going on.
My role today is simply to serve as moderator as we hear from our panelists.
This topic of privacy, security and increasingly what seems cyberwar,
as Richard Clarke wrote about.
The news seems to be filled with something every day.
So Congressman Hamilton, perhaps an opening comment.
What's on your mind as you see an unending chain of news stories about these topics?
>> Brad, thank you very much.
I'm delighted to be with all of you this morning.
Can't see any of you, but I'm delighted to be with you, and of course, with Raquel and Fred.
Let me make several comments very quickly.
First of all, I have to acknowledge a certain ambivalence
about these massive monitoring and surveillance techniques.
I cherish a zone of privacy.
I get upset when people invade my privacy.
But at the same time, I recognize that the threats and the vulnerabilities to our government are real
and that there are legitimate reasons for a government to monitor cyberspace.
I do not think we should abolish the NSA programs.
On the other hand, I don't think we have had nearly the checks and balances
on those programs that we should have had.
I'll get into that in discussion.
Second point would be that this surveillance monitoring is a massive extension
of government power.
It is without precedent in my lifetime and in yours.
All of this collection of data is unbelievably sophisticated.
The activities of the NSA have expanded much faster than its judgment.
And what has astounded me is that it has so little public debate or discussion.
The secrecy and the passivity and the timidity of the United States Congress is simply astounding.
These programs have been in development for almost a decade now.
And members of Congress, some of whom are better informed than others, have not seen fit during
that period of time even to have a public debate.
They've kept it secret.
I don't approve of that in any way.
The Congressional overseers of the intelligence community have been captivated, if not captured,
by those they are supposed to be supervising.
The next point is that my chief concern in all of this is the potential for abuse.
You may be willing to concede, I am not,
that those who have conducted these programs have conducted themselves carefully and correctly.
But even if that is true, the potential for abuse here is unlimited.
And once you give government power, that government rarely relinquishes power.
So we have to think here in a very long timeframe.
Not just right now, but decades in the future.
And the potential for abuse is large.
The next point is the policies are not basically going to change.
Every day now you read about it in the paper that Congress is doing this, that and the other.
I'll shorten this comment, Brad, but the basic point is the policy of massive collection
of data will not fundamentally change.
There are a lot of reasons for that.
I'll not go into them.
There's beginning to be more stirring of debate now.
You read about it in the paper this morning.
But it's still not close to any kind of congressional action.
Now what can we do in all of this?
Well, we can ask the right questions.
Can the scope of these intrusions into people's lives be minimized?
Do we need to collect all this data that isn't necessary to collect?
If we do collect it, can we do it with less intrusion?
We need greater oversight.
We need greater transparency.
We need more information.
We need more public debate.
And we need more constraints on these programs.
Power that is secret is dangerous.
Power that is secret should be avoided.
If it is necessary, then it must be subject to the Constitutional balances, and that means review
by the Congress, by the judiciary.
Both the Congress and the judiciary have been very slack in their oversight of these programs.
The final point would be simply this, that I think that with the vast expansion
of intelligence agencies, with the aggressive exploitation of these intrusive powers,
with the astounding surveillance and monitoring capabilities that government now has,
with the excessive classification of public information, you and I are confronted
with an unprecedented challenge to hold government accountable
and to constrain its reach into the lives of all of us.
The price of liberty is eternal vigilance.
>> Thank you [inaudible].
[ Applause ]
>> Raquel, you're in a unique position as a professor of computer science and informatics,
and one might say all these technology thingies of yours have caused this havoc.
What are you learning and thinking about these days while you're visiting at the Harvard Center.
>> Okay. Brad, first I want to thank you for inviting me.
I'm very honored to be sitting here with Congressman Hamilton and Professor Cate.
It's really a joy to be here.
As I've been reviewing all of the information that has been revealed by the documents,
there are three points that I want to make.
And they're all tied to core security principles.
The first is confidentiality.
And basically what confidentiality ensures is
that only authorized individuals are allowed to access information.
And so what we understand from the revelations in the documents is that that's something
that the NSA hasn't really spent a lot of time on.
How do we specify effective confidentiality policies
and how do we then enforce those policies?
So given what Congressman Hamilton has said about, you know, they're going to continue
to collect massive amounts of information, that is a major concern.
Because how then do you protect access to this information?
The second, you know, and part of the revelations, in addition to all of the surveillance
that has been done, and all the data that has been collected,
there also has been a weakening of protocols.
There has been the implementation of backdoors into software programs.
And so we get to the next point, which is integrity.
Integrity says that there should not be some unauthorized change to information, to a system,
without that change being detected.
And because these backdoors and weakening of protocols have been put in place,
these same protocols are being used to protect our critical infrastructure.
You know, our power grid; you know, our electronic transactions, everything we are so dependent upon,
you know, our infrastructure, our Internet infrastructure.
And now the mechanisms that are there to secure and protect us, you know, have been compromised.
And so there's an issue with integrity.
The last is availability.
And so we've moved from confidentiality, the revelation of the information.
We've gone on to understand that the integrity of our systems has been compromised.
And this is going to lead to these systems being under threat
and possibly not being available for us to use.
And so availability is the third core security principle which says that I expect
that a system will be available for use, you know?
And so because now we have more threats, you know, and there are more vulnerabilities in the system,
those backdoors, those vulnerabilities can be exploited by someone other than NSA.
And so that's going to -- that makes our systems more vulnerable to threat and attack.
And so I think I just kind of look at this from the perspective of a computer scientist,
especially one who specializes in security and, you know,
we see all these core principles being undermined.
>> So when we have a backdoor in software, it may not only be used
for what was thought to be its intended use.
>> Yes. And also, when you put in a backdoor, you can't assume that the fact
that the backdoor is there is going to always remain a secret or private.
Security by obscurity is not security at all.
>> Indeed.
Now I should tell our panelists, if you look at the audience and pretend you can see them,
they're wearing lanyards, a yellow or a blue lanyard of choice.
And I could say that the debate here is would you like more privacy or would you like more security.
And the answer, of course, is yes.
So the better way to frame the question is, are you more willing to give up security
to protect your privacy or are you more willing to give up privacy to protect your security?
The yellow cords are those who prefer --
they will give up some security because they want their privacy.
The blue lanyards are those who would give up some privacy to protect their security.
So we get a little of a realistic vote if we could see the audience.
Fred, I know that you have been deeply involved in many of these topics
and serve on several national commissions.
And I learned a lot in working with you this summer on some of these topics.
What is on your mind today in this area?
>> Well, first of all, thank you for having me back and thank you for letting me share the stage
with such distinguished colleagues.
I do miss the "Cate is Great" signs.
I keep looking for them, and already I feel my enthusiasm flagging in their absence.
But at least I don't have to look at "Shell is Swell" signs.
And that's something to be grateful for.
I'm worried about everything, but to be brief about it,
I worry most here about the process and the trajectories.
So let me take them in reverse order.
The trajectory, as I think Congressman Hamilton's already noted, is only toward more surveillance.
We're going to have more data available, more granular data, more digital data.
And the NSA can collect what it collects today by interpreting the law as it does --
saying, even though our focus as given by Congress is to collect data
for foreign intelligence purposes and only in connection with an authorized investigation,
we interpret that to mean everything we can get our hands on about everybody.
As you well know, everything about everybody is only going to increase,
and I suspect increase exponentially.
And so if we don't deal with this issue today when we're talking about metadata,
when we're talking about surveillance of foreign phone calls, when we're talking about prism
and collection of email, we're not going to have dealt with it when we're talking
about biomedical information or transactional information
with financial credit cards and our bank accounts.
So the stakes are only going to get higher with this trajectory.
The other concern for me is the process.
And that is, we have pretty clearly documented that the Director of National Intelligence
and the head of the NSA and other senior officials have lied to Congress,
have lied to the American public, have told what the Director
of National Intelligence later characterized in an interview as the least untruthful answer.
You know, if you were raising children,
would you feel comfortable with that notion?
"I've taught them never to tell anything but the least untruth possible."
I don't think we would accept it.
I wouldn't accept from a student in a classroom, and I certainly don't accept it
from a senior government official.
And I think we need to worry that if we have seen the current set of oversight structures
and mechanisms such as the US Congress circumvented in this very aggressive way,
what happens in the future when we may have people acting either with less good will or acting
where they believe that a particular threat is more immediate.
And so I worry about the trajectory, but I also worry extensively about the process.
>> So one thing I hear from Congressman Hamilton through Raquel and Fred is a bit of a statement
of who will watch the watchers in this with the backdoor and oversight and such?
And so I'd like to ask the audience: how many
of you have some direct role in securing systems in some way?
In managing or overseeing security systems, networks?
So we've got a lot of hands going up.
So Fred and I were discussing this this summer.
And I want to walk through a scenario.
Fred, under the law, and Lee may want to comment on this as well, under the law today,
let's say one of our technicians is approached at Starbucks and someone says hello,
I'm from the NSA and starts to request things.
What does the law say that person can or cannot do?
>> Well, the law has actually been somewhat improved in this area in recent years.
Initially the view of the legislation as interpreted by the Bush Administration was one
of absolute secrecy so that if provided the person established their bona fides
and it would probably be an FBI agent, he would have a badge.
At that moment, if you were capable of producing information requested, you were required by law
to do so and to do so without consulting anybody else.
>> Yeah, I want to drive that point home.
That means if an FBI agent or NSA agent duly credentialed approached someone in the GlobalNOC
or someone in HelpNet or someone at Verizon -- It doesn't matter.
We're not special about this, anyone -- and said I want you to produce this information,
this log of information, disclose these things.
Your immediate thought would be, well, just a minute, I need to check with my supervisor.
So this is the evolution Fred's talking about.
Walk us through what the roles are and the law for that person.
Because this could happen to any of you.
And frankly, it may have happened to you.
And I would not know.
>> So, the way in which the law here got changed was a very courageous judge, Victor Marrero,
decided a case involving a national security letter.
And he said that type of classification, that type of gag order,
that's what we all call it, is unconstitutional.
And he really threw down the gauntlet to Congress because he said I'm going to strike
down the entire statute unless you deal with this issue.
So he therefore said, I'm going to take up the entire process by which this type
of information surveillance takes place by compulsory authority over an employee.
So Congress changed the law then.
And for the past four years the law has been that you can consult legal counsel as well
as anybody necessary to present the information.
And as a result IU has changed its policy.
And I think most companies and universities have done the same.
Anybody served with a piece of paper saying you have to produce something,
now shares that piece of paper with a general counsel.
>> Right. And again, to put a really fine point on it, and then I'll invite comment
from our other panelists, if you were approached by a government agent
and they said -- Dave Jent's here.
Dave is the Associate Vice-President for Networks,
and said, Dave, I want to see these things.
So Dave may go well my guys do that, but I don't even have a password to login to those things.
So if Dave said I better go tell Brad about that, he is violating the law
and subject to penalty, is that correct?
>> That is correct.
>> Dave could go --
>> Unless you are necessary or unless you're counsel.
>> And no one -- they all know I don't have a password.
So Dave could go to one technician in GlobalNOC or HelpNet or the School
of Business wherever he would need to go and say here's a legal order.
We need to produce these things.
So only the minimum technical people required to produce the information are involved, and Dave could go directly
to general counsel, Jackie Simmons.
Jackie cannot call me, is that correct?
>> That is correct, unless you are necessary to provide the information.
>> Can Jackie tell President McRobbie?
>> It's not at all clear under the law that that would be permitted
because he is presumably at his level not necessary.
I doubt if he has the passwords to all of these accounts.
And he's not a lawyer, so he's not part of legal counsel.
So the way the law currently reads, and certainly the way it is interpreted
by the Obama Administration is you tell your lawyer.
Your lawyer can consult other lawyers but cannot go outside of the legal chain other than to people
who are necessary to produce the data.
>> Yeah, and I think this situation speaks to what Lee was talking about in terms of lack
of oversight, Raquel talking about checks and balances in this situation.
Lee, how did we get to a situation where we've got a law, I suspect many of you are incredulous
about this as I was when I really looked into it and understood the role
of the FISA courts more on these matters.
How did we get to this situation?
>> You get to the situation you want to be in if those
in official responsibility do the job they're supposed to do, which they haven't been doing.
The FISA courts have been extremely lenient.
The national security people have coopted the judges.
Now the judges are good people, they're good lawyers.
But they're not skilled in national security matters.
And when the government comes in and makes the argument, I've been there,
they just overwhelm the judges on national security grounds.
And the judge doesn't want to step into this.
He doesn't know much about it, or she doesn't know much about it.
So as you know, in almost every single case, they have approved the request.
Okay, what has to happen?
They have to really grill these lawyers from the government.
You have to get a counter point of view, which you have not had in the FISA courts.
>> That's right.
If I may interject.
So the FISA court today, there is a government advocate for why this should be done.
>> And no adversary.
>> And no adversary saying no, this is inappropriate.
>> So the courts have to toughen up and they have to look at alternative ways of doing this.
And they have to be alert to protecting privacy and civil liberties.
And they have to ask the questions that a Constitutional lawyer would ask,
how do these powers of government impact the privacy and civil liberties of the people?
So that's number one.
Number two is the Congress itself.
The Congress needs to be much tougher and give much more scrutiny to all of these powers,
which they basically have not done for 10 years.
I don't see how you get a better balance between liberty and security
without a much more robust role by the Congress of the United States in setting
out the parameters, in setting out some constraints.
This news that has captivated us all this week
about the Chancellor of Germany's phone being tapped.
Who gave the power to do that?
Well, we'll never find out.
I'm sure of that.
But that is a case where they have the power to do it, so they go ahead
and do it even though it violates every sense of decency in dealing with another country.
So the Congress has to step up.
The privacy and civil liberties board has to step up.
And all of these people have to try to assure that as the NSA and other government agencies try
to expand their power, become much more aggressive in exploiting that power, much more intrusive
into the lives of Americans, somebody is there saying hold on here.
Watch what you're doing.
What is the impact of this on the private citizen and on their liberties?
The administration will argue that we have the effective oversight.
And as Fred knows, he's working on one of the committees involved,
I'm willing to grant that the administration does its best,
or has done its best to try to give some oversight.
But in our constitutional system, that is not sufficient.
The Executive Branch has its internal oversight.
But we have a government of separation of powers.
And that means power has to be checked.
And it has to be checked by the Legislative Branch.
It has to be checked by the Judicial Branch.
And if it is not checked, then it becomes a secret power and a dangerous power.
>> And to this point, Congressman, if the notes are correct, I had the staff pull together a bit
of a timeline of some of these events.
In 2007, August 3, 2007, Congress passed the Protect America Act, which expressly extended some
of the Patriot Act to involve obtaining foreign intelligence information with the assistance
of communication service providers and others outside of the US.
So there was an expansion, but I suspect we had no expansion
of oversight commensurate with that at that time.
>> You have very vague statutes. Fred, you can support me on that, I think.
>> That's right.
>> And the NSA has just run wild with it.
And the courts have accepted the interpretation.
Even the chief sponsor of the legislation, Jim Sensenbrenner,
Republican from Wisconsin, very able lawyer.
I've known him for many years.
He drafted the statute.
He says that the NSA has run wild with it and the courts themselves have said
that they have been misled in times past.
So yes, indeed, we have to crack down here on these powers.
>> So let's pivot to the personal privacy aspect.
And again, I'm going to be very local about this.
Universities and academic institutions, we've long had a history of openness and respect of members
of the community and some boundaries on things and policies and such.
And I'll lead with you Raquel, what are our reasonable expectations
of privacy within the university today?
>> Actually, before I address that question, I want to comment on the question that you asked
to Lee and to Fred regarding, you know, the abuse of the power to collect information, and going
to an individual and requiring that they give the information without their bosses, or anyone
in the reporting structure, knowing it.
While I'm listening to this, I'm thinking as a computer scientist, like okay,
this is a problem that needs to be solved because it's going to take Congress
and the courts a while to catch up and to address this.
>> That seems a fair assessment.
>> Yes. And so one of the things I'm thinking about is, okay, how do we specify policies
that everybody that is in that reporting chain can be aware of what's going on?
And so that the person who has the access doesn't feel isolated.
Okay, one way to do that is specify a policy that says that if anyone needs this level of access
to this amount of information this can only be authorized or accessed by multiple individuals.
>> Some checks and balances.
>> Some checks and balances and to enforce that we can use something like threshold cryptography
where I need k out of some number n people in order to give me part
of the credentials that's needed to give you access to this information.
So in the meantime, while we're waiting on Congress and while we're waiting on the courts
to come up to speed, we can at least specify and enforce policies that make everyone more aware.
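[Editor's note: the k-out-of-n enforcement Raquel describes is typically built on Shamir's secret sharing, in which a secret is the constant term of a random polynomial and any k points recover it. The sketch below is illustrative only; the function names and parameters are not from the talk.]

```python
import random

# A Mersenne prime large enough for demonstration-sized secrets.
PRIME = 2**127 - 1

def _eval_poly(coeffs, x, prime=PRIME):
    """Evaluate a polynomial (coeffs = a_0..a_{k-1}) at x via Horner's rule, mod prime."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % prime
    return acc

def split_secret(secret, n, k, prime=PRIME):
    """Split `secret` into n shares so that any k of them can reconstruct it.

    The secret is the constant term of a random degree-(k-1) polynomial;
    each share is a point (x, p(x)) on that polynomial.
    """
    coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
    return [(x, _eval_poly(coeffs, x, prime)) for x in range(1, n + 1)]

def recover_secret(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        # pow(den, -1, prime) is the modular inverse (Python 3.8+).
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret
```

With fewer than k shares the interpolation yields an essentially random value, so no single administrator -- and no single NSA request served on a single administrator -- can unlock the credential alone.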
Now, back to the question that you just --
>> Yeah, the reasonable expectation to privacy in a university community.
>> In a university community.
What's the reasonable expectation of privacy?
And I -- am I speaking as an employee of a university?
[Laughter]
>> Yeah, you're speaking as a member of the academic community
at Harvard and at Indiana University.
>> You know, within an academic setting, I think I see the resources that are being provided to us,
you know, within the academic setting as something that will allow us
to do the work that, you know, we are there to do.
So I totally separate, you know, my personal and private life from,
you know, what I'm doing at the university.
>> So, should I be able to give your dean your times of login and logout
so he'll know if you're really working?
>> [Laughter]
>> Should I be able to read your email because I'm not really sure about you?
>> Well, you know what?
I think now that is a violation.
You know, if I'm -- if there is for some reason that you think that I'm doing something
where I'm exploiting things, if there is evidence that you had that I need to be investigated
for some reason because I'm, you know, maybe I'm selling university secrets
or I'm violating some copyright law, created some software and I'm trying
to do something that violates that copyright.
There's some reason for you to investigate me.
But just on a regular basis, you know, with regard to, you know, my being watched --
if there's that lack of trust in my ability, then I think that maybe I should be dismissed.
You know, if there's that lack of trust that would cause you to have me
under surveillance constantly -- no one wants that;
you can't work and thrive and create in a watched state.
>> So it's an interesting situation we find ourselves in here.
Speaking just, you know, crassly by the law.
If you are an employee of General Motors, you're using General Motors' systems.
General Motors owns those systems.
And you know, what's your expectation of privacy as an employee
of General Motors using their systems to conduct email?
Zero. Zero, probably, in the end, in reality.
Universities, we think about that a bit differently.
So we have an entire regimen of policy.
Mark could probably quote the policy number if somebody believes that we need to go look
at Fred's email or your email or something,
there are a number of trigger events that have to happen.
Counsel would be involved before we would open that door.
But what's your expectation of privacy if you use Gmail?
Or Microsoft 365?
>> Right.
>> Lee.
>> I want to comment on your observation about the culture of openness
at the university, which I think is correct.
And I'm very pleased that you have that culture.
But it is very, very contrary to the culture of the NSA.
>> Yes.
>> And to the national security community.
Now these are good people.
They're patriotic people.
They're able people.
But their set of mind is I know the threats to the United States Government.
I know the tools that we have to deal with those threats.
We have to do this in secret.
We cannot let it out into the public.
And we're not going to share secrets.
Trust us. We can protect your national security.
Now that argument is not totally bad.
There are secrets a government must keep.
>> And to some extent they are protecting us.
Let's not miss that point as well.
>> Is there any doubt at all that this enormous metadata vacuuming
of information is an important tool in fighting threats to the United States?
That's why I said I wouldn't abolish the program as some people will advocate.
It's not going to happen, but they'll advocate it.
But the culture of the university and the culture
of the national security community are totally different on this question.
And they really -- they wouldn't put it this bluntly.
I will. They really don't care about your privacy and your civil liberties.
They have an overriding responsibility and that's the protection of the American people.
That's a very important responsibility.
And by golly, they're going to do that job.
And if it means that they're collecting all sorts of data on Fred
and Raquel here, and on you Brad, tough.
We need it for the national security of the United States.
That's a clash.
And it's a very, very, difficult one to resolve.
What I am saying, basically, is that over the past 10 years we have not gotten the balance
between security and privacy right.
We've got to continue working on that.
And we've got to recognize there are merits on both sides here that we have
to do a much better job in getting the balance right.
And from my point of view at least, that means a little more respect for the privacy
and the civil liberties of the people and a little tougher attitude
on why do you need this information?
And is there a way you can get that information without being so intrusive
into the lives of people and institutions?
>> So I heard a quote on Sunday morning on one of the talk shows.
And I didn't have time to chase down its proper attribution.
But the quote went something like never before have our lives been so transparent
to our government and at a time when our government has become increasingly opaque to us.
I thought that was an interesting comment and it is a very reasonable debate
about these matters that were going on.
Fred, Lee speaks to this big vacuuming up of information.
We know this massive data center is being built in Utah or is already online, I believe.
Could you speak to some of the things that are being vacuumed up?
>> Yeah, happily, although with great regret because if you think about almost every aspect
of our daily lives is now captured in some digital format.
And so we carry a smartphone with us that's broadcasting our location.
We use keycards to swipe in and out of places.
We record all of our information digitally.
And all of that information is being captured someplace.
And therefore when the government comes in and says we want to get it,
they can get the granular detail, every purchase, everything you've looked at online,
every place you've driven in your car, every place you've taken your cell phone,
every email you've sent, every conversation you've had that's been captured in digital format.
In all of these settings we're giving up that type of data,
that type of granular information and so --
>> And now I see where we're giving up even more.
>> You don't mind if I record this for the big vacuum cleaner, do you?
>> If you're going to -- I feel like as long
as the playing field is level, we can go ahead and do this.
>> Oh, so as long as we're mutually armed, it's okay.
>> I think that's right.
It may be mutual assured destruction --
>> Okay. Lee, you're at a disadvantage here.
>> -- but at least we'll both be able to do it.
>> I think we're both at a disadvantage.
>> The key challenge here, I think -- and I think the comments we've already touched
on have gotten there -- is that the data are going to be there.
We're fighting an impossible battle if we say I don't want the data.
We love the technologies.
We love the convenience and efficiency.
And there are many places in which we love the data.
You know, I'm leaving for London.
I got an email this morning from the hotel saying we know you're coming.
We want to make sure we have your room ready.
What time are you checking in?
And that's because they linked all my data to know how many times I've been there before.
So what we need, and what in the university we have, are better rules that limit the use
of the data, that say it's understandable
that my data will be private as in the case of the university.
But if I'm out sick and somebody needs access to my teaching records, they're going to be able
to get it because there's a process that lets them get it.
It will involve other people.
It will involve oversight.
And you'll be able to determine who else had access.
That's what we don't have at the NSA.
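The rule-bound access Fred describes -- a legitimate process, involving other people, with oversight, and a record of who looked -- can be sketched in a few lines. This is a minimal illustration of the idea, not any real university system; the class, names, and record IDs are hypothetical:

```python
from datetime import datetime, timezone

class AuditedStore:
    """Hypothetical record store: every read needs an independent approver,
    and every read is logged so the owner can see who had access."""

    def __init__(self):
        self._records = {}
        self.audit_log = []  # (timestamp, requester, approver, record_id)

    def put(self, record_id, data):
        self._records[record_id] = data

    def get(self, record_id, requester, approver):
        # The process requires a second person -- no self-approval.
        if approver is None or approver == requester:
            raise PermissionError("access requires independent approval")
        self.audit_log.append(
            (datetime.now(timezone.utc), requester, approver, record_id)
        )
        return self._records[record_id]

store = AuditedStore()
store.put("teaching-records/lee", "syllabus, grades")

# A colleague covering a class gets access through the process...
data = store.get("teaching-records/lee", requester="fred", approver="dean")

# ...and the owner can later determine who else had access.
assert [(r, a) for _, r, a, _ in store.audit_log] == [("fred", "dean")]
```

The point of the sketch is that the oversight lives in the code path itself: there is no way to read a record without leaving an audit entry.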
>> So some form of technical safeguards with policy safeguards behind them.
And a bit to Raquel's point, how would we manage those technical safeguards in ways
that don't just fall prey to another part of Lee's question: who's going
to watch the watchers?
So you know, we spent a lot of time in the spring and going into the fall,
many of you saw The New York Times was hacked, Bloomberg,
the Washington Post and The Wall Street Journal.
Four major cases of hacking all became public in about the January, February timeframe.
Mandiant, which is a security firm, did work for each of them.
And it published what it called Advanced Persistent Threat 1.
It's a detailed report.
It's public.
You can find it if you just Google Mandiant and APT1.
And the essential conclusion of this -- I mean, it's detailed down to the IP number, dates, times,
server logs and such -- its conclusion, whether you believe it or not,
is that much of what is going on, perhaps not directly everything
with The New York Times and such, is state-engineered cyberwar.
State-engineered activity.
This is not, you know, some rogue guy in some country trying
to crack your bank account, though some of that may be going on.
It is activity meant to penetrate utility systems that may manage our water and things like that.
Or servers here at Indiana University.
And it expressly documented, as did The New York Times,
the compromises of security at universities.
Because of our tradition of openness, we're often used as a staging ground to then go
and attack the assets that they were really after.
So in terms of law, policy and cybersecurity itself, this time is different.
The scale of it -- we've always been concerned about security and securing systems and all.
Raquel is there anything on the horizon that you see
that may make our technological abilities to deal with these risks better?
Or are you a bit pessimistic?
Because as I read the other day, the computational cracking of the RSA algorithms
and such is, you know, moments away.
>> See, and that goes back to what I was saying about introducing vulnerabilities
into the systems and our ability to protect them.
I think that we definitely need to spend more time on looking
at the stronger protocols and creating stronger protocols.
But another thing that we should do is, you know, have heterogeneous systems.
The systems shouldn't all be the same.
We should not always be using the same components, because if there's a vulnerability
in one component, it will not exist in the other if you are using different software
or running a different operating system.
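A toy way to check for the monoculture Raquel warns against: given an inventory of redundant nodes, flag any (OS, software) stack that every node shares, since one vulnerability there would hit all of them at once. The inventory format and hostnames here are invented for illustration:

```python
from collections import Counter

def shared_stacks(fleet):
    """Return (os, software) pairs used by EVERY node in the fleet --
    a single vulnerability in such a stack takes down all nodes at once."""
    counts = Counter((os_, software) for _, os_, software in fleet)
    return sorted(stack for stack, n in counts.items() if n == len(fleet))

monoculture = [("web1", "Linux", "Apache"), ("web2", "Linux", "Apache")]
diversified = [("web1", "Linux", "Apache"), ("web2", "BSD", "nginx")]

assert shared_stacks(monoculture) == [("Linux", "Apache")]
assert shared_stacks(diversified) == []
```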
>> But if I use Facebook for this thing and LinkedIn for something else and Google Plus
for this thing, but I integrate them and link them all together.
>> Oh, my goodness.
Don't do that.
Well, the thing is, you're talking about applications that you want to use, okay?
Like social media applications.
I think that, and I'll get on my soapbox just for a little bit here.
We share too much.
We over share.
And although we're concerned about our privacy, we're still sharing.
Although we say that the mechanisms are not in place to restrict access to the data
that we're sharing, we're still sharing.
Although we have high privacy concerns and we're very aware of the risk,
we still share because we're kind of addicted to the technology.
And it's providing us with something that we want to use.
It's just like getting a security warning constantly about the certificate
for this Website cannot be verified.
But because it has some information that we want, we do a click through.
And so, and you know, that becomes the issue.
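The certificate click-through Raquel mentions has an exact programmatic analogue. In Python's standard library, the default TLS context verifies the server's certificate chain and hostname; "clicking through" the warning amounts to switching those checks off, after which any server, including an impostor, is silently accepted:

```python
import ssl

# The default context is the "heed the warning" setting:
# it verifies the certificate chain and that it matches the hostname.
strict = ssl.create_default_context()
assert strict.verify_mode == ssl.CERT_REQUIRED
assert strict.check_hostname is True

# "Clicking through" is the equivalent of this -- any certificate,
# presented by anyone, is now accepted without complaint.
lax = ssl.create_default_context()
lax.check_hostname = False          # must be disabled before CERT_NONE
lax.verify_mode = ssl.CERT_NONE
assert lax.verify_mode == ssl.CERT_NONE
```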
So when you get into social media and you start telling me about linking, you know, don't link.
I'm just working on --
>> I'm not on Facebook.
>> Lee.
>> Brad, the vulnerability that you mentioned are real.
Our electrical systems, our communication systems, our financial systems are all very vulnerable.
Now the Congress began dealing several years ago with a cybersecurity bill.
They haven't been able to agree on it.
They've been able to agree on very few things and they didn't agree on that.
So the President puts out an Executive Order.
And people criticize the President's Executive Order, usurping power and so forth.
But he had to act, I think, in the national security interest.
But the sharing point is important here because a lot
of our vulnerability is not government owned but is privately owned.
The government has been asking the private sector to share information more
as to their vulnerabilities and as to their systems.
The private sector has been saying we don't want to share that information.
Not in every case but often, because we're protecting our proprietary interests.
And so without the sharing of information you cannot resolve these vulnerabilities.
>> Yes.
>> NSA, I've been critical of NSA, but NSA has remarkable technological knowhow.
I think superior probably to anyplace else in the country.
And they can help all kinds of institutions including universities
to protect themselves against vulnerabilities.
But it does require the sharing of information.
And that's not the easiest thing to get people to do for all kinds
of valid reasons, Raquel, in this country.
But the vulnerability is real.
These people in Washington that are very national security conscious are not dreaming things up.
>> No.
>> There are genuine threats to the security of the United States.
>> Well, as the Mandiant report documented.
And I think the most telling thing about all the documentation in it is
that those who work in our security organization have known these things for a long time.
That this level of detail was made generally public tells you where they're at,
and what they know that is not
yet being made public along these lines.
One of the Tweets to the large vacuum cleaner in the sky asked the panelists,
have any of you worked in the defense industry or have top-secret clearance?
>> Yes.
>> Fred is a yes.
>> Have I worked?
>> Yeah. Worked in defense industry or held top-secret clearance.
>> Or -- I'm sorry, or?
>> Top-secret clearance.
>> Oh, yes.
I did.
>> Yes. And in Lee's capacity, certainly.
He has done so, yeah.
>> But now he's going to have to kill you, so we're sorry about that.
[ Laughter ]
>> After what I've said this morning, they'll consider me a security risk.
>> One of our audience asked, what does the panel think of NSA's methods to break
or to circumvent encryption standards?
Raquel?
>> You know, I think this is -- you know, this goes back to the question that you raised
to the audience about do I want to give up a little bit of my privacy to be more secure?
And I think the security has been, you know,
kind of limited to the threat of a physical attack on the US.
But that conversation is not being had about cyberwar
and about how devastating that can be.
If you look at Richard Clarke's book Cyberwar, he talks about if you're going to strike --
if you're going to be the first to strike in a cyberwar, you're going to be the one to win.
Because if they shut down all of our communications systems,
which every aspect of our life depends on, there's no way to attack them in a cyber way.
We have to go physical at that particular point.
And I don't think that we -- because this, you know, this cyberwar is not something
that we have felt, you know, before.
We haven't had our power grid like totally taken down, you know, across the US.
But that's what can happen.
We will feel it all across, you know, this country.
So I think there needs to be that conversation about what is the impact of actually a cyberwar?
We know what happens in a, you know, when there's a physical attack.
But with regards to the weakening of protocols to allow us to gain access to more information,
is that making us more secure when we look at the whole expanse of security --
not just the physical security of a bomb or a threat or an airplane going
into a building, but just our everyday livelihood?
You know, our food system, our water system, our power system, our banking system.
If you were to go to the bank and you can't get access to your money, what's going to happen?
You know? So I think that we need to begin to talk about, you know, these kind of things
because although I'm giving up a little bit of my privacy, does that actually make me more secure?
I know that they have prevented, you know, some -- they say about 50 threats.
But what happens when there's a cyber-attack and we feel it all across this country?
>> Raquel, I don't think there's any encryption the NSA cannot beat.
>> Yeah.
>> Now, that's my impression.
>> And I --
>> They may not be able to do it instantaneously.
It may take them a while, but they'll crack it.
>> But it's taking them less time every day.
>> And if the NSA can crack any encryption, someone else will be able to do it also.
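Whether any agency can beat any encryption is beyond this transcript, but simple arithmetic shows why practical attacks go after protocols and implementations rather than raw key search. Assuming a hypothetical, very generous rig testing a trillion keys per second, exhausting a 128-bit keyspace still takes an absurd number of years:

```python
keys = 2 ** 128                  # AES-128 keyspace
rate = 10 ** 12                  # hypothetical: one trillion keys per second
seconds_per_year = 60 * 60 * 24 * 365

years = keys / rate / seconds_per_year
# Even at that rate, a full search needs more than a billion billion years,
# which is why "weakening the protocol" is the attack that actually matters.
assert years > 10 ** 18
```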
>> Anyone who's been a victim of identity theft -- professional, criminal, organized identity theft --
you know what a disruption to your life it is to go restart with your bank.
You see credit problems and credit reporting bureau matters for a long time,
or find that someone has fraudulently created a driver's license for you in some other state.
And these things continue.
Encryption and cryptography really are the basics of what we trust going across a network.
And when we can no longer trust them, what happens next, Fred?
>> Well, I think first of all, you've highlighted a problem we've got with the NSA.
Because the NSA has these twin missions.
One is to secure our government's infrastructure.
The other is to penetrate other institutions' infrastructures.
And what we've always worried about is that they would use information about known vulnerabilities
and rather than fix them or disclose them, they would keep them so that they could use them
for their other mission to penetrate others.
What we didn't know is that they would be creating vulnerabilities of their own
so that they could use those to penetrate others.
So one issue here might be splitting out that mission
so that we actually have a federal agency thinking about cyber-security,
not using cyber vulnerabilities for other important missions.
I think a second point which has been highlighted across the panel is although the threats are
in many ways technological or exploit technologies, they are fundamentally human.
And if you look at every single successful institutional level threat,
it has involved a human vulnerability.
There is not a single purely brute force attack that has exfiltrated significant valuable data.
There is always, and it's usually a senior person in the hierarchy
because you don't go after a junior person.
They don't have access credentials you want.
But you get the president or you get the vice-president or you get a senior person to fall
for a phishing message, to otherwise compromise their system.
So again, we're going to need to focus with law, with economics, with training,
with other tools on that behavioral component or else we've lost this war already.
>> That's absolutely right.
It is a multi-pronged attack.
And just a few weeks ago the chairman or president of Mandiant received a phishing message.
They had cracked the limo service that he usually uses to take him
to the airport and back.
And the habit was that after one of his rides he would receive a PDF file
of his invoice and his bill.
And this has worked this way for years.
And then he realized that he got a PDF one day when he hadn't taken a limo ride.
And it all looked like it came from the right company.
He had his people look.
The PDF actually had a payload of, you know, a virus and such in it.
So it is just such an escalating game.
We had an incident.
I won't go in detail, but across multiple higher ed institutions
where there was some spear phishing, particularly targeted at a very common type
of administrative system at universities that played out very quietly over a period of months
until there was actually monetary harm.
So to your point about how we share, I think all of you know Indiana University is the operator
for the Research and Education Network Information Sharing and Analysis Center, the RENISAC.
And that's one of those things within that trusted community we were able to get the word
out to other universities to watch for these particular kinds of things happening.
But engineering the human behavior so that some professor of law, for example,
doesn't just open a message and do something, that's one of our tough challenges.
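Engineering around that human behavior often comes down to simple automated checks layered in front of the person. One classic heuristic -- flag a link whose visible text names one host while the underlying href points somewhere else -- can be sketched like this (the domains are invented, and a real filter would combine many such signals):

```python
from urllib.parse import urlparse

def mismatched_link(display_text: str, href: str) -> bool:
    """Flag links whose visible text names one host but whose target
    is a different host -- a common spear-phishing trick."""
    shown = urlparse("//" + display_text).hostname
    actual = urlparse(href).hostname
    if shown is None or actual is None:
        return False
    # Allow exact matches and subdomains of the shown host; flag the rest.
    return actual != shown and not actual.endswith("." + shown)

# Looks like the limo company, actually points at an attacker's server.
assert mismatched_link("invoices.limo.example", "http://evil.example/invoice.pdf")
# Text and target agree: nothing to flag.
assert not mismatched_link("invoices.limo.example", "https://invoices.limo.example/r.pdf")
```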
We are in our closing moments here.
I'll ask each panelist.
This is a group of information technology professionals by and large.
Can you give them 30 seconds?
What's your parting advice to them?
And I'll start with Fred and we'll come back.
>> Well, I would say, particularly ending on the security point,
the key thing is to remember we are in a -- it is a process.
It is an evolutionary process.
And so we will never draw a line and say we've done it.
We've taken care of everything.
And it's keeping in mind that constant sense of evolution of threats,
evolution of tools, evolution of responses.
That's going to be the only way we stay up, much less ever get ahead.
>> Raquel?
>> One thing that I would say in my closing remarks is just to be vigilant with regards
to staying on top of the new advances.
What are the new threats?
And think of ways you can diversify your infrastructure
so that you're not just dependent upon, you know, one system or one particular software stack,
one operating system, one application.
>> One cloud company.
>> Yeah, one cloud company.
But diversification of infrastructure, I think, is very important.
>> Congressman?
>> Well, I think everybody here has their own sphere of influence.
And I think it's -- and you're highly respected in what you do.
You're very knowledgeable about these things we've been talking about.
And I think from you the message has to be that we need
to get a better balance between security and privacy.
And in order to get that better balance, we need more oversight by the courts and by the Congress.
We need more transparency.
We need more public debate.
We need more information.
We need modest restraints or constraints on the programs we have.
And I hope that's the message that you can convey in your community.
I think we have a real threat to the way of life, if you will, we have in this country and have had
for hundreds of years because of this expansion of government power.
And I think we have to, not abolish it, but constrain it.
Keep it within Constitutional bounds and get that balance correct.
>> And it is our actions as citizens, as members of the academic community that shape
and reshape those values over time so they can be inherited by the next generation.
I will note in the Tweet stream, it was only the blue lanyards
who Tweeted to the vacuum cleaner in the sky.
The yellows chose to maintain their privacy.
[ Laughter ]
Thank you for joining us for this morning panel.
We will take a 30 minute break.
We have refreshments in the foyer.
Our sponsors are there.
Please come back at 10:30.
We have an extraordinary session teed up with Aneesh Chopra, and it will be great.
I will see you at 10:30.
Thank you.
Thank you to our panelists.