The cloud is certainly something new in the world.
But I think it has precedents.
Companies have always used technology.
They've always had to make decisions about what
technology to use and how to get access to it, and the
criteria they use in making those decisions have always
been very similar.
Because after all, companies are looking to maximize their
competitive advantage, maximize their profits,
maximize their flexibility.
So there are many things that don't change even as
technology changes, and changes dramatically.
So I think the best way to get an understanding of the
business implications of the cloud and where the cloud is
going in commercial terms is to look backwards in time.
And in particular, I'd like to go all the way back to about
1850, when a gentleman named Henry Burden built this
magnificent machine that you see up on the
screen right now.
Burden was a Scottish immigrant
to the United States.
He took over ownership of an iron working plant in the
industrial city of Troy, New York.
And what he knew was what all factory owners, all
manufacturers, knew at the time, which was that if you
were going to be in business, you also had to be in the
business of manufacturing power.
There was no way to keep your machines running, to keep your
employees employed in their work, if you didn't generate
all the power needed to run those machines.
But what Henry Burden knew, I think better than many of his competitors, was that if he could do a better job of generating power, making it more efficient, more flexible, and so forth, he could get a big competitive advantage over other iron working plants.
So he went out and built what was at the time the largest,
most powerful water wheel in the world.
And lo and behold, it did give him a
big competitive advantage.
His iron working plant became the main supplier of
horseshoes to the Union army in America's Civil War.
It became one of the main suppliers of spikes for
railroads as they built their lines out across the country.
But what I think is more interesting about this story
is what you would have seen if you went back to this same
site about 70 years later in 1920.
What you would have found is what you see here.
This great engineering achievement, really one of the most important engineering achievements of the century in terms of industrialization, had been allowed to collapse.
And here you see it rusting away in that field.
And you have to ask yourself, what happened?
Did the Burden iron works go out of business?
Not at all.
It was still a thriving company, a thriving
manufacturer.
What happened is that a fundamental assumption about power generation, which at the time was the most important technology to business in the world, had changed thanks to a brand new model of supplying power.
A model pioneered by many people, one of them George Westinghouse, who built the first big centralized alternating current plant, which you see here, building on Nikola Tesla's work.
All of a sudden, a fundamental assumption about business that had been in place literally for hundreds of years--
that if you were going to be in business, you had to go out
and buy or build your own power generating station; you
had to hire people to maintain it; you had to sink a lot of
capital into it--
had gone away, because we had a new, more efficient, more
flexible, better model of supplying a resource that all
companies relied on.
And if you put yourself into the shoes of a factory owner
around 1900, the shift must have felt like an incredibly
radical, incredibly dangerous, and
incredibly risky leap forward.
Because at the time, power generation, that kind of
function, that kind of department, wasn't just about
building a big wheel beside your factory.
What you see here is a British manufacturing plant from around 1900 that made maritime products.
And if you look, the whole factory, from above the machines down to the machines themselves, is covered in the apparatus for distributing power, generated on site by a local proprietary electric generator.
So you have all these levers and pulleys and chains going
all through the factory.
This is kind of the ERP system of the day.
You know, the big SAP or Oracle installation that kind
of took over the entire plant.
I don't mean to be funny with that analogy.
It's actually very close to the truth.
You had to put a whole lot of assets inside your own plant.
You had to, as I said, hire a big staff to
keep everything running.
You had to update it.
You had to troubleshoot it.
You had to worry about breakdowns.
And yet this was the only possible way to get this very
important technology into your business until the centralized
utility system came along.
And what we saw, in very short order, was that factory owners and factory managers did make that leap of faith.
They abandoned the old model, closed down their proprietary generating plants, dismantled all that gear and apparatus, and in just the course of about 20 years, as you see here, from 1910 to 1930, we had a massive shift in the nature of the supply of power.
In 1910, centralized utilities produced only about 40% of the electricity used throughout the United States.
And these are US figures, but I think the picture is very similar for the UK and for Europe in general.
Almost all of that 40% came from small-scale, direct-current plants of the kind Thomas Edison built, used mainly to supply lighting to houses and offices in cities.
The other 60% was coming from private generating stations, and almost all factory power was coming from those private stations.
Just 20 years later, utility supply was up to about 80%.
Those private plants had been reduced to only 20% of the
electricity produced.
And that trend would continue until it's about 5% private,
95% utility, where it has stayed until today.
So we had this enormous revolution, not just in technology, but in business thinking, in business assumptions about how this central resource, this key resource, would be supplied, a resource that, if you lost access to it, would put you out of business very quickly.
It's striking how quickly things changed, and how quickly a new model, cheaper, more flexible, more efficient, took over.
With cloud computing, I think the same thing is now beginning to happen to information technology, which
arguably is as important to businesses today as power was
to our industrial forebears 100 years ago.
And that's not to say that information technology, or computing, and electricity are similar as technologies.
Obviously they're very, very different.
But they are similar as business resources.
They're both general purpose technologies, kind of
platforms that you install in your business, and then build
all sorts of applications on top of.
And what we know from history, not just from electric power but from all sorts of utilities and all sorts of centrally supplied technologies, is that any time you can find a way to centralize the supply of a general purpose technology and allow it to be shared, rather than forcing every company to sink its own capital and assets into separate generating plants or their equivalents, you can save enormous amounts of money and free up enormous capital that companies can invest in their main business, their core business, in innovation, and so forth.
And if you look back through the history of IT in business, I think you see a story similar to the one we saw with power generation.
From the very first time that companies began to automate
data processing with punch card tabulators back around
1900, the assumption was you had to run your own machinery
inside your own company, license all your own software,
and that was certainly the model that prevailed as digital computers began to come into businesses around the 1950s and 1960s.
And in fact, in some ways, the mainframe era was sort of a Golden Age of IT, particularly if you look at it purely from an efficiency standpoint.
All the complexity of computing was in essence
hidden inside of the mainframe computer.
So you had very clean data centers, very clean people
running the computers.
Very efficient model.
Capacity utilization was usually up around 80% or 90%,
but it had, of course, a big drawback.
And that was that it was a very impersonal model of computing.
So you had these machines, but you could only really use them
for high level, very corporate types of processes, types of
data processing chores.
Individual workers, individual managers, couldn't put the
power of this computer to use in their day-to-day work, in
their day-to-day decision making because it was isolated
in essence from them and from the business decisions they
had to make.
So jump ahead 20 to 30 years, introduce the PC, introduce a
whole new model of computing, client server systems, and all
that complexity that had been hidden inside the mainframe
kind of explodes outward.
And suddenly you have a picture more like
this in data centers.
Incredible complexity, incredible cost, all of which
goes along with the extended use of computing for more and
more applications, more and more purposes.
So what we have by 1990 is kind of the mirror image of
the old mainframe era.
On the one hand, computing has become very personal.
Every office employee more or less has a computer
on his or her desk.
You can use it in many different ways to support
whatever that person happens to be doing, whatever that
person's job is.
On the other hand, we have enormous inefficiency coming into corporate computing.
So the high levels of capacity utilization of the mainframe
disappear, and disappear very quickly.
Because every company still has to build and maintain its own data processing plant, so to speak, you have high levels of inefficiency built into the system: everybody is replicating very similar investments in servers, in storage gear, even in many of the mainstream business applications they have to license.
Moreover, they have to dedicate, in many cases,
hardware to particular software applications, and
they have to account for peaks in demand.
So basically, they're building huge amounts of overcapacity into their systems, which is more or less where we are today.
And if you look at any kind of measure of the efficiency of
IT today, you see evidence of this lack of productivity,
this lack of productive use of capital assets.
Take server capacity: there was a recent study of corporate data centers, by HP Labs I think, and it found that the average corporate server runs at maybe 20% of its potential capacity.
So exactly the opposite of what the old mainframes ran at: they ran at 80% capacity, while these run at only 20%, with about 80% going to waste.
It's a similar picture in network storage: maybe 35% put to productive use, 65% going to waste.
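Just to make that arithmetic concrete, here's a minimal sketch; the 20% and 35% utilization figures are the ones I just cited, while the fleet sizes and the peak-versus-average numbers are purely illustrative assumptions:

```python
# Illustrative sketch of the utilization arithmetic above. The 20%
# (servers) and 35% (storage) utilization figures are the ones cited
# in the talk; everything else is an assumed example, not real data.

def wasted_fraction(utilization: float) -> float:
    """Fraction of provisioned capacity sitting idle."""
    return 1.0 - utilization

for resource, utilization in [("servers", 0.20), ("network storage", 0.35)]:
    print(f"{resource}: {utilization:.0%} used, {wasted_fraction(utilization):.0%} wasted")

# One reason utilization ends up so low: capacity is sized for peak
# demand. If a workload averages 200 requests/sec but peaks at 1,000,
# hardware sized for the peak averages only 20% utilization.
average_load, peak_load = 200, 1_000
print(f"average/peak provisioning -> {average_load / peak_load:.0%} average utilization")
```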
And the biggest source of inefficiency today in private, proprietary IT comes in the labor costs necessary to keep the plant running.
As you know, IT labor costs are now the biggest single cost in IT, often equal to hardware and software costs combined.
And because every company still has to run what have
become extremely complex systems, they have to hire
lots of people.
These systems tend to be very manual in their upkeep, at
least until recently.
And what you have, according to most studies, is about 70% or so of labor costs going to routine types of maintenance activities.
In other words, the same kind of activities that all your
competitors are doing, nobody's getting any
competitive advantage or distinction out of --
unless they do them very poorly, in which case they get penalized heavily--
and yet these are costs that have to be assumed by
companies simply because there's been no other way but
to run all the stuff yourself.
I think finally now, we're coming to a new era of IT, and
what it's going to do, as I said before, is begin to free
up some of that capital that every company, and all
companies across entire economies, are
putting into IT.
If you look above the individual company level, for instance, you see a similar picture of, I think, wasted capital.
This chart shows the percentage of the average
company's capital equipment budget that goes to IT.
Back in the mainframe age of the sixties and seventies, it was less than 10%; around 5% of your capital equipment budget was going into IT.
As soon as you had the introduction of the PC and the
explosion of applications, that skyrocketed up to about
45% to 50% by the year 2000, which is the last year for which we have hard figures.
And on the one hand, there's nothing wrong with this.
I mean, this trend testifies eloquently to the new uses and
new applications of IT that have been made possible by the
introduction of the PC and new models of applications.
On the other hand, if you think about it, if there were a way to take what every company is investing in equipment, in labor, and in licenses without differentiating itself from its competitors--
it's just a cost of doing business--
and you could figure out a way to share those assets, you could drive down the total amount of capital that's locked up in the IT plant and release it for much more productive uses.
And that's really the promise of cloud computing: a new model of computing that is still in its early stages, of course, but that is going to take over more and more of corporate IT in the years ahead.
This is a picture of one of Google's big data processing
plants, big data centers, this one in Oregon.
And I'm not using the picture of Google's plant to suck up
to our host, but just because I
particularly like this photo.
It kind of shows where the term cloud computing comes from: the steam rising off the cooling towers by the banks of the Columbia River in Oregon.
But whether it's Google or any other company building these huge, very sophisticated cloud computing plants, this really represents a new era in computing, a new industrialization and centralization, in much the same way that George Westinghouse's centralized AC power production plant did.
And this is going to allow many forms of IT that companies use to be supplied centrally, and to be supplied in a much more efficient, and I think in the end much more reliable and even more secure, manner than we've been used to with our fragmented current model of IT supply.
If you look at statistics about the acceptance of cloud computing, at least as an idea, you can see a real shift in just the last year or two.
My book The Big Switch came out at the beginning of 2008, and when I first started talking about it, companies' main response was: cloud computing? We'd never even think about that.
We're not going to put our data in somebody else's data center.
But we've already seen a revolution in attitudes about cloud computing in just the last two years.
So there was a study of international companies done
this past summer, and it found that 52% said they were
already using cloud computing.
Most of the others said that they were either in the
process of adopting it, or they were in
discussions about it.
And there are reasons to be suspicious about this data.
Another thing we've seen is that the definition of cloud
computing has kind of broadened out, so it includes
all sorts of things like web conferencing and stuff.
So I wouldn't say this indicates that big companies
are adopting cloud computing rapidly to replace their
internal data centers.
But what it does show is that the cloud has become accepted
as an option, as a new choice, for companies.
And I think that the use of it will only grow
in the years ahead.
Now, we can pause and ask: why is this happening today?
We've heard about cloud computing for many years.
It used to be called other things.
Utility computing goes back to the 60s, the mainframe age, when there were books published about utility computing and how all information technology was going to be supplied from centralized plants.
We had time-sharing systems, kind of a prototype of the cloud, but they never really took off.
And we've seen promises about cloud computing every decade
since then.
I think, though, that where we are today is a very different place, where, because of technological advances, this model is suddenly not only possible but, in more and more cases, makes tremendous sense for companies to begin shifting to.
A good way to explain what's happened in recent years is
through two laws.
One of them is the famous Moore's Law, which I'm sure you're all familiar with: the power of computing at a particular price doubles about every 18 months or so.
That's the red line in this chart.
That's been playing out ever since Gordon Moore announced his law in the 1960s, though it wasn't yet a law when he announced it, and it explains a lot about the explosion of applications of IT, both at home and in business.
Because whenever you take a general purpose technology and
you reduce its cost, you inevitably increase the uses
to which it's put.
It becomes economical to do new things with it.
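As a quick back-of-the-envelope, here is what that 18-month doubling implies over time; the doubling period is the figure just cited, and the code is only an illustration of the compounding:

```python
# Back-of-the-envelope for Moore's Law as stated above: computing power
# at a fixed price doubles roughly every 18 months (1.5 years), so the
# multiplier after t years is 2 ** (t / 1.5).

def moore_multiplier(years: float, doubling_years: float = 1.5) -> float:
    """Computing power per dollar relative to year zero."""
    return 2 ** (years / doubling_years)

for years in (3, 9, 15, 30):
    print(f"after {years:2d} years: ~{moore_multiplier(years):,.0f}x computing per dollar")
```

Over 30 years that compounds to roughly a million-fold, which is why the set of economical uses keeps expanding.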
And more recently, what Moore's Law has done is allow
a lot of the fundamental aspects of IT itself, things
that used to have to be embedded in hardware, servers,
storage drives, and so forth, to be turned into software.
Computing has become cheap enough that you can virtualize
all sorts of stuff that used to be hard capital assets.
And once you can do that, you can begin to have big,
centralized plants that are highly virtualized and
extraordinarily flexible in serving many different people,
doing many different things simultaneously with the
infrastructure.
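To give a feel for why that matters economically, here is a hedged sketch of the consolidation arithmetic that virtualization makes possible; every number in it is an assumption for illustration, not a measurement:

```python
# Illustrative consolidation arithmetic for virtualization: if each
# dedicated server averages 20% utilization, repacking those workloads
# as virtual machines onto shared hosts run at 80% utilization needs a
# quarter of the machines. All numbers are assumptions for illustration.
import math

dedicated_servers = 1_000
dedicated_utilization = 0.20   # average load per dedicated box
virtualized_target = 0.80      # target utilization on shared hosts

total_work = dedicated_servers * dedicated_utilization   # total load, in "server units"
hosts_needed = math.ceil(total_work / virtualized_target)

print(f"{dedicated_servers} dedicated servers -> {hosts_needed} virtualized hosts")
# prints: 1000 dedicated servers -> 250 virtualized hosts
```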
But even as Moore's Law has been playing out, there's been
a big problem, and that is that there was no distribution
grid available to supply highly
sophisticated IT services.
Andy Grove, another Intel person, its CEO in the 1990s, observed somewhere around 1995 that the capacity of data communications networks only doubled every century.
And he was kind of ticked off about this at the time.
He meant it as an insult to telephone companies, because he thought they were dragging their feet in modernizing their networks.
But nevertheless, it gets across a fundamental point about corporate computing up until recently, which was that if you wanted to tap into the new power and new applications being opened up by Moore's Law, you had no choice but to build it, run it, and invest in it yourself, because there was no distribution grid available that could supply those services with the kind of reliability and responsiveness you required.
But what we've seen over the last 10 years as we've had the
build out of the fiber optic broadband network, is that in
essence Grove's Law has been repealed.
And finally, the capacity of the network, the capacity of
the grid, is catching up to the speed, the useful speed
and power, of the computer.
And when that happens, it changes all of the trade-offs that we used to assume were in place with IT.
It means that even the most sophisticated IT services,
whether it's raw computing power, or storage capacity, or
applications, can suddenly be delivered from centralized
plants very efficiently, very effectively, from the cloud
over the new network.
Eric Schmidt, now obviously Google's CEO, but back in 1993, I think, Sun Microsystems' Chief Technology Officer, had what in many ways still strikes me as the simplest and best explanation of what's going on, and he predicted it however long ago that was, 16 years ago or so.
And he said when the network becomes as fast as the
processor, the computer hollows out and spreads across
the network.
That's exactly what cloud computing is doing.
We're moving from the World Wide Web, where you had static pages arrayed for you to go out and look at information, to what might be called the worldwide computer: a computer that all of us can tap into, whether we're at home, in business, or at school.
All the power and all the applications that we used to rely on our own servers or our own hard drives for, we can now get from this centralized, massive cloud, this worldwide computer.
But beyond the technology, I think what really tells us
that this shift is happening and it's for real is that, if
you look on the consumer side of computing,
it's already happened.
10 years ago, if you wanted to do something new with your PC, you'd go out and buy a new piece of software, bring it home on a CD-ROM or a DVD, put it in your optical drive, install it on your hard drive, make sure it was compatible with your operating system and your other applications, and upgrade it every couple of years.
You were a kind of amateur computer technician in your own home.
Today, that assumption is going away, and for young
people, it's gone away completely.
Now, if you want to do something new with your PC,
your assumption is you fire up your browser, go out online,
you find the data and the applications you require to do
whatever it is you're looking to do.
So as consumers at home, and certainly as students at school, we've already moved to cloud computing.
It's a revolution that really has already happened.
Much as happened with the PC back around 1980, consumers have made the choice about the future of computing, and now we're kind of waiting for businesses to catch up.
It's businesses that are following the trend rather than leading it.
And that's not meant as any kind of insult.
If you're a business, you can't just go out and throw
all of your information up on Facebook.
You have to actually think a little carefully about what
you let go and how you let it go.
But nevertheless, businesses now are beginning to make this
same shift that consumers have already made.
And I would argue that if you look across the IT business, not just at cloud suppliers like Google and Amazon Web Services and Salesforce and so forth, but even across the traditional IT players, what you see is that all the innovation, and most of the investment, is now going into the cloud model.
We're getting billions and billions of dollars invested now in building this new model, by all sorts of companies.
And what that tells us is that all of the things that
companies have to think about-- control, reliability,
and security--
are improving, and improving at a very rapid pace.
And I would argue that ultimately, this centralized system is going to exceed, by quite a large margin and on all of those criteria, what companies have been used to with the private, proprietary, kind of fragmented approach to computing that's predominated in the past.
We're seeing a number of different adoption models
begin to emerge, particularly among larger companies.
We're still I think in many ways in the experimentation
phase when it comes to big
businesses and cloud computing.
One of the adoption models is using the cloud not so much as
a central supplier of services, but as a model for
how you build and operate your own data center, so moving to
a highly virtualized, new kind of infrastructure.
And this is particularly attractive to some very large
companies who have a lot of computing scale
in their own business.
In essence, they can turn their IT operation into an
internal utility that runs at levels of efficiency much
higher than used to be possible--
at least, than has been possible in the last 20 to 30 years.
The second model is cloud as a supplement, which is already a
very popular model.
If you need new IT capabilities to launch a new unit or a new business, the cloud makes that cheap.
It makes it easy.
You don't have to dump a lot of capital in.
You can get in and out very, very quickly.
And I think this is really a mainstream kind of model for
large companies as they begin to move into the cloud.
The third model is simply cloud as replacement, and
that's already a model that is very attractive to smaller
businesses who are happy not to have to go out and buy
their own servers, set up their own data centers, hire
their own staff and so forth.
And over the next five or 10 years, larger and larger companies are going to take that model more and more seriously--
the cloud as a complete replacement for the internal
data center.
Then there are the last two models.
The fourth is cloud as democratizer.
Again, every kind of revolution in IT provides more
power to the individual, more computing power to the
individual.
So with the PC revolution, every person got a computer.
With cloud computing, every person gets
their own data center.
And we'll see almost certainly an explosion of innovation
based on that.
And finally, cloud as revolution.
Beyond the parameters of IT itself, as soon as you reduce the cost of computing and open up access to it, you provide new models of business, new ways to think about products, embedding networking into products and services, and so forth.
And again, we're just at the early stages of that model.
Let me end by trying to put what we're seeing in another
useful perspective, which is the perspective of disruptive
technology that the Harvard Business School professor
Clayton Christensen laid out a few years ago, and many of you
might have read his book, The Innovator's Dilemma.
What he noticed about disruptive technology is that it tends to follow the same pattern over and over again, and I think it's a pattern we're seeing today with cloud computing.
So you have a kind of traditional
technology in place--
he called it a sustaining technology, which is the red
line on this chart--
and its performance is going up at a steady pace.
And it serves the needs of even very sophisticated users
of the technology, the high end of the market.
And then you have a disruptive technology come along, the
blue line here.
And at first it's very rudimentary.
It only fulfills very basic needs.
Maybe individuals, maybe sole proprietors begin to use it.
But its pace of performance and improvement is much, much
faster than the old, traditional technology.
And what happens inevitably is that more and more users come to realize that they can use this new disruptive technology in place of the old one, and that it tends to be much cheaper, much more flexible, and to have other benefits.
And ultimately, the disruptive technology crosses the line of
performance demanded by even the most sophisticated user.
And at that point, we have a radical shift in the way the
technology is used.
The sustaining technology is abandoned.
Everybody moves to the disruptive technology.
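One way to see that crossover in miniature is with a toy model: give both lines a starting level and a fixed yearly improvement rate, and find the year the disruptive line passes the performance the high-end user demands. Every starting level and rate below is invented purely to mirror the shape of the chart:

```python
# A toy model of Christensen's chart: the sustaining technology (red
# line) starts ahead but improves slowly; the disruptive one (blue
# line) starts behind but improves fast. All starting levels and rates
# here are invented purely to mirror the shape of the chart.

sustaining, disruptive = 100.0, 10.0
sustaining_rate, disruptive_rate = 1.05, 1.40   # yearly improvement factors
demanded = 150.0   # performance demanded by the most sophisticated users

for year in range(1, 31):
    sustaining *= sustaining_rate
    disruptive *= disruptive_rate
    if disruptive >= demanded:
        print(f"year {year}: disruptive ({disruptive:.0f}) passes the demanded "
              f"level ({demanded:.0f}); sustaining is at {sustaining:.0f}")
        break
```

With these made-up numbers the blue line crosses the demanded level in under a decade, which is exactly the kind of speed that catches incumbents by surprise.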
The great challenge here, and the great danger, is that if
you're a company sitting on that red line, sitting on that
sustaining technology line, and you look down at the
disruptive technology, through this whole period, you can
easily dismiss it and say, oh, that's for smaller companies, or less sophisticated companies, or that's just something my kids use.
And you can be taken by surprise, and ultimately, you become the kind of company, the kind of organization, that is disrupted and pushed to the side by this new model, whether you're supplying the technology or using it.
So I think the big question, the big challenge out there,
as we go through this fundamental change in the
nature of IT that every company has to deal with, is
whether you're going to be a disruptor,
or one of the disrupted.
And I think I can tell you one thing that I'm sure you know,
that playing the role of the disruptor is much more fun
than playing the other role.
So thank you very much.
Thanks.
[APPLAUSE]