The Sound of Inevitability, Part 1
This is the first in a series of excerpts from The Executive's Guide to Cloud Computing (Wiley, 2010; available in hardcover and Kindle editions), a book that I recently co-authored with Eric Marks. In particular, this series will focus on the reasons why the transition to cloud computing is simply inevitable. The excerpts themselves are slightly edited to better fit this format. Enjoy!
There have been very few fundamental changes in computing.
On the surface, that may sound like the statement of a madman, or perhaps at least someone from an alternate universe. Nonetheless, it is true.
Sure, there have been, are, and will likely continue to be a nearly incomprehensible fire hose of particular changes, some rather flashy in and of themselves. Simple things like pocket-sized flash drives that store more than the corporate mainframes of 30 years ago, or ubiquitous mobile devices for everything from the mundanely practical—e-mail, calendars, and contacts—to the cheerfully sublime. Much more complex developments such as the open source movement, the advent of relational databases, and the rise (and fall) of whole operating systems and their surrounding ecosystems, even those whose perpetual dominance once seemed assured (how many desktop machines are running CP/M these days?). These have come and gone, perhaps lingering in some niche, forgotten by all but a few fanatical devotees.
But truly fundamental change—the tectonic shift that literally changes our landscape—happens only once in a long while, perhaps every ten or more years, even in the computing business. Fundamental change of this magnitude requires a number of smaller innovations to pile up until a true nexus is reached, and we all start marching down a different road.
Of course, as historians are fond of lecturing the rest of us mere mortals, these sorts of fundamental changes are nearly impossible to recognize while we are in the middle of them, even as they loom imminently.
When researchers at the University of Pennsylvania were feverishly working on ENIAC—generally recognized as the first programmable, general-purpose electronic computer—as the future of the world hung in the balance in the midst of World War II, do you think they envisioned computers embedded in nearly everything, from greeting cards to automobiles, from microwaves to MRIs? When researchers at the University of California, Los Angeles, and elsewhere in the midst of the Cold War strove to make computer networks more resilient in the face of nuclear attack, do you think any of them envisioned the Internet as we see it today? Likewise, when Tim Berners-Lee and other researchers at CERN were trying to come up with an easy way to create and display content over this new, literally nuclear-grade network, do you think they envisioned the impact on everyday life (both personal and professional) their new creation would have, or even the simple breadth and depth of stuff—from the sublime to the silly—that would be available on this new, supercharged "Internet"? One estimate is that there are more than 500 exabytes—that's 500 billion gigabytes—in this "digital universe," and that this will double every 18 months.
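To give that estimate some shape, here is a quick back-of-the-envelope projection. This is purely illustrative: the 500-exabyte starting figure and the 18-month doubling period are the estimates quoted above, not measured values.

```python
# Back-of-the-envelope growth of the "digital universe":
# 500 exabytes today, doubling every 18 months.

def digital_universe_eb(years, start_eb=500, doubling_months=18):
    """Projected size in exabytes after the given number of years,
    assuming steady exponential doubling."""
    return start_eb * 2 ** (years * 12 / doubling_months)

# Every 3 years is exactly two doublings (24 months / 18 months per doubling).
for y in (0, 3, 6, 9):
    print(f"after {y} years: {digital_universe_eb(y):,.0f} EB")
# after 0 years: 500 EB
# after 3 years: 2,000 EB
# after 6 years: 8,000 EB
# after 9 years: 32,000 EB
```

At that rate the digital universe grows by a factor of 64 in under a decade, which is the scale of growth the rest of this discussion has in mind.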
The simple truth is that very few, if any, of the people involved in these developments had much of an idea of the consequences of their creations, of the impact on our personal lives, our culture, even the society in which we live—from how we interact with our families to how we conduct business.
Whether you are "technologically modest," or are, by age or temperament, not ashamed to let it be known, at least in certain circles, that you are a bit of a geek—either way, it is pretty much a given that developments in computing are having a big impact on our society and, more to the point, an even bigger impact on how we conduct our business.
And bigger changes—tectonic-shift scale changes—will have at least commensurate impact on our lives in every dimension, including the world of commerce. One example, perhaps a seemingly simple one, yet central to many of the changes now underway, will suffice to illustrate this point.
An Example for All to See
Consider for a moment newspapers. We now face the very real prospect—actually the near-certainty—of at least one (and probably many) major metropolitan area in the United States without a traditional (local, general purpose, print, widely circulated) newspaper. While this eventuality may be stayed—perhaps for quite some time—via government intervention, the fact that this will eventually occur is not in doubt. In a culture still echoing with such reporteresque icons as Clark Kent, or at least the more prosaic Bernstein and Woodward, this was once unthinkable. Now it is simply inevitable.
There was a time when the technology of newspapers—cheap newsprint (paper), high-volume printing presses, delivery networks including everything from trucks to kids on bicycles—was the only reasonable means for mass distribution of information. In fact, with help from some of the newer technologies there was even a new national newspaper (USA Today) founded in the United States as late as 1982. But with the advent of alternative delivery channels—first radio, then broadcast, cable, and satellite television—increasing amounts of pressure were put on the newspapers.
The immediacy of the newer channels led to the widespread death of afternoon newspapers in most markets; anything delivered to the dinner table in a physical paper was hopelessly out of date with the evening news on television or radio. The morning papers had the advantage of broad coverage collected while most people slept, and as a result have held on longer.
However, at the same time intrinsic limitations of the newer technologies made them better for certain types of information, though not as useful for others. For example, a two-minute video from a war zone could convey the brutal reality of combat far more effectively than reams of newsprint, but did little to describe the complex strategic elements—political, economic, cultural—of the conflict itself. As a result, a certain stasis had been reached in which newspapers carved out what appeared to be a sustainable role in the delivery of news.
Then came the Internet.
In particular, the effectively free and ubiquitous—and yes, near-instantaneous—delivery of all sorts of information mortally wounded the newspaper business. As the first round of the web ecosystem grew, the only remaining stronghold of the traditional newspapers—their ad-based revenue model—was made largely irrelevant. eBay, Craigslist, and freecycle (among others) replaced the classifieds, and online ads took out most of what was left.
Some newspapers will undoubtedly manage the transition in some manner or another, perhaps even emerging as something fairly recognizable—particularly national/international properties such as the Wall Street Journal and the previously mentioned USA Today—and perhaps even financially sound.
But those that do will likely largely do so without their original distribution technologies, and more important, many will not make the transition at all.
What Happens Next
All of this upheaval in news delivery—the enormous changes that have already occurred and those yet to come—has been enabled by developments in computing technologies and the widespread adoption of everything from the Internet to the iPhone. It is worth remembering that all of this has occurred largely without cloud computing; as a result, we are probably less than 10% of the way through this transition in news delivery. And this is only one industry—one example, with entire economies yet to transform.
Even so, some things have not changed much, even in the delivery of news. The computing infrastructures range from the stodgy (server-based, even mainframe-based systems within many newspapers) to circa-2009 state of the art (which we might as well start referring to as "legacy web," web 2.0, or old-school web). By and large these systems still cost too much to acquire, do not adapt to changes in demand nearly easily enough, are not reliable enough, and remain far too complex and costly to operate. Even the few systems that do not suffer from all of these problems are not ideal, to say the least: some are proprietary, and most are either too complex to build new application software for, or simply do not scale well enough, at least for the sort of software that researchers are hard at work developing. In particular, where the first generation of electronic news infrastructures focused on just delivering the news, the next generation will be focused on sifting through all of that content, looking for just the right stuff.
All of that sifting and sorting and searching will take orders of magnitude more computing capacity than we have anywhere today. How will we pay for hundreds, thousands, perhaps even tens of thousands of times more servers and storage than we have today—almost unimaginable quantities of computing? How will we operate them? Write new software for them? It is fair to wonder how we will even power all that gear. Assuming that all of these concerns are resolved, we will then face a larger question still, one which we presume has many answers: What sorts of business models are enabled by all this, and how do we get there?
This Scarcely Seems Possible
Before we leave this example, it is probably worth considering our present circumstances just a bit more. In particular, most of the history of both economics and engineering can be understood by thinking about managing scarcity. In other words, how do I get the most done with the least stuff, or within certain limits? That underlying drive to deal with scarcity, at its core, pushes the startup team to work harder and pay less, the Fortune 500 enterprise to optimize manufacturing processes, and entire nations to set energy policies. Allocating scarcity is just Economics 101. Of course, it is also Engineering 101. Dealing with scarcity drives engineers to develop better video compression schemes, to improve CPU designs so that more gets done in the same amount of time, and even to rethink server packaging to reduce power consumption and labor costs.
While scarcity may be the nemesis of some, it is quite literally a prime mover behind the developments that have together come to be known as cloud computing. What does this mean, and how can it be possible?
Copyright © 2010 Eric A. Marks and Roberto R. Lozano.
In the next installment we'll look at the underlying technological flow, and how that has made cloud computing possible. If you like what you've seen, keep in mind that the book is available now in hardcover and Kindle editions!