Much of the daily material of our lives is now dematerialized and outsourced to a far-flung, unseen network. The tilting CD tower gives way to the MP3-laden hard drive which itself yields to a service like Pandora, music that is always “there,” waiting to be heard.
“There” is nowadays likely to be increasingly large, powerful, energy-intensive, always-on and essentially out-of-sight data centers. These centers run enormously scaled software applications with millions of users. To appreciate the scope of this phenomenon, and its crushing demands on storage capacity, let me sketch just the iceberg’s tip of one average individual’s digital presence: my own. I have photos on Flickr (which is owned by Yahoo, so they reside in a Yahoo data center, probably the one in Wenatchee, Wash.); the Wikipedia entry about me dwells on a database in Tampa, Fla.; the video on YouTube of a talk I delivered at Google’s headquarters might dwell in any one of Google’s data centers, from The Dalles in Oregon to Lenoir, N.C.; my LinkedIn profile most likely sits in an Equinix-run data center in Elk Grove Village, Ill.; and my blog lives at Modwest’s headquarters in Missoula, Mont. If one of these sites happened to be down, I might have Twittered a complaint, my tweet paying a virtual visit to (most likely) NTT America’s data center in Sterling, Va. And in each of these cases, there would be at least one mirror data center somewhere else — the built-environment equivalent of an external hard drive, backing things up.
Small wonder that this vast, dispersed network of interdependent data systems has lately come to be referred to by an appropriately atmospheric — and vaporous — metaphor: the cloud. Trying to chart the cloud’s geography can be daunting, a task that is further complicated by security concerns. “It’s like ‘Fight Club,’ ” says Rich Miller, whose Web site, Data Center Knowledge, tracks the industry. “The first rule of data centers is: Don’t talk about data centers.”
INSIDE THE CLOUD
Microsoft’s data center in Tukwila, Wash., sits amid a nondescript sprawl of beige boxlike buildings.
After submitting to biometric hand scans in the lobby and passing through a sensor-laden multidoor mantrap, Manos and I entered a bright, white room filled with librarylike rows of hulking, black racks of servers — the dedicated hardware that drives the Internet. Like most data centers, Tukwila comprises a sprawling array of servers, load balancers, routers, firewalls, tape-backup libraries and database machines, all resting on a raised floor of removable white tiles, beneath which run neatly arrayed bundles of power cabling.
Tukwila is less a building than a machine for computing. “You look at a typical building,” Manos explained, “and the mechanical and electrical infrastructure is probably below 10 percent of the upfront costs. Whereas here it’s 82 percent of the costs.”
THE RISE OF THE MEGA-DATA CENTER
Data centers were not always unmarked, unassuming and highly restricted places. In the 1960s, in fact, huge I.B.M. mainframe computers commanded pride of place in corporate headquarters. “It was called the glasshouse,” says Kenneth Brill, founder of the Uptime Institute, a data-center research and consulting group. “It was located near the executive suite. Here you’d spent $15 to 30 million on this thing — the executives wanted to show it off.”
Over the past few decades, Brill notes, there has been an oscillation between using centralized I.T. resources and using more dispersed computing power — a battle between mainframes and desktop computers. The latest iteration is what’s called the thin client: the use of centralized servers, rather than the software and operating systems of desktops, to handle the bulk of computing tasks. But thinness in the office has come with increased thickness elsewhere: more servers in ever-larger data centers.
In his book “The Big Switch,” Nicholas Carr draws an analogy between the rise of mega-data centers and the Industrial Revolution. Just as nascent industries, once powered by water wheels, were by the 20th century able to “run their machines with electric current generated in distant power plants,” advances in technology and transmission speeds are permitting computing to function like a utility, a distant but ever-accessible cloud of services.
This has sweeping implications for business and society. Instead of buying software and hiring I.T. employees, companies can outsource things like customer relationship management, or C.R.M., the database software that companies use to track client interactions, to an Internet company like salesforce.com, which sells subscriptions, or seats, to its services. “Customers who have two seats on salesforce.com, like a mom-and-pop flower shop, have access to the same application as a customer that has 65,000 seats, like Starbucks or Dell,” Adam Gross, vice president of platform marketing with salesforce.com, told me at the company’s offices in San Francisco. By contrast, just a few years ago, he went on, “if you were to attack a really large problem, like delivering a C.R.M. application to 50,000 companies, or serving every single song ever, it really sort of felt outside your domain unless you were one of the largest companies in the world. There are these architectures now available for anybody to really attack these massive-scale kinds of problems.”
THE ANNIHILATION OF SPACE BY TIME
... It seemed heretical to think of Karl Marx. But looking at the roomful of computers running automated trading models that themselves scan custom-formatted machine-readable financial news stories to help make decisions, you didn’t have to be a Marxist to appreciate his observation that industry will strive to “produce machines by means of machines” — as well as his prediction that the “more developed the capital,” the more it would seek the “annihilation of space by time.”