Too long, so I'll quote the highlights I thought were interesting.
Although the evergreen mazes, mountain majesties, and always-on skiing surely play a role, two amenities in particular make this the perfect site for a next-gen data center. One is a fiber-optic hub linked to Harbour Pointe, Washington, the coastal landing base of PC-1, a fiber-optic artery built to handle 640 Gbps that connects Asia to the US. A glassy extension cord snakes through all the town's major buildings, tapping into the greater Internet through NoaNet, a node of the experimental Internet2. The other attraction is The Dalles Dam and its 1.8-gigawatt power station. The half-mile-long dam is a crucial source of cheap electrical power – once essential to aluminum smelting, now a strategic resource in the next phase of the digital revolution. Indeed, Google and other Silicon Valley titans are looking to the Columbia River to supply ceaseless cycles of electricity at about a fifth of what they would cost in the San Francisco Bay Area. Why? To feed the ravenous appetite of a new breed of computer.
Just last century – you remember it well, across the chasm of the crash – the PC was king. The mainframe was deposed and deceased. The desktop was the data center. Larry Page and Sergey Brin were nonprofit googoos babbling about searching their 150-gigabyte index of the Internet. When I wanted to electrify crowds with my uncanny sense of futurity, I would talk terascale (10 to the 12th power), describing a Web with an unimaginably enormous total of 15 terabytes of content.
Yawn. Today Google rules a total database of hundreds of petabytes, swelled every 24 hours by terabytes of Gmails, MySpace pages, and dancing-doggy videos – a relentless march of daily deltas, each larger than the whole Web of a decade ago. To make sense of it all, Page and Brin – with Microsoft, Yahoo, and Barry "QVC" Diller's Ask.com hot on their heels – are frantically taking the computer-on-a-chip and multiplying it, in massively parallel arrays, into a computer-on-a-planet.
The facility in The Dalles is only the latest and most advanced of about two dozen Google data centers, which stretch from Silicon Valley to Dublin. All told, they amount to a staggering collection of hardware, whose constituent servers number 450,000 by the lowest estimate.
The extended Googleplex comprises an estimated 200 petabytes of hard disk storage – enough to copy the Net's entire sprawling cornucopia dozens of times – and four petabytes of RAM. To handle the current load of 100 million queries a day, its collective input-output bandwidth must be in the neighborhood of 3 petabits per second.
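To put those figures side by side, here is a quick back-of-envelope conversion – a Python toy of mine, not anything from the article; the inputs are just the numbers quoted above and the constant names are made up:

```python
# Back-of-envelope conversions of the figures quoted above.
# The only inputs are the numbers in the paragraph; constant names are mine.

QUERIES_PER_DAY = 100e6        # "100 million queries a day"
SECONDS_PER_DAY = 24 * 60 * 60

queries_per_second = QUERIES_PER_DAY / SECONDS_PER_DAY
print(f"Average query rate: {queries_per_second:,.0f} queries/s")   # ~1,157/s

DISK_PETABYTES = 200           # "200 petabytes of hard disk storage"
RAM_PETABYTES = 4              # "four petabytes of RAM"
print(f"Disk-to-RAM ratio: {DISK_PETABYTES / RAM_PETABYTES:.0f}:1")  # 50:1
```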
Wasting what is abundant to conserve what is scarce, the G-men have become the supreme entrepreneurs of the new millennium. However, past performance does not guarantee future returns. As large as the current Google database is, even bigger shocks are coming. An avalanche of digital video measured in exabytes (10 to the 18th power, or 1,000 petabytes) is hurtling down from the mountainsides of panicked Big Media and bubbling up from the YouTubian depths. The massively parallel, prodigally wasteful petascale computer has its work cut out for it.
Ask.com operations VP Dayne Sampson estimates that the five leading search companies together have some 2 million servers, each continuously dissipating some 300 watts of heat, a total of 600 megawatts. These are linked to hard drives that dissipate perhaps another gigawatt. Fifty percent again as much power is required to cool this searing heat, for a total of 2.4 gigawatts. With a third of the incoming power already lost to the grid's inefficiencies, and half of what's left lost to power supplies, transformers, and converters, the total electricity consumed by major search engines in 2006 approaches 5 gigawatts.
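That chain of estimates is easier to follow written out step by step. A minimal sketch: the inputs are the article's own figures, but how the conversion losses stack up is my reading of the text, not something the article spells out:

```python
# Reconstructing the power arithmetic in the paragraph above.
# Inputs are the article's estimates; the loss-stacking is my interpretation.

servers = 2_000_000
server_watts = 300                                   # ~300 W of heat per server
server_load_gw = servers * server_watts / 1e9        # 0.6 GW
disk_load_gw = 1.0                                   # "perhaps another gigawatt"

it_load_gw = server_load_gw + disk_load_gw           # 1.6 GW
cooling_gw = 0.5 * it_load_gw                        # "fifty percent again as much"
facility_load_gw = it_load_gw + cooling_gw           # 2.4 GW

# "Half of what's left lost to power supplies, transformers, and converters":
# treating the 2.4 GW as the surviving half roughly doubles the draw, which
# lands near the article's "approaches 5 gigawatts"; grid losses push it higher.
grid_draw_gw = facility_load_gw / 0.5                # ~4.8 GW

print(f"Server load:   {server_load_gw:.1f} GW")
print(f"Facility load: {facility_load_gw:.1f} GW")
print(f"Total draw:    {grid_draw_gw:.1f} GW")
```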
That's an impressive quantity of electricity. Five gigawatts is almost enough to power the Las Vegas metropolitan area – with all its hotels, casinos, restaurants, and convention centers – on the hottest day of the year. So the annual operation of the world's petascale search machines constitutes a Vegas-sized power sump. In the next year or so, it could add a dog-day Atlantic City. Air-conditioning will be the prime cost and conundrum of the petascale era. As energy analysts Peter Huber and Mark Mills projected in 1999, the planetary machine is on track to be consuming half of all the world's output of electricity by the end of this decade.
The struggle to find an adequate supply of electricity explains the curious emptiness that afflicts some 30 percent of Ask.com's square footage. Why is the second-fastest-growing search engine one-third empty? "We ran out of power before we ran out of space," says Sampson.
Earlier this year, Sun introduced new products that can dispense the entire Internet from a few bread boxes – using, curiously enough, industry-standard AMD Opteron processors, cheap hard disks, and commodity RAM. The Sun Fire X4600 is a modular hybrid data server and storage facility. Stacking 655 of these machines together, the Tokyo Institute of Technology created a 38-teraflop machine that has been recognized as one of the world's fastest supercomputers. And with 1-terabyte drives, available next year, Sun's Andy Bechtolsheim will be able to pack the Net into three cabinets, consuming 200 kilowatts and occupying perhaps a tenth of a row at Ask.com. Replicating Google's 200 petabytes of hard drive capacity would take less than one data center row and consume less than 10 megawatts (for comparison, a typical US household uses about 10 megawatt-hours over an entire year).
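Sanity-checking that last claim with a rough sketch of my own: the drive count follows from the quoted 200-petabyte and 1-terabyte figures, but the per-drive power number is my assumption, not the article's:

```python
# Rough check of replicating 200 PB with 1 TB drives.
# Drive count follows from the quoted figures; ~10 W per drive is my guess.

google_disk_pb = 200                          # "Google's 200 petabytes"
drive_tb = 1                                  # "1-terabyte drives"
drives_needed = google_disk_pb * 1000 / drive_tb        # 200,000 drives

watts_per_drive = 10                          # assumed power per spinning drive
disk_power_mw = drives_needed * watts_per_drive / 1e6   # ~2 MW

print(f"Drives needed:        {drives_needed:,.0f}")
print(f"Estimated disk power: {disk_power_mw:.1f} MW")
```

At those assumed numbers the disk power alone comes in well under the 10-megawatt figure quoted above.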
Nuclear power plants just to run a single data center?