Reality Bytes

[Image: Kelowna Gigacenter. Photo by Lionel Trudel]

It’s a hazy summer day when I tour the former Western Star truck factory in Kelowna with Brian Fry, vice-president of RackForce Gigacenter. As we talk, Fry’s teenaged son and his girlfriend skip down to the other end of the cavernous 150,000-square-foot building, practically disappearing into the horizon. Ghosts of the old factory that once built 31 rigs a day still haunt the place; the signs pointing to emergency power shut-offs remain in place and the assembly lines are still marked in paint on the floor. Since Western Star moved to Portland in 2002, the building has sat as little more than a storage facility for boats and recreational vehicles, and apart from its sheer size there are few hints that this will soon be home to a computing revolution. But that is just what RackForce Networks Inc. and its partner, IBM Canada Ltd., are aiming for. By mid-2009 they plan to transform this old-economy relic into Canada’s largest data centre, the $75-million Gigacenter – filling it, row upon row, with computer server racks, cooling systems and the power systems to run it all.

Most of us don’t think about cyberspace in terms of geography – about an actual place where email folders live, websites are stored or where the reams of data we generate with every transaction ultimately pile up. We click “send” and stuff just flies into the ether. But as the world creates, dishes out and stores information in ever-increasing quantities, the question of where all that data resides has become critically important. IT research firm IDC estimates the world created a total of 281 exabytes – or 281 billion gigabytes – of data in 2007. By 2011 that figure is predicted to reach 1,800 exabytes, with Switzerland’s $6.4-billion Large Hadron Collider (the world’s biggest particle physics project, with 10,000 scientists from 100 countries collaborating to smash subatomic particles against each other at nearly the speed of light) expected to generate a stunning 300 exabytes on its own: more data than can fit on all the hard drives, DVDs and other storage media on the planet.
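The scale of those figures is easier to grasp with a quick back-of-the-envelope check. The snippet below simply restates the article's numbers in decimal storage units (1 exabyte = one billion gigabytes, the convention IDC uses); it is an illustration, not additional data.

```python
# Back-of-the-envelope check of the article's data-volume figures.
# Decimal (SI) units, as IDC uses: 1 exabyte = 10**9 gigabytes.
EXABYTE_IN_GB = 10**9

world_2007_eb = 281     # world data created in 2007, per IDC
world_2011_eb = 1800    # IDC's 2011 projection
lhc_eb = 300            # Large Hadron Collider's expected output

print(world_2007_eb * EXABYTE_IN_GB)  # → 281000000000 gigabytes, as stated
print(world_2011_eb / world_2007_eb)  # growth factor of roughly 6.4x in four years
```

Notably, the collider's projected 300 exabytes exceeds the entire world's 2007 output of 281.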

The most radical part of this information revolution, however, is the shift toward “cloud” computing. Decades ago most computer processing was done on big mainframe computers; as technology improved, machines got smaller, leading to the rise of the PC. Now, as communications technology improves to the point where broadband – and, more recently, wireless broadband networking – is becoming ubiquitous, much of the computing power is migrating back to centralized machines. Software, storage and processing muscle are moving away from desktop computers and into massive server facilities to handle the burgeoning demand for applications such as Google Docs, Web 2.0 sites such as YouTube and Facebook, and storage and sharing sites such as Flickr. YouTube alone streams 1,000 gigabytes of data – enough to fill the biggest hard drive you can buy at your local computer retailer – every second. Business applications sold as “software as a service” are a hot trend too. The proliferation of bandwidth has allowed users to outsource storage and computing power to these web-based companies, helping create a computing universe where users are better connected and yet can be more mobile. We now do many things on a low-powered hand-held device that we couldn’t do just a few years ago on the most powerful desktop PC because our gadgets can harness a galaxy of powerful, networked servers to do the heavy lifting. Applications and data crunching and storage no longer have to live on our devices but are instead scattered across the globe, accessible via networks, hovering in the amorphous “cloud.”
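To put that per-second figure in perspective, scaling it up to a full day shows the kind of volume a single cloud service moves. This is simple arithmetic on the article's own number, nothing more.

```python
# The article's YouTube figure -- 1,000 gigabytes streamed per second --
# scaled up to one day, to illustrate the volumes the "cloud" handles.
gb_per_second = 1_000
seconds_per_day = 24 * 60 * 60          # 86,400 seconds
gb_per_day = gb_per_second * seconds_per_day
print(gb_per_day)                       # → 86400000 GB, about 86.4 petabytes a day
```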

This cloud isn’t weightless, though, hence the worldwide race to build giant bricks-and-mortar “server farms” such as Gigacenter. Traditionally, racks of servers were found in the basement of office towers or in companies’ back closets, but the sheer electrical power these machines consume, and the cooling they require, are driving a migration toward more efficient server farms. “It’s much more efficient for a big operator to run these things,” says Jonathan Koomey, a project scientist at the Lawrence Berkeley National Laboratory in California. “These efficiencies can be quite substantial. You’re talking about factors of thousands.”

When it’s up and running, Gigacenter will have the capacity to draw as much as 40 megawatts of electricity, or enough to power more than 40,000 homes. That’s to process and store about 35,000 terabytes of data – a number that could grow as technology improves. Koomey’s research found that worldwide power consumption by computer servers doubled between 2000 and 2005. Data centres in the United States now consume 1.5 per cent of that country’s electricity, according to the U.S. Environmental Protection Agency, or about the same as all the television sets in that country combined; globally, data centres use about one per cent of the world’s electricity. So companies are scouring the planet for the optimal places to build their farms – locations where there is cheap land, few natural disasters and, most critically, bargain rates for power. In this environmentally conscious era, proximity to clean power sources – predominantly hydroelectricity, which is relatively cheap – is key.
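The 40-megawatts-for-40,000-homes comparison implies a tidy rule of thumb that's worth making explicit: about one kilowatt of average draw per household. The check below just does that division using the article's figures.

```python
# Sanity check of the power comparison: 40 MW spread across 40,000 homes
# implies about 1 kW of average draw per home, a common planning rule of thumb.
capacity_mw = 40
homes = 40_000
kw_per_home = capacity_mw * 1_000 / homes  # megawatts -> kilowatts, then per home
print(kw_per_home)                         # → 1.0
```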

Because of the industry’s unique power demands, a new wave of giant server farms is rising in places far from traditional IT hubs – most prominently along the banks of the Columbia River. With some 35 major dams harnessing the 2,000-kilometre-long river and its tributaries in Montana, Idaho, Washington, Oregon and B.C., the river is luring colonies of farms on both sides of the border with its promise of green hydro power. Google is building three data centres, 69,000 square feet apiece, in The Dalles, Oregon – only an eight-hour drive south from Kelowna’s Gigacenter. Microsoft Corp., Yahoo Inc., Intuit Inc. and other computing titans are turning the tiny fruit-growing town of Quincy, Washington – a five-hour drive south – into perhaps the most powerful computing centre on Earth. Microsoft’s facility alone measures 460,000 square feet, and the company is hoping to expand. Other central Washington towns, such as Moses Lake and Wenatchee, are home to even more server farms, including Ask.com’s.

Yet the cream of this Columbia River basin crop might just be Kelowna. “It doesn’t get any better than B.C. for having power capacity that is green and low cost,” says Rick Ellery, IBM’s team leader in the province. Ellery, a 40-something man with the wiry features and build of a triathlete, is IBM’s point man for Gigacenter. IBM is using its technical expertise to build the centre and its worldwide connections to bring in larger, enterprise-class clients (they’re hesitant to reveal names, but mention Ritchie Bros. Auctioneers, the world’s biggest industrial auctioneers, as one), while RackForce will handle most of the day-to-day operations.

