This article originally appeared in issue 099 of ICON magazine. It’s about the architecture, aesthetics and perception of datacenters. Mentioned: Apple, Google, Facebook, Telehouse West (including interview with the architect, YRM), Andrew Blum, the SYNDC, William Gibson and more.
When Steve Jobs stepped on stage to thunderous applause at Apple’s Worldwide Developer Conference on June 6th, he announced a new product which everyone in the room had been expecting: iCloud, a service for seamlessly syncing users’ data, documents, music and photographs between their devices. But he also did something unexpected, something rarely done in the world of network technologies: he pulled back the curtain to reveal what “the cloud” really is.
“If you don’t think we’re serious about this”, said Jobs, referring to the iCloud product offering, “This is our third datacenter. It’s in Maiden, North Carolina. This is what it looks like.” The slide behind him showed a vast, windowless and ground-hugging white building, set in a deep forest, the size of several football fields and abutted by squat round cooling towers. “It’s a pretty large place,” he went on to say, “full of stuff. Very expensive stuff.” [1]
The idea of “the cloud” is almost as old as the internet; indeed, it is one conception of the internet, as a ubiquitous, pervasive network of access points and data services, of computation as a public utility. But the reality of the cloud, of the internet itself, is that it is a physical infrastructure of cables which run beneath streets and oceans, connecting exchanges and switches to servers in offices, homes—and datacenters.
The fragility of this network has been emphasised by recent events. In January, the late Egyptian regime effectively cut the country off from the internet with a few phone calls to the small number of licensed Internet Service Providers which control virtually all the connections in and out of the country. In February 2008, a ship attempting to drop anchor in bad weather in the Mediterranean accidentally sliced through the Flag Europe-Asia and Sea-Me-We 4 fibre-optic cables, which between them carry 75% of all traffic to the Middle East and South Asia, a region with over 75 million internet users. [2] In April of this year, a 75-year-old Georgian woman scavenging for copper to sell as scrap accidentally severed the main fibre link to neighbouring Armenia. Georgia supplies 90% of Armenia’s connectivity, and the so-called “spade-hacker” plunged 3.2 million Armenians, and a significant number of Georgians and Azerbaijanis, into data darkness for over five hours. [3]
Andrew Blum, a writer for Wired who is currently writing a book about the physical infrastructure of the internet, calls these physical points in the network “choke points”: geographical locations where networks of networks connect to one another “through something as simple and tangible as a yellow-jacketed fiber-optic cable”. [4] Many of these networks meet in nominally neutral carrier hotels, and in more overtly private datacenters like the one which Jobs so dramatically revealed.
One such carrier hotel is Terremark’s ‘NAP of the Americas’ in Miami, Florida. Terremark is a multinational datacenter and network infrastructure provider, and the NAP, or network access point, provides meeting points for 160 networks, switches the majority of South America, Central America and the Caribbean’s digital traffic with the rest of the world, and hosts one of the thirteen root servers of the internet’s domain name system (DNS), the critical service which translates human-readable domain names into the numerical IP addresses machines use to route traffic. The NAP is also a highly secure 750,000 square foot fortress, with seven-inch-thick steel-reinforced concrete exterior panels designed to withstand a Category 5 hurricane and a 100-year storm. Situated in a downtown location, some seven storeys high and topped by golf-ball radomes, the NAP is not easy to hide, but like many of its kind it is unmarked by corporate logos, and photography is strongly discouraged. [5]
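That translation is easy to observe first-hand. The short Python sketch below, using only the standard library, asks the operating system’s resolver (which ultimately depends on those thirteen root servers) to turn a hostname into routable addresses; the hostnames are purely illustrative, and the answers will vary by network and moment.

```python
# A minimal sketch of the lookup DNS performs: turning a human-readable
# hostname into the numerical IP addresses that actually route traffic.
import socket

def resolve(hostname):
    """Return the unique IP addresses the system resolver finds for a hostname."""
    results = socket.getaddrinfo(hostname, None)
    # Each result is (family, type, proto, canonname, sockaddr); the address
    # string is the first element of sockaddr for both IPv4 and IPv6.
    return sorted({info[4][0] for info in results})

if __name__ == "__main__":
    for host in ("www.telehouse.net", "www.terremark.com"):  # illustrative hosts
        print(host, "->", ", ".join(resolve(host)))
```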
Google, which recently bought 111 8th Avenue in New York, another similar and nominally independent facility, has spent the last decade building the single largest network of datacenters on Earth, estimated in 2008 to comprise 12 significant installations in the United States, with another three under construction, and at least five in Europe. [6] But Google is notoriously secretive about these locations, to the extent of obscuring its facilities on its own mapping and satellite viewing applications. In a Harper’s Magazine article on Google’s datacenter in The Dalles, Oregon, Ginger Strand wrote that the blueprints for such buildings “are proof that the Web is no ethereal store of ideas, shimmering over our heads like the aurora borealis. It is a new heavy industry, an energy glutton that is only growing hungrier.” [7]
Datacenters have evolved dramatically from their origins as boxy rooms housing single large mainframes at the dawn of the computing era. The modern datacenter incorporates back-up power generators, state-of-the-art inert-gas fire suppression systems, multiple telecommunications network connections, rack upon rack of quietly humming servers, switches and modems, and the electrical and water supplies required to keep them all running under optimal conditions with minimal human presence or interaction.
But the visible architecture of the datacenter, the envelope, has changed little: typical examples are nondescript office buildings with mirrored or shuttered windows, deliberately dull to the point of deflecting unsought attention; or vast, distripark-style groundscrapers of the kind unveiled by Jobs: the size of football fields, but marked with few clues as to their actual functions, the epitome of the big box.
Counterexamples are rare. In 2008, Swedish ISP Bahnhof opened ‘Pionen’, a datacenter located a hundred feet underground in a former nuclear bunker in the centre of Stockholm. Bahnhof deliberately styled the facility after James Bond films and 1970s science fiction, with greenhouses, waterfalls, German submarine engines and klaxons, in order to stand out in a discreet industry: “The unique design makes it a ‘talk about’ facility,” said Bahnhof CEO Jon Karlung. “If you have been inside Pionen you will for sure tell somebody else about it.” [8]
Citigroup’s LEED-certified datacenter in Frankfurt, designed by Arup, announces its presence with a vast “green wall” irrigated with recycled cooling water [9], while HSBC’s South Yorkshire National Data Centre (SYNDC), just off Junction 36 of the M1 and built by Midland Bank in the mid-1970s, was modelled on a supertanker, complete with a command-and-control bridge joining the two main computer rooms and tall green cooling funnels erupting from well-tended lawns. [10, 11] Its rounded corners and white facades place it somewhere between the Bauhaus and Hubbard, Ford and Partners’ GRP-clad Mondial House on the Thames, from the same period: once Europe’s largest international telecommunications complex, its bold design expressing its technological function. [12]
But Pionen is accessed via a set of thick steel doors recessed into a cliff face, and the SYNDC is screened from the road by tall trees and fences, with inquisitive passers-by warned off by security; posters on local Sheffield messageboards refer to it as a conspiracy-laden Teletubbyland. [13] Even these structures do their best to efface themselves.
The main reason for this, says YRM’s Iain McDonald, is security, on three fronts: terrorism, industrial espionage and theft. The legitimacy of the last is confirmed by news reports, such as the break-ins at Level3’s Braham Street facility in March 2006, when thieves absconded with a valuable router and brought down a major London network in the process, or a burglary at Easynet’s Brick Lane datacenter in the same year, in which equipment worth an estimated £6 million was loaded into the back of a van and spirited away. [14] However, it is hard to equate discretion with real security when building owners and locations are easily discoverable on the web and in council records.
YRM has recently completed Telehouse West, a flagship facility at Telehouse’s data campus in Docklands, East London. Nine storeys high, with 19,000 square metres of technical and customer space, the building stands out from other datacenters, including the existing Telehouse and Global Switch facilities on the same site, and not just for its technical provisioning. [15, 16]
Telehouse West’s distinctive, windowless envelope incorporates a “disruptive pattern”, breaking up what would otherwise be monocolour facades with a series of tones drawn from a silver-grey palette, resembling nothing so much as the pixelation of low-resolution imagery: the aesthetic of the network itself. Together with high-quality cladding, expressed cross-bracing and angled louvres which create a “crown” of visual interest, Telehouse West attempts to balance what McDonald calls “a Lloyd’s-type building which expresses its services” with “an aesthetic quality”.
McDonald cites William Gibson’s ‘Pattern Recognition’, a novel concerned with the human tendency to see patterns in meaningless data and the tension between art and corporatisation, as an influence, as well as the way in which films like Blade Runner reset the urban landscape from a Logan’s Run-inspired modernism to a “dirty hybridity”. He sees this era coming to an end as corporations seek to use architecture as branding, as at Mercedes-Benz World at Brooklands, an Aukett Fitzroy Robinson-designed scheme offering automotive consumers a range of experiences, from galleries to circuit driving to retail.
“You could design a datacenter and, depending what you clad it in, you might be hard pushed to see it as that different from an art gallery,” he says. New constructions like David Chipperfield’s Turner Contemporary in Margate are morphing into digital content institutions, sharing the datacenter’s challenges of managing complex internal requirements (lighting, atmosphere and temperature control) while projecting the appropriate brand values.
What is at stake is the way in which architects help to define and shape the image of the network to the general public. Datacenters are the outward embodiment of a huge range of public and private services, from banking to electronic voting, government bureaucracy to social networks. As such, they stand as a new form of civic architecture, at odds with their historical desire for anonymity.
Facebook’s largest facility is its new datacenter in Prineville, Oregon, tapping into the same cheap electricity which powers Google’s project in The Dalles. The social network of more than 600 million users is instantiated as a 307,000 square foot site which currently employs over 1,000 construction workers, a workforce that will dwindle to just 35 permanent jobs once the centre is operational. But in addition to the $110,000 a year Facebook has promised to local civic funds, and a franchise fee for power sold by the city, comes a new definition of datacenters and their workers, articulated by site manager Ken Patchett: “We’re the blue collar guys of the tech industry, and we’re really proud of that. This is a factory. It’s just a different kind of factory than you might be used to. It’s not a sawmill or a plywood mill, but it’s a factory nonetheless.” [17]
This sentiment is echoed in McDonald’s description of “a new age industrial architecture”, of cities re-industrialised rather than trying to become “cultural cities”, a modern Milan emphasising the value of engineering and the craft and “making” inherent in information technology and digital real estate.
The role of the architect in the new digital real estate is to work at different levels, in McDonald’s words “from planning and building design right down to cultural integration with other activities.” The cloud, the network, the “new heavy industry”, is reshaping the physical landscape, from the reconfiguration of Lower Manhattan to provide low-latency access to the New York Stock Exchange, to the tangles of transatlantic fibre cables coming ashore at Widemouth Bay, an old smugglers’ haunt on the Cornish coast. A formerly stealthy sector is coming out into the open, revealing a tension between historical discretion and corporate projection, and bringing with it the opportunity to define a new architectural vocabulary for the digitised world.
- [1] http://events.apple.com.edgesuite.net/11piubpwiqubf06/event/
- [2] http://www.guardian.co.uk/business/2008/feb/01/internationalpersonalfinancebusiness.internet
- [3] http://www.guardian.co.uk/world/2011/apr/06/georgian-woman-cuts-web-access
- [4] http://www.theatlantic.com/technology/archive/2011/01/tunisia-egypt-miami-the-importance-of-internet-choke-points/70415/
- [5] http://www.terremark.com/technology-platform/nap-of-the-americas.aspx
- [6] http://www.nytimes.com/2006/06/14/technology/14search.html
- [7] http://harpers.org/media/slideshow/annot/2008-03/index.html
- [8] http://royal.pingdom.com/2008/11/14/the-worlds-most-super-designed-data-center-fit-for-a-james-bond-villain/
- [9] http://www.datacenterknowledge.com/archives/2009/04/24/citi-frankfurt-center-is-leed-platinum/
- [10] http://www.flickr.com/photos/benterrett/5544469736/
- [11] http://www.bing.com/maps/?v=2&cp=sxfbzfgw835v&lvl=18.068539402367243&dir=174.52816108477853&sty=b&eo=0&form=LMLTCC
- [12] http://www.nothingtoseehere.net/2006/08/mondial_house_london.html
- [13] http://www.sheffieldforum.co.uk/archive/index.php/t-37739.html
- [14] http://russ.garrett.co.uk/2009/03/12/datacenter-security-a-cautionary-tale/
- [15] http://www.telehouse.net/telehouse-west/
- [16] http://www.flickr.com/search/?w=36624593@N00&q=telehouse
- [17] http://www.ktvz.com/news/26827169/detail.html
Photograph by Nick Rochowski / Telehouse. Used with permission.