
Page 1: 2012 googles data centers

Source: http://www.google.com/about/datacenters/

Compiled By: Sumantri Hadi Suseno [email protected]

October 19, 2012

DATA CENTERS PHOTO ALBUM

Page 2: 2012 googles data centers

Google, which probably owns the largest number of computer servers in the world and a sizable chunk of the internet itself, has finally opened the doors of some of its data centers to the curious, prying eyes of outsiders.

Google operates a number of major data centers around the world, mostly in the US and Europe, which act as hubs for its smaller, regional installations.

• Council Bluffs, Iowa
• The Dalles, Oregon
• St Ghislain, Belgium
• Douglas County, Georgia
• Hamina, Finland
• Berkeley County, South Carolina
• Mayes County, Pryor, Oklahoma

Page 3: 2012 googles data centers

Inside our campus network room, routers and switches allow our data centers to talk to each other. The fiber optic networks connecting our sites can run at speeds that are more than 200,000 times faster than a typical home Internet connection. The fiber cables run along the yellow cable trays near the ceiling.

Council Bluffs, Iowa
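
For a rough sense of the "200,000 times faster" comparison in the caption above, here is a small back-of-the-envelope sketch in Python. The 10 Mbps figure for a typical 2012-era home connection is an assumption made for illustration; the caption only gives the multiplier.

    # Back-of-the-envelope arithmetic for the fiber-speed comparison.
    # The home-connection speed is an assumed value, not from the source.
    HOME_CONNECTION_MBPS = 10      # assumed typical home speed in 2012 (Mbps)
    SPEEDUP = 200_000              # multiplier quoted in the caption

    fiber_mbps = HOME_CONNECTION_MBPS * SPEEDUP
    print(f"Implied inter-site fiber capacity: {fiber_mbps:,} Mbps "
          f"(~{fiber_mbps / 1_000_000:.0f} Tbps)")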

Page 4: 2012 googles data centers

Council Bluffs data center provides over 115,000 square feet of space. We make the best out of every inch, so you can use services like Search and YouTube in the most efficient way possible. A sustainable data center starts with making our computers use as little electricity as possible.

Council Bluffs, Iowa

Page 5: 2012 googles data centers

Hovering above the floor in Council Bluffs, Iowa, the scale of our data center there begins to take shape. Huge steel beams both support the structure and help distribute power.

Council Bluffs, Iowa

Page 6: 2012 googles data centers

Plastic curtains hang in a network room inside our Council Bluffs data center. Here we serve up cold air through the floor, and the clear plastic barriers help keep the cold air in while keeping hot air out.

Council Bluffs, Iowa

Page 7: 2012 googles data centers

These colorful pipes are responsible for carrying water in and out of our Oregon data center. The blue pipes supply cold water and the red pipes return the warm water back to be cooled.

The Dalles, Oregon

Page 8: 2012 googles data centers

Our pipes aren't the only colorful things at our data centers. These cables are organized by their specific hue. On the floor, this can make things less technical: "Hand me a blue one."

The Dalles, Oregon

Page 9: 2012 googles data centers

As part of our commitment to keeping our users' data safe, we destroy all failed drives on-site. Hard disk sizes and models can vary. In this photo, you will notice some 2 TB drives.

St Ghislain, Belgium

Page 10: 2012 googles data centers

Insulated pipes like these have a U-bend (named for its shape) so they can expand and contract as the temperature of the fluid inside changes.

Douglas County, Georgia

Page 11: 2012 googles data centers

Thousands of feet of pipe line the inside of our data centers. We paint them bright colors not only because it's fun, but also to designate which one is which. The bright pink pipe in this photo transfers water from the row of chillers (the green units on the left) to an outside cooling tower.

On average, two gallons of water are consumed for every kilowatt-hour of electricity produced in the US. By using less electricity to power our computing infrastructure, we also save fresh water. Every year our efficient data centers save hundreds of millions of gallons of drinking water simply by consuming less electricity.

Douglas County, Georgia
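
The caption above combines two figures: roughly two gallons of water per kilowatt-hour of US electricity, and "hundreds of millions of gallons" saved each year. A minimal sketch of that arithmetic follows; the annual electricity savings used here is a hypothetical number chosen only to show how the two figures relate.

    # Water-savings arithmetic implied by the caption.
    GALLONS_PER_KWH = 2                       # from the caption
    assumed_kwh_saved_per_year = 150_000_000  # hypothetical efficiency savings (kWh)

    water_saved_gallons = assumed_kwh_saved_per_year * GALLONS_PER_KWH
    print(f"Water saved: ~{water_saved_gallons:,} gallons per year")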

Page 12: 2012 googles data centers

These colorful pipes send and receive water for cooling our facility. Also pictured is a G-Bike, the vehicle of choice for team members to get around outside our data centers.

Douglas County, Georgia

Page 13: 2012 googles data centers

We keep pipes like these ready with highly pressurized water in case of a fire. This water, in particular, is cleaned and filtered so that if we use it, we don't contaminate the facility.

Douglas County, Georgia

Page 14: 2012 googles data centers

Blue LEDs on this row of servers tell us everything is running smoothly. We use LEDs because they are energy efficient, long lasting and bright.

Our servers lose only a fraction of the electricity they pull from the wall during power conversion. By closely monitoring this process and keeping power conversion efficient, we save over 500 kWh per server annually compared with a typical system.

Douglas County, Georgia
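
To give a feel for what the 500 kWh-per-server figure adds up to, here is a short illustrative calculation. The per-server savings is from the caption; the fleet size and electricity price are assumptions, since the source gives neither.

    # Fleet-wide view of the per-server savings quoted in the caption.
    KWH_SAVED_PER_SERVER = 500     # from the caption
    assumed_servers = 100_000      # hypothetical fleet size
    assumed_usd_per_kwh = 0.07     # hypothetical industrial electricity rate

    kwh_saved = KWH_SAVED_PER_SERVER * assumed_servers
    print(f"Fleet savings: {kwh_saved:,} kWh per year "
          f"(~${kwh_saved * assumed_usd_per_kwh:,.0f} per year)")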

Page 15: 2012 googles data centers

An overhead view of one of our cooling plants, where seawater from the Gulf of Finland entirely cools the data center there.

Hamina, Finland

Page 16: 2012 googles data centers

Server floors like these require massive space and efficient power to run the full family of Google products for the world. Here in Hamina, Finland, we chose to renovate an old paper mill to take advantage of the building's infrastructure as well as its proximity to the Gulf of Finland's cooling waters.

Hamina, Finland

Page 17: 2012 googles data centers

These colorful pipes carry water. Three of our data centers, like this one in Finland, run on 100% unprocessed or grey water. The idea behind this is simple: instead of depending on clean, potable water, we use alternative sources of water and clean it just enough so it can be used for cooling. This water still needs to be processed, but treatment for data center use is much easier than cleaning it for drinking.

With over 100,000 square feet of space in this data center, we had to create a numbering system so team members know where they are at all times. What would you say here? “1A.” This is the starting point of the grid underneath our cooling towers in Hamina, Finland.

Hamina, Finland
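
The "1A" starting point above suggests a simple row-and-column grid. Here is a minimal sketch of how such a labeling scheme could be generated; the grid dimensions are assumed, since the source only mentions the scheme itself.

    # Toy version of a row/column grid labeling scheme starting at "1A".
    from string import ascii_uppercase

    ROWS, COLS = 10, 8   # assumed grid dimensions, not from the source
    grid_labels = [f"{row}{ascii_uppercase[col]}"
                   for row in range(1, ROWS + 1)
                   for col in range(COLS)]

    print(grid_labels[0])    # '1A'  -- the starting point named in the caption
    print(grid_labels[-1])   # '10H'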

Page 18: 2012 googles data centers

These Ethernet switches connect our facilities network. Thanks to them, we are able to communicate with and monitor our main controls for the cooling system in our data center.

Berkeley County

Page 19: 2012 googles data centers

In case anything should happen to our data, we have it all backed up. One of the places we back up information is here in our tape library. Robotic arms (visible at the end of the aisle) assist us in loading and unloading tapes when we need to access them.

Berkeley County

Page 20: 2012 googles data centers

This is a closer view of the backup tapes in our tape library. Each tape has a unique barcode so our robotic system can locate the right one. Unlike a real library, you can't check out anything, but if you try, we have a security team standing by.

Berkeley County
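
The barcode system described above amounts to a lookup from a tape's barcode to its physical location. A toy sketch of that idea follows; the barcodes, aisles, and slots are invented for illustration and are not details from the source.

    # Hypothetical barcode-to-location lookup for a robotic tape library.
    tape_slots = {
        "TAPE0001L5": ("aisle 3", "rack 12", "slot 40"),
        "TAPE0002L5": ("aisle 3", "rack 12", "slot 41"),
        "TAPE0003L5": ("aisle 7", "rack 02", "slot 05"),
    }

    def locate_tape(barcode: str):
        """Return the physical location the robotic arm should drive to."""
        return tape_slots.get(barcode)

    print(locate_tape("TAPE0002L5"))   # ('aisle 3', 'rack 12', 'slot 41')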

Page 21: 2012 googles data centers

Storage tanks like these can hold up to 240,000 gallons (900,000 liters) of water at any given time. This insulated tank holds water that we'll send to the heart of the data center for cooling.

Berkeley County
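
A quick check of the unit conversion in the caption above (US gallons to liters), included only to confirm that the two figures agree:

    # 240,000 US gallons expressed in liters.
    LITERS_PER_US_GALLON = 3.785
    gallons = 240_000
    print(f"{gallons:,} gallons ~ {gallons * LITERS_PER_US_GALLON:,.0f} liters")
    # ~908,400 liters, consistent with the ~900,000 liters in the caption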

Page 22: 2012 googles data centers

Each of our server racks has four switches, each connected by a different colored cable. We keep these colors the same throughout our data center so we know which switch to replace in case of failure.

Millions of feet of wire connect our servers worldwide and make sure you can access Google services like Search and Gmail.

Mayes County, Pryor, Oklahoma
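
The color-coding described above maps each of a rack's four switches to a distinct cable color, so a failure report points directly at one physical device. A minimal sketch of that mapping follows; the specific colors and rack naming are assumptions, not details from the source.

    # Hypothetical color-to-switch mapping for a rack with four switches.
    CABLE_COLORS = ["red", "blue", "green", "yellow"]   # assumed palette

    def switch_to_replace(rack_id: str, failed_switch_index: int) -> str:
        """Name the cable color to trace for a failed switch (index 0-3)."""
        color = CABLE_COLORS[failed_switch_index]
        return f"Rack {rack_id}: replace the switch on the {color} cable"

    print(switch_to_replace("MC-017", 2))
    # Rack MC-017: replace the switch on the green cable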

Page 23: 2012 googles data centers

A rare look behind the server aisle. Here hundreds of fans funnel hot air from the server racks into a cooling unit to be recirculated. The green lights are the server status LEDs reflecting from the front of our servers. We optimize our servers and racks to use minimal fan power, and the fans are controlled to spin only as fast as necessary to keep the server temperature below a certain threshold.

Mayes County, Pryor, Oklahoma
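
The caption above describes fans that spin only as fast as needed to keep server temperature below a threshold. A minimal sketch of that kind of proportional control follows; the threshold, gain, and duty-cycle limits are all assumed values, not figures from the source.

    # Simple proportional fan control: idle when cool, ramp up near the threshold.
    THRESHOLD_C = 80.0                 # assumed maximum allowed temperature
    MIN_DUTY, MAX_DUTY = 0.2, 1.0      # assumed fan duty-cycle limits
    GAIN = 0.05                        # assumed duty increase per degree C

    def fan_duty(temp_c: float) -> float:
        """Start ramping 15 C below the threshold; never exceed full speed."""
        error = temp_c - (THRESHOLD_C - 15)
        return min(MIN_DUTY + GAIN * max(error, 0.0), MAX_DUTY)

    for t in (55, 70, 78, 85):
        print(f"{t} C -> fan duty {fan_duty(t):.2f}")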