
The Demand For Sustainability Has Trickled Down To Data Centers

Sustainability is no longer a fad in commercial real estate development — it is standard practice throughout the industry.

Even if a project is not seeking LEED status, incorporating sustainable practices can attract top tenants and save a fortune in long-term operating costs. For data center operators, those savings can be substantial, given how much electricity is needed to power servers and cooling systems.

A data center server bank.

Gensler Senior Associate and Critical Facilities Practice Leader Bernie Woytek said his firm’s data center clients are eager to make their systems as efficient as possible.

Electrical costs at a typical data center can exceed $1M per month. Reducing that bill, whether through operational efficiencies or by incorporating renewable energy sources, can save thousands of dollars annually, or even hundreds of thousands at larger centers. By pairing more efficient cooling systems with virtualization to keep servers operating close to capacity, data center users can push those savings into seven figures.
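
As a rough, hypothetical illustration of the scale involved (the percentages below are assumptions, not figures from Gensler), even a modest efficiency gain against a seven-figure monthly power bill compounds quickly over a year:

```python
# Back-of-the-envelope savings estimate; all figures are illustrative only.
monthly_power_bill = 1_000_000  # dollars, per the ~$1M/month figure cited above

for efficiency_gain in (0.01, 0.05, 0.15):  # assumed 1%, 5%, 15% reductions in energy spend
    annual_savings = monthly_power_bill * 12 * efficiency_gain
    print(f"{efficiency_gain:.0%} reduction -> ${annual_savings:,.0f} saved per year")
```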

Virtualization works much the way RAM does on a computer: it places multiple applications on a single server. As one application’s usage ramps down, the server recognizes that it can take on more load from another app, allowing it to use power and space more efficiently.
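
A minimal sketch of that consolidation idea, assuming a simple capacity model (the Server class, capacity units and load numbers below are hypothetical, not any hypervisor's actual API):

```python
# Sketch of workload consolidation, the idea behind server virtualization.
# Names and numbers here are illustrative assumptions.

class Server:
    def __init__(self, capacity=100):
        self.capacity = capacity  # total compute the box can deliver
        self.apps = {}            # app name -> current load

    def headroom(self):
        return self.capacity - sum(self.apps.values())

    def place(self, app, load):
        """Accept another application only if spare capacity remains."""
        if load <= self.headroom():
            self.apps[app] = load
            return True
        return False

server = Server()
server.place("web_frontend", 40)
server.place("reporting_batch", 35)  # fits alongside the first app
print(server.headroom())             # remaining capacity -> 25
```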

Woytek said that virtualization goes back at least a decade.

“Now it’s the norm,” Woytek said.

A breakdown of four-tier application architecture in data centers.

ESD Consulting Senior Vice President Paul Schlattman said that even though virtualization is standard practice, data center end users such as cloud providers need more bandwidth, given the explosion of mobile and Internet of Things-connected devices.

Cloud providers have adopted four-tier application architecture, which allows apps to scale as demand shifts from desktop to mobile to IoT.
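
As a rough illustration of the idea, and assuming the four tiers break down along the lines of client, delivery, aggregation and services layers (the tier names and instance counts below are assumptions, not details from ESD or any provider), each tier can scale independently as traffic shifts:

```python
# Illustrative four-tier layout in which each tier scales on its own
# as demand moves among desktop, mobile and IoT clients. Hypothetical values.

tiers = {
    "client":      {"instances": 0},  # runs on the end device itself
    "delivery":    {"instances": 8},  # edge/CDN nodes closest to users
    "aggregation": {"instances": 4},  # APIs that combine backend services
    "services":    {"instances": 6},  # core business logic and data stores
}

def scale(tier, demand_multiplier):
    """Scale one tier independently of the others as demand changes."""
    tiers[tier]["instances"] = max(1, round(tiers[tier]["instances"] * demand_multiplier))

scale("delivery", 2.0)                 # e.g. a surge in mobile/IoT traffic
print(tiers["delivery"]["instances"])  # -> 16, with the other tiers untouched
```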

ESD was selected by Underwriters Laboratories to develop UL 3223, a new international data center certification program that provides end-user transparency, provider accountability and proper data center documentation to further mitigate operational risk.

The Better The Energy Reduction, The Higher The Score

Another way to reduce energy usage in data centers is to pay attention to a center’s power usage effectiveness (PUE) — the ratio of total energy consumed by a data center to the energy delivered to computing equipment. Schlattman said that this became increasingly important after a 2010 report by Greenpeace revealed that the IT industry and cloud computing sector are a growing source of greenhouse gas emissions.

An ideal PUE is 1.

Woytek said that 15 to 20 years ago, the PUE of an average data center was around 2, meaning that for every kilowatt going to the servers, an equal amount was directed to cooling, lighting and other non-computing uses.
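
In other words, PUE is total facility energy divided by the energy reaching the IT equipment. A quick worked example, using illustrative wattage figures rather than data from any specific facility:

```python
# PUE = total facility energy / energy delivered to computing equipment.
# The kW figures below are illustrative assumptions.

def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

print(pue(2000, 1000))  # 2.0 -- one kW of overhead per kW of IT load, as in older facilities
print(pue(1400, 1000))  # 1.4 -- closer to the range described for today's data centers
```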

Now PUEs are in the 1.3 to 1.5 range, thanks to a variety of factors. More data centers are being built in cooler climates where operators can take advantage of the ambient air to cool server rooms. Energy-efficient lighting systems are gaining popularity, and today’s servers are designed to operate at higher temperatures. Woytek said that the air used to cool some server rooms today can reach as high as 82 degrees.

Schlattman said that the servers used by the top five data center users — Amazon, Google, Microsoft, Facebook and Apple — are designed to operate at temperatures higher than 68 degrees. ESD designs data centers for three of those companies, and the average PUE in those centers is 1.4.

Cool Runnings

QTS' Chicago data center

Water conservation is also on the radar of data center operators and developers.

Schlattman said that ESD is designing more data centers with direct evaporative systems that, compared with air-cooled chiller systems, can run dry for long periods of time. New Continuum Data Centers Chairman Eli Scher said that his firm has built wells at its data center sites for cooling water underground and will put them into use at a later date.

Despite all of these practices, data centers still consume enormous amounts of electricity. Woytek said that renewable alternatives such as solar and wind are extremely inefficient, expensive and often too unreliable to incorporate into a data center site. The amount of land needed to generate enough solar power to operate a data center makes that option cost-prohibitive, and wind turbines deliver power irregularly, depending on ambient conditions.

Woytek said that fuel cell technology, which runs on natural gas, has shown promise. It is cleaner than conventional power plant generation, can produce electricity on-site and uses very little water. But scalability is still a factor: Schlattman said that ESD is designing many data centers with fuel cells in mind, but the cells have a capacity limit.

Gas turbine generators give data center operators another way to produce their own electricity. Schlattman said the prime source of electricity at one of his company's data centers in Glendale, Arizona, is a gas turbine generator powered by methane from a nearby landfill, capable of producing 12.5 megawatts. But redundancy is still needed in case the generator is down for maintenance or malfunctions.

Learn more at Bisnow's Data Center Investment Conference and Expo, Sept. 28, in Chicago.