Greening the Web with Efficient Data Centers

(3BL Media/Justmeans) - Mostly, we think of the Internet, the cloud and the explosion of mobile technology as things that save us energy: letting our fingers do the walking, avoiding unnecessary trips, being more precise when we do go looking for something, and buying online. And for the most part, that is the case. Those UPS trucks running up and down your street may be bigger than anything you’re driving (I would hope), but they run a specific route that passes near your house anyway, which makes them more efficient than your going to the store is likely to be. Still, that is not to say all that convenience comes without a cost.

In 2013, US data centers used 91 billion kWh of electricity. That’s enough to power New York City twice over, and the number is expected to grow by half again by 2020. It has become enough of a concern that electricity providers have warned they might not be able to keep up with demand, prompting some data center operators to seek their own dedicated power sources; some, including Apple, Microsoft and Google, have turned to solar or wind power.

The Internet has been estimated to contribute around 2% of the American carbon footprint. Roughly half of that goes to running the servers, and the other half to keeping them cool.

There are substantial opportunities to improve efficiency, many of which have been incorporated into the largest cloud server farms that are going up now. That makes sense for these huge farms, because for them, energy is going to be a considerable expense. However, these represent only about 5% of server energy use. Literally millions of smaller servers have been put up without much thought about energy, and that is where most of the problem is.

Some data center service providers have come up with innovative architectures that use natural cooling, like Schneider Electric’s EcoBreeze, which has been estimated to reduce cooling costs by a third. Yahoo developed its own system, optimized for cooling and modeled on a chicken coop. The company also signed an agreement last year to get most of its power from wind.

Using renewables for power and developing innovative cooling techniques do help, but they are, in a sense, a bit like closing the barn door after the horse has escaped. The root cause is the server itself: a more efficient server uses less electricity and runs cooler. But even that doesn’t tell the whole story.

This series of articles, which I worked with Microsoft to put together, describes many opportunities to save energy at every step of a system’s design, starting at the silicon level and moving through the operating system, the applications, the hardware, and the server management infrastructure, up to the building itself. System design plays a huge role in ensuring that the many elements are neither underused nor overused.

Given all this diversity of approaches, how is it possible to objectively evaluate a data center for efficiency? Fortunately, the Open Data Center Alliance (ODCA) has provided a standard approach for measuring the carbon footprint of services provided from the cloud. When the group released its model, it made the following recommendations for enterprises:

  • Select a “green” data center operator that sources power from verified low-carbon energy sources.
  • Choose efficient data centers and cloud providers that work together to lower PUE (power usage effectiveness) values and continuously move toward enhanced energy efficiency.
  • Require real-time carbon data from utilities (rather than annualized information received after the fact), as well as information on the sources of energy and how it is produced, to provide a more accurate picture to purchasers and consumers of the carbon footprint of cloud services.
  • Establish standards that provide a meaningful comparison of service providers. This approach includes consistently defining the capacity and workloads being purchased and how long these workloads are using the cloud service.
  • Minimize dedicated hardware use. Use shared servers and storage wherever possible, and consolidate applications in a finely tuned virtualized environment for the most efficient server use.
  • Develop “tight” code and minimize storage requirements. Reducing compute cycles and lowering storage needs minimizes energy use by as much as 40 percent, according to a study conducted by HP and the Rocky Mountain Institute.
  • Establish optimized business metrics, such as eBay’s Digital Service Efficiency methodology.
  • Select equipment that is more efficient, giving more performance for the power consumed.
  • Manage equipment more efficiently; for example, switching it off when it is not being actively used (from “always on” to “always available”).
  • Understand that accounting for embedded carbon is also necessary to obtain a complete picture of the carbon footprint of an enterprise.
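A couple of these recommendations refer to PUE, the industry’s standard efficiency ratio: total facility energy divided by the energy that actually reaches the IT equipment. As a rough illustration (the function and figures below are mine, not the ODCA’s), the observation earlier in this article that about half of the energy runs the servers and half keeps them cool implies a PUE of about 2:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy.

    1.0 is the theoretical ideal (every watt reaches the servers);
    older facilities commonly run near 2.0.
    """
    return total_facility_kwh / it_equipment_kwh

# Half the energy for servers, half for cooling: 2 units in,
# 1 unit of useful IT work out.
print(pue(2.0, 1.0))  # -> 2.0

# A well-optimized hyperscale facility with only 10% overhead
# (illustrative number) would report a PUE near 1.1.
print(round(pue(1.1, 1.0), 2))
```

The lower the PUE, the less of a facility’s electricity is spent on cooling and other overhead rather than computing, which is why the ODCA asks providers to report and reduce it.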

Image credit: David Hoherd: Flickr Creative Commons