America’s data centers are getting a lot more efficient
The Berkeley Lab’s previous analysis, presented to Congress in 2008, found that data center energy usage was quadrupling every decade, an unsurprising figure given the sector’s explosive overall growth. Data centers in the U.S. consumed 70 billion kilowatt-hours in 2014, the researchers estimated.
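To put that earlier trend in perspective, quadrupling every decade works out to roughly 15% compound growth per year. A quick back-of-the-envelope check (the arithmetic here is ours, not the report’s, and the hypothetical 2024 figure simply projects the old trend forward):

```python
# Back-of-the-envelope check: what annual growth rate does
# "quadrupling every decade" imply? Illustrative arithmetic only;
# the 2008 trend and the 2014 estimate come from the report, the math is ours.

decade_factor = 4.0
annual_growth = decade_factor ** (1 / 10) - 1
print(f"Implied annual growth: {annual_growth:.1%}")  # ~14.9% per year

# If that trend had continued from the estimated 70 billion kWh in 2014:
usage_2014 = 70.0  # billion kWh
usage_2024 = usage_2014 * decade_factor  # hypothetical, assumes the old trend held
print(f"Hypothetical 2024 usage at the old trend: {usage_2024:.0f} billion kWh")
```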
Despite continued growth in server and storage sales (the overall server install base is projected to grow 40% between 2010 and 2020), electricity use has essentially plateaued in recent years, according to the latest analysis. The efficiency gains are driven chiefly by the large data centers operated by companies such as Google, Facebook and Apple, rather than by smaller facilities.
The Berkeley Lab authors cite three central reasons for these efficiency gains. First, cooling systems have become far more efficient and selective, replacing the cruder approach of running vast air conditioning systems to keep entire data centers cold.
“[A]dvanced cooling strategies, such as hot aisle isolation, economizers, and liquid cooling … all make the cooling process far less energy intensive,” said Arman Shehabi, one of the main authors of the report.
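A common yardstick for cooling overhead is power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by the IT equipment itself. The report excerpt doesn’t cite specific PUE figures, so the values below are illustrative assumptions, but a rough sketch shows why better cooling matters so much:

```python
# Sketch: how cooling efficiency shows up in total facility energy, using
# power usage effectiveness (PUE = total facility energy / IT energy).
# The PUE values below are illustrative assumptions, not figures from the report.

def facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Total energy drawn by the facility for a given IT load."""
    return it_energy_kwh * pue

it_load = 1_000_000  # hypothetical 1 GWh of IT load
legacy = facility_energy_kwh(it_load, pue=2.0)  # room-level air conditioning
modern = facility_energy_kwh(it_load, pue=1.1)  # hot aisle isolation, economizers, liquid cooling
saved = legacy - modern
print(f"Overhead saved: {saved:,.0f} kWh ({saved / legacy:.0%} of the legacy total)")
```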
Servers designed to sharply reduce their energy consumption during off-peak periods are another key reason for the efficiency gains, as is the increasing prevalence of cloud computing and virtualization.
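The logic behind those server-side gains can be sketched with a standard linear power model, in which a machine’s draw scales between its idle and peak power with utilization. The wattages and the 20% average utilization below are illustrative assumptions, not figures from the report:

```python
# Sketch of why low off-peak power matters, using a simple linear server
# power model: P(u) = P_idle + (P_peak - P_idle) * u.
# All wattages and the utilization figure are illustrative assumptions.

def power_watts(utilization: float, p_idle: float, p_peak: float) -> float:
    """Average power draw at a given utilization, linear between idle and peak."""
    return p_idle + (p_peak - p_idle) * utilization

avg_util = 0.20          # servers often sit far below peak load
hours_per_year = 24 * 365

old_server = power_watts(avg_util, p_idle=200, p_peak=300)  # idles near peak power
new_server = power_watts(avg_util, p_idle=60, p_peak=300)   # scales down sharply off-peak

for name, watts in [("older server", old_server), ("newer server", new_server)]:
    print(f"{name}: {watts:.0f} W average, {watts * hours_per_year / 1000:.0f} kWh/year")
```

Virtualization helps for the same reason: consolidating workloads onto fewer physical machines pushes each one toward the higher-utilization, more efficient end of that curve.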