By Kiran Gupta
In today's digital era, data centers have evolved into the lifelines of our interconnected world. They safeguard the mountains of data generated daily, from routine emails and social media updates to financial transactions and medical records. Nonetheless, these digital fortresses carry substantial environmental footprints, especially concerning their energy consumption and the resulting stress on our power grids.
To grasp the power consumption scale in data centers, we need to understand their functionalities. While all data centers fundamentally collect, process, and store data, their specific functions can vary. Some primarily store data using storage infrastructure, while others offer cloud computing services like AI and machine learning applications. Others deal exclusively with network traffic. Irrespective of their operations, all data centers house server racks, each brimming with high-powered computers requiring efficient cooling systems.
The International Energy Agency estimates that data centers worldwide consumed approximately 205 million megawatt-hours (MWh) of electricity in 2018, roughly 1% of global electricity consumption. Unsurprisingly, as demand for digital services soars and servers pack in ever more processing power, that figure will keep climbing. For context, the average US household draws about 1.2 kW of power on a continuous basis. A decade ago, the power density of a standard server rack ranged from 5 kW to 13 kW. A single rack built around NVIDIA's HGX A100 GPUs, commonly used for artificial intelligence, can draw over 160 kW. That means a rack of servers occupying roughly the floor space of a household refrigerator can use as much power as about 130 US households. A large data center can house over 12,000 racks. All these servers generate heat, so data centers must be cooled, and cooling can account for as much as 40% of a facility's total power consumption.
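To put those figures in perspective, here is a quick back-of-the-envelope check in Python. The inputs (1.2 kW per household, 160 kW per AI rack, cooling at roughly 40% of total facility power) come from the estimates above; everything else is simple arithmetic, and the "40% of total" reading is taken literally.

```python
# Sanity-checking the figures above. Inputs come from the article's own
# estimates; the calculations are straightforward arithmetic.

HOUSEHOLD_AVG_KW = 1.2   # average continuous draw of a US household
AI_RACK_KW = 160.0       # high-density rack of NVIDIA HGX A100 GPUs
COOLING_SHARE = 0.40     # cooling's share of total facility power

# How many households one AI-class rack is equivalent to.
households = AI_RACK_KW / HOUSEHOLD_AVG_KW
print(f"One 160 kW rack ~= {households:.0f} US households")  # ~133

# If cooling is 40% of the *total*, the IT load is the remaining 60%,
# so total facility power = IT load / 0.60.
total_kw = AI_RACK_KW / (1 - COOLING_SHARE)
print(f"Facility power to support that one rack: ~{total_kw:.0f} kW")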
As predicted by Moore's Law, proposed in 1965 by Intel co-founder Gordon Moore, the number of transistors on a microchip, and with it computing capacity, doubles approximately every two years. With IDC projecting that global data will grow more than five-fold, from 33 zettabytes (ZB) in 2018 to 178 ZB by 2025, the need for more power-hungry data centers is inevitable. Surging demand for AI, machine learning, and autonomous vehicles further fuels this trend.
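A short sketch makes these growth rates concrete: Moore's Law modeled as a doubling every two years, and the 33 ZB to 178 ZB projection expressed as an implied annual growth rate. Only the figures come from the text; the formulas are standard compound-growth arithmetic.

```python
# Compound growth behind the projections quoted above.

def moores_law(transistors_now: float, years: float) -> float:
    """Project transistor count, doubling every two years."""
    return transistors_now * 2 ** (years / 2)

# Implied compound annual growth rate (CAGR) of global data volume.
zb_2018, zb_2025 = 33.0, 178.0
years = 2025 - 2018
cagr = (zb_2025 / zb_2018) ** (1 / years) - 1
print(f"Implied data growth: {cagr:.1%} per year")       # ~27.2% per year
print(f"Moore's Law over the same span: {moores_law(1, years):.1f}x")  # ~11.3x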
Data centers that host servers for customers often operate under service-level agreements (SLAs) guaranteeing 99.999% or higher uptime, which allows less than an hour of downtime per decade of operation. Data centers that mine cryptocurrencies, by contrast, chase the cheapest possible power: when grid demand exceeds supply, they are often willing to switch off their miners in exchange for cheaper electricity. Electric utilities have to balance these very different demand profiles.
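That downtime budget is easy to verify. A minimal sketch, using only the 99.999% figure above:

```python
# Translating an availability SLA into an allowed-downtime budget.

HOURS_PER_YEAR = 24 * 365.25

def downtime_budget(availability: float, years: float = 1.0) -> float:
    """Allowed downtime in hours for a given availability over `years`."""
    return (1 - availability) * HOURS_PER_YEAR * years

per_year = downtime_budget(0.99999)
per_decade = downtime_budget(0.99999, years=10)
print(f"Per year:   {per_year * 60:.1f} minutes")  # ~5.3 minutes
print(f"Per decade: {per_decade:.2f} hours")       # ~0.88 hours, under an hour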
Mirroring the rise of electric vehicles (EVs), this proliferation of data centers will stress power grids, particularly those already nearing capacity, and may even necessitate the construction of new power plants to meet the escalating demand. For power companies, data centers pose a particular challenge because they need constant power. That constant draw is a poor match for intermittent renewable sources such as solar, which generate power only part of the day; continuously producing sources such as geothermal are often better suited to data centers.
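A rough illustration of that mismatch, assuming a hypothetical facility with a constant 100 MW load and a typical 25% capacity factor for utility-scale solar (both assumed values for the sake of the sketch, not figures from the article):

```python
# Why intermittent solar alone struggles to serve a constant load.

LOAD_MW = 100                 # hypothetical data center with constant demand
SOLAR_CAPACITY_FACTOR = 0.25  # assumed average output / nameplate capacity

# Nameplate solar needed just to match *average* energy demand...
nameplate_mw = LOAD_MW / SOLAR_CAPACITY_FACTOR
print(f"Solar nameplate to match average demand: {nameplate_mw:.0f} MW")

# ...and even then output drops to zero at night, so the facility still
# needs storage or firm generation (e.g., geothermal) to bridge the gap.
night_hours = 12
storage_mwh = LOAD_MW * night_hours
print(f"Storage to ride through one night: {storage_mwh:.0f} MWh")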
Data centers are increasingly becoming significant renewable energy consumers. Amazon and Microsoft led the pack as the largest corporate renewable energy buyers through Power Purchase Agreements (PPAs) in 2021. Such renewable energy purchases enable data centers to lower their carbon footprints and contribute to the shift towards a sustainable energy grid.
There are numerous strategies data centers can adopt to minimize their power consumption. Companies can build in colder climates to reduce the energy spent on cooling. Intelligent power management using AI can predict traffic surges and adjust cooling systems accordingly. Some operators simply run their facilities warmer: the US General Services Administration (GSA) estimates that data centers save about 4% of total energy for every 1°F they allow the temperature to rise (the sketch below shows how those savings stack up). Data centers can also repurpose their waste heat to warm residential and commercial buildings, a practice already implemented at Microsoft's data centers in Finland.
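Here is that GSA rule of thumb turned into a small calculator. Whether the roughly 4% savings compound per degree or add linearly isn't specified; this sketch assumes compounding, the more conservative reading at higher setpoints.

```python
# The GSA rule of thumb: ~4% of total energy saved per 1 degree F increase
# in allowed temperature. Compounding per degree is an assumption here.

def cooling_savings(degrees_f: float, rate: float = 0.04) -> float:
    """Fraction of total energy saved by raising the setpoint `degrees_f`."""
    return 1 - (1 - rate) ** degrees_f

for bump in (1, 3, 5):
    print(f"Raise setpoint {bump} F -> ~{cooling_savings(bump):.1%} saved")
# Raise setpoint 1 F -> ~4.0% saved
# Raise setpoint 3 F -> ~11.5% saved
# Raise setpoint 5 F -> ~18.5% saved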
In summary, the surge in computing and the escalating demand for digital services have substantially increased the global data center footprint. Despite the heavy electricity consumption and strain on our power grids, the industry is making headway in mitigating its environmental impact. It is achieving this through improved operational efficiency and a transition to renewable energy sources. As our reliance on digital services persists, it remains essential to continue advocating for sustainable practices within the data center industry.