
The Data Center Dilemma

Data centers are energy-intensive, and society is using ever more computing, a trend that will accelerate as AI-based operations become part of our ICT ecosystem. Global data center energy consumption is projected to roughly triple by 2030, from around 1,000 TWh today to 2,967 TWh.

This is a serious situation. First, this demand will need to be met through new power plants or greater efficiency, and it will undoubtedly strain the system. Second, carbon emissions will rise at a time when many countries have set net-zero targets, which may not be met under these circumstances. Companies that run data centers are becoming increasingly concerned about the sector's energy consumption, environmental sustainability, and running costs.


What Can Be Done To Meet This Challenge?

The energy consumption ecosystem is complex, bringing both external and internal pressures. External pressures range from increasing regulation, energy costs, and security of supply to environmental, social, and corporate governance (ESG) requirements. Internally, there is operational pressure to improve energy efficiency in the face of surging demand, as AI and High Performance Computing (HPC) place ever greater loads on servers.

The regulatory landscape is changing too. In September 2023 the EU published its Energy Efficiency Directive (EED), which requires data center operators and owners to report energy performance against numerous metrics. Separately, the EU's Corporate Sustainability Reporting Directive (CSRD) requires large companies, including data center operators, to report energy consumption, ESG performance, and carbon emissions to regulators. Social pressures will also tend to push data center operators toward sustainable and responsible business practices. Data center management will need to respond to all of these pressures.

Rising energy costs are driving the adoption of microgrids, wind turbines, energy storage, and other energy efficiency measures, such as using waste heat from server farms to warm fruit- and vegetable-growing polytunnels on real farms. New, advanced data center designs are leading the way in maximizing energy efficiency: banks of servers have traditionally been air-cooled, but newer designs are moving to liquid cooling, which promises to be more efficient. AI can also be deployed to optimize demands on the system, for example by predicting demand spikes.
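A demand-spike predictor can be as simple as comparing recent load against a longer-running baseline. The sketch below is purely illustrative (not any vendor's product), with made-up load figures and hypothetical thresholds:

```python
# Illustrative sketch: flag a likely demand spike when a short moving
# average of server load runs well above a longer baseline average.
# All names, thresholds, and data are hypothetical.

def moving_average(values, window):
    """Trailing moving average over the last `window` samples."""
    return sum(values[-window:]) / min(window, len(values))

def spike_likely(load_history, short=3, long=12, ratio=1.25):
    """Return True when recent load exceeds the baseline by `ratio`."""
    if len(load_history) < long:
        return False  # not enough history to judge
    recent = moving_average(load_history, short)
    baseline = moving_average(load_history, long)
    return recent > ratio * baseline

# Hypothetical hourly load samples (kW): quiet, then rising sharply.
history = [210, 205, 208, 212, 215, 220, 218, 222, 230, 290, 340, 395]
print(spike_likely(history))  # True: the recent surge exceeds baseline
```

A production system would use a proper forecasting model, but the principle is the same: detect (or predict) load excursions early enough to pre-cool, shift work, or draw on storage.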

As part of the energy network ecosystem, data centers can play an important role in load balancing. By participating in electricity markets, data centers could make Demand Response (DR) a key load-management strategy, providing cost-effective load-balancing services and strengthening electric grid resiliency.
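In a DR event, the grid operator requests a load reduction and the data center sheds flexible work first, leaving critical load untouched. A minimal sketch of that logic, with invented job names and figures:

```python
# Hedged illustration of demand response: meet a requested curtailment
# (kW) by deferring flexible batch jobs, largest first. Critical load
# (customer-facing serving) is never in the flexible list. All job
# names and power figures are invented for demonstration.

def respond_to_dr_event(flexible_jobs, requested_cut_kw):
    """Defer flexible (name, kW) jobs until the requested cut is met.

    Returns (deferred_job_names, achieved_cut_kw).
    """
    deferred, achieved = [], 0.0
    # Shed the largest flexible loads first to reach the target quickly.
    for name, kw in sorted(flexible_jobs, key=lambda job: -job[1]):
        if achieved >= requested_cut_kw:
            break
        deferred.append(name)
        achieved += kw
    return deferred, achieved

jobs = [("batch-analytics", 120.0), ("ml-training", 300.0), ("backups", 80.0)]
deferred, cut = respond_to_dr_event(jobs, requested_cut_kw=350.0)
print(deferred, cut)  # ['ml-training', 'batch-analytics'] 420.0
```

Real DR participation layers market signals, baselines, and settlement on top of this, but the core trade is the same: paid flexibility in exchange for deferring non-critical work.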

By aggregating data centers into Virtual Power Plants (VPPs), load balancing can function more effectively. The VPP acts as a mediator, isolating TSOs and data centers from each other: data centers are removed from direct communication with the external TSO, while the TSO, rather than monitoring each data center individually, deals with a single, controllable entity in the VPP. This does of course present a challenge, because the objective of data centers is to process data and serve their customers, whereas utilities supply and manage power, so business objectives are misaligned. However, with advanced systems, including AI, it appears that data centers can regulate their energy consumption without any loss in performance: for example, Google is using AI to shift “non-urgent” computing tasks to times when electricity demand is low, such as late at night.
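Shifting non-urgent work to low-demand hours, in the spirit of the approach described above, amounts to scheduling deferrable jobs against a price or demand forecast. A hedged sketch with invented day-ahead prices and hypothetical job names:

```python
# Sketch of price-aware scheduling: assign each deferrable job to a
# distinct hour, cheapest hours first, using a day-ahead price forecast.
# Prices (EUR/MWh) and job names are invented examples, not real data.

def schedule_deferrable(jobs, hourly_prices):
    """Map each job name to a distinct hour index, cheapest first."""
    cheap_hours = sorted(range(len(hourly_prices)),
                         key=lambda h: hourly_prices[h])
    return {job: hour for job, hour in zip(jobs, cheap_hours)}

# Hypothetical 24-hour day-ahead prices: lowest in the small hours,
# peaking in the evening.
prices = [30, 25, 22, 20, 24, 35, 60, 80, 90, 85, 70, 55,
          50, 48, 52, 58, 75, 95, 100, 88, 65, 45, 38, 32]
plan = schedule_deferrable(["index-rebuild", "model-retrain"], prices)
print(plan)  # {'index-rebuild': 3, 'model-retrain': 2}
```

Both jobs land in the cheapest overnight hours (03:00 and 02:00 here), which is exactly the load-shifting behavior a VPP can then offer to the grid as aggregated flexibility.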

Whether these efficiency and load-balancing measures will be enough to offset the growth in consumption is not yet clear, so utilities will need to watch the developing situation carefully.