Will Government Policy or Corporate Innovation Solve the Data Center Energy Vampire Problem?
AI Energy Vampires
Data centers operate 24/7, consuming vast amounts of electricity to power servers and maintain optimal temperatures through cooling systems. A recent Goldman Sachs report predicts data center power demands will grow by 160 percent by the end of the decade. A single AI-integrated search query, for example, is estimated to require up to 10 times the energy of a standard Google search—equivalent to keeping one light bulb on for 20 minutes.
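For a rough sense of scale, here is a minimal back-of-the-envelope sketch of that comparison. The per-search energy figure and the bulb wattage are illustrative assumptions, not numbers from the report:

```python
# Back-of-the-envelope comparison of an AI-integrated query with a standard search.
# All constants are illustrative assumptions, not figures from the cited report.
STANDARD_SEARCH_WH = 0.3      # assumed energy per standard search, in watt-hours
AI_QUERY_MULTIPLIER = 10      # "up to 10 times" a standard search
LED_BULB_WATTS = 9            # assumed wattage of a small LED bulb

ai_query_wh = STANDARD_SEARCH_WH * AI_QUERY_MULTIPLIER   # ~3 Wh per AI query
bulb_minutes = ai_query_wh / LED_BULB_WATTS * 60         # minutes of bulb runtime

print(f"One AI-integrated query: ~{ai_query_wh:.1f} Wh, "
      f"roughly {bulb_minutes:.0f} minutes of a {LED_BULB_WATTS} W bulb")
```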
The impact of AI-driven energy consumption will likely peak around 2028, accounting for over 60% of the overall energy consumption of data centers across the world. Due to their heavy reliance on brown (fossil fuel-based) energy, data centers contribute significantly to greenhouse gas emissions and carry high carbon footprints; they are projected to contribute up to 5.5% of the world's carbon emissions by 2025.
So why does AI take so much energy? The training phase is one of the most resource-intensive steps. Large models are trained on GPUs, which require vast amounts of energy and significant water resources for cooling. Even after training, models often go through fine-tuning for specific tasks, further increasing energy consumption. Training GPT-3 generated as much CO2eq as 205 round trips between Paris and New York.
Meta and many other tech companies continue to face energy crunches thanks to their recent AI investments. Earlier this year, Microsoft confirmed its greenhouse gas emissions had risen an estimated 29 percent since 2020 due to new data centers specifically “designed and optimized to support AI workloads.” Google has likewise calculated that its own emissions have increased by as much as 48 percent since 2019, largely because of data center energy needs.
Tech companies are also scrambling to reduce their footprint. Google has claimed carbon neutrality since 2007 and aims to run all its data centers on carbon-free energy by 2030. Apple claims to have achieved carbon neutrality for its global corporate operations and is working towards making its entire supply chain and product lifecycle carbon neutral by 2030. Microsoft has committed to becoming carbon negative by 2030 and aims to remove all the carbon it has emitted since its founding by 2050. With a transition to renewables, improved energy efficiency through the use of AI, and software improvements, these companies could reduce total carbon emissions by 48%.
How to Improve Data Center Sustainability
Government Policies
Government policies can encourage markets to move towards reduced emissions.
A sustainable future requires systems-level change, strong government policies, and new technologies. - Google
European Union
Under Article 12 of the revised Energy Efficiency Directive of 2023, EU data centers with a capacity of over 500 kW must disclose their energy performance and sustainability metrics. Operators were required to submit key performance indicators to a European database by 15 September 2024, and subsequently by 15 May each year. The database will be publicly available and updated annually, reflecting data center energy use and energy efficiency at the EU and individual member state levels.
The EU’s Joint Research Centre has also developed a voluntary EU Code of Conduct for Data Centres, which compiles a set of best practices, encourages data centers to adopt them, and issues awards to data centers that effectively use those methods to reduce energy consumption.
In 2020, the EU published a Green Public Procurement document on data centers, which outlines a voluntary set of criteria for environmental sustainability over the life cycle of a product.
Germany’s Energy Efficiency Act mandates the reuse of energy, including waste heat, on an increasing scale depending on when the data center opens—10% of energy for data centers that open on or after July 1, 2026, 15% for those that open on or after July 1, 2027, and 20% for those that open on or after July 1, 2028. All data centers were also required to cover 50% of their energy needs with unsubsidized electricity from renewable energy as of Jan. 1, 2024. That goes up to 100% after Jan. 1, 2027.
France has offered a reduced energy tax rate for data centers. To obtain the reduced rate, data centers must follow good management practices, including eco-design, energy efficiency optimization, energy consumption monitoring, and use of high-performance cooling technologies.
China
China has the most data centers in Asia, with a total of 449. It is in the midst of its ambitious “East Data, West Computing” project that aims to improve the interconnection and efficiency of China’s computing systems, including by building 10 national data center clusters. China hopes to support these with over 80% green electricity sources by 2025.
The government has launched the Special Action Plan for Green and Low-carbon Development of Data Centers to ensure sustainability and improve energy efficiency in data centers. It aims to optimize the layout of data centers and reduce the average Power Usage Effectiveness (PUE) to 1.5 or lower.
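PUE is the ratio of a facility's total energy use to the energy delivered to IT equipment, so a target of 1.5 means cooling, power distribution, and other overhead may consume at most half as much energy as the IT load itself. A minimal sketch of the calculation, with hypothetical sample figures:

```python
def power_usage_effectiveness(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy; 1.0 is the theoretical ideal."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual figures for a single facility.
total_kwh = 45_000_000   # IT load plus cooling, power distribution, lighting, etc.
it_kwh = 30_000_000      # energy delivered to servers, storage, and network gear

pue = power_usage_effectiveness(total_kwh, it_kwh)
print(f"PUE = {pue:.2f}")                                 # 1.50, right at the target
print("meets target" if pue <= 1.5 else "exceeds target")
```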
Netherlands
Starting January 1, 2024, the Netherlands imposed temporary restrictions on the construction of new data centers. Hyperscale data centers are prohibited nationwide, with exceptions at select locations.
Singapore
Singapore put a moratorium on new data center development in 2019. But it began relaxing this restriction in 2022 by allowing parties to propose projects that emphasize efficiency, sustainability, economic growth, and international connectivity.
Now, the country aims to expand the capabilities of its existing data centers by increasing the energy available to them through sustainable power production and greater power efficiency.
To do this, Singapore’s Green Mark sustainability certification will raise its standard for energy efficiency by the end of the year, and introduce standards for IT equipment-specific energy efficiency and liquid cooling by next year.
Singapore's Infocomm Media Development Authority (IMDA) has introduced new standards under its Digital Sustainability Blueprint to gradually raise data center operating temperatures. This change could save 2-5% in cooling costs for each degree increase in temperature.
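As a rough illustration of how those per-degree savings add up, here is a small sketch that assumes the quoted 2-5% saving applies to cooling spend and compounds with each degree; the baseline cost is hypothetical:

```python
# Rough illustration of cumulative cooling-cost savings from a higher setpoint.
# Assumes the cited 2-5% saving applies per degree Celsius and compounds.
def cooling_cost_after_raise(baseline_cost: float, degrees: int, saving_per_degree: float) -> float:
    return baseline_cost * (1 - saving_per_degree) ** degrees

annual_cooling_cost = 1_000_000.0          # hypothetical annual cooling spend
for saving in (0.02, 0.05):                # the 2-5% per-degree range cited by IMDA
    new_cost = cooling_cost_after_raise(annual_cooling_cost, degrees=2, saving_per_degree=saving)
    print(f"{saving:.0%} per degree, +2 degrees: saves {annual_cooling_cost - new_cost:,.0f} per year")
```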
United States
The U.S. is further behind in setting policies specifically focused on data centers. President Joe Biden’s administration is considering using the Defense Production Act to fast-track the construction of artificial intelligence data centers.
The Department of Energy published “Clean Energy Resources to Meet Data Center Electricity Demand,” which proposes a number of tools to reduce energy use, including:
Deploying clean generation and storage technologies
Leveraging existing nuclear and hydropower infrastructure
Redeveloping retired coal power plant sites
Enhancing and expanding grid infrastructure
Maximizing energy efficiency and demand resources
Proactive planning
Innovative tariff structures
Optimizing grid performance
Adopting alternative financing structures to fund new energy projects
Supply chain and workforce development
Proposed legislation in New York, “The New York State Sustainable Data Centers Act,” would require data center operators to power their facilities with amounts of renewable energy that align with the state’s climate goals. It would require data center operators to submit annual reports on their water and energy use, as well as their sustainability efforts. It would also impose limitations on the construction of new data center sites: operators would be required to report their projected water and energy use and hold at least two public hearings before construction begins. Under the proposed law, data center operators would also be responsible for subsidizing additional energy costs in the center’s surrounding community.
In Atlanta, Georgia, local officials have proposed a ban on new data center construction within city limits. The proposal arises from concerns over the impact on local infrastructure, power consumption, and the relatively low economic benefits these facilities provide compared to other types of commercial development.
Carbon Offsets
Carbon offsets have become a popular tool for big-tech companies aiming to achieve net-zero targets, but relying heavily on offsets can be problematic. Carbon offsetting involves compensating for emissions by investing in projects that reduce or remove carbon dioxide from the atmosphere. These projects can include reforestation, renewable energy development, and methane capture from landfills. The effectiveness of carbon offsets can vary widely, and there is often a lack of transparency and accountability in the offset market. Ensuring that offsets represent real, permanent, and additional carbon reductions is crucial, but verifying these qualities can be difficult. Furthermore, offsets do not address the root cause of emissions and can sometimes serve as a distraction from more direct mitigation efforts.
Carbon credits can also become expensive. A recent study estimates that Amazon’s carbon emissions from 2030 to 2050 would be 56% higher than the numbers reported in its sustainability reports if carbon credits were removed from its actual carbon emissions. Amazon would need to invest more than $6 billion to offset its carbon emissions in 2030 alone. Similar trends hold true for other big-tech companies.
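The adjustment behind that kind of estimate is simple: add the offsets credited against reported emissions back on top of them. A minimal sketch, with placeholder figures rather than the study's data:

```python
# Illustrative only: add retired offsets back onto reported net emissions.
# The figures are placeholders, not data from the cited study.
def gross_emissions(reported_net_mtco2e: float, offsets_retired_mtco2e: float) -> float:
    """Gross emissions = reported net emissions + offsets credited against them."""
    return reported_net_mtco2e + offsets_retired_mtco2e

reported_net = 50.0    # hypothetical reported net emissions (Mt CO2e)
offsets = 28.0         # hypothetical offsets retired in the same period (Mt CO2e)

gross = gross_emissions(reported_net, offsets)
print(f"Gross emissions are {gross / reported_net - 1:.0%} higher than reported")
```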
Energy-Efficient Hardware and Software
Implementing energy-efficient components and optimizing software can reduce the overall power consumption of data centers, leading to lower energy costs and a smaller carbon footprint. In fact, 93% of the energy used in data centers supports equipment operation and cooling.
Recent trends show that AI-based applications, such as training and inference, utilize only around 50% of the available GPU resources due to bandwidth and communication bottlenecks. These bottlenecks lead to less energy-efficient operation and significantly increase the carbon cost of running AI workloads. With improved software, infrastructure, and management techniques, researchers estimate that GPU utilization could reach up to 90%, rapidly increasing efficiency.
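One way to see this in practice is to sample GPU utilization and power draw during a training run. The sketch below uses NVIDIA's NVML bindings and assumes the pynvml package and an NVIDIA GPU are available:

```python
# Minimal GPU utilization and power sampler using NVIDIA's NVML bindings.
# Assumes an NVIDIA GPU and the pynvml package are installed.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)            # first GPU on the machine

samples = []
for _ in range(10):                                      # sample once per second for ~10 s
    util_pct = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu   # percent of time busy
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000       # milliwatts -> watts
    samples.append((util_pct, power_w))
    time.sleep(1)

avg_util = sum(u for u, _ in samples) / len(samples)
avg_power = sum(p for _, p in samples) / len(samples)
print(f"Average utilization: {avg_util:.0f}%, average draw: {avg_power:.0f} W")
pynvml.nvmlShutdown()
```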
Artificial intelligence is widely viewed as a powerful tool for optimizing resource use across industries. By analyzing large datasets and identifying patterns, AI can help reduce waste, improve efficiency, and minimize the environmental footprint of production processes. Ironically enough, machine learning can be used to reduce energy consumption by an average of 11.17% daily, with peak savings of up to 15.56% during off-peak hours. So, maybe the question is whether AI can offset its own energy use. Hey ChatGPT…
Quantum computing, although still in its early stages, holds potential for transforming telecom data centers. Interest in quantum computing companies has risen recently after Google announced progress on the technology. Quantum computers can process massive amounts of data with significantly lower energy requirements than classical computers, reducing both power consumption and cooling needs. As quantum computing technology matures, data centers could leverage these systems for complex data processing tasks, further enhancing energy efficiency.
Another critical component of sustainable data centers is the strategic shift toward edge computing and distributed data center models. By processing data closer to where it is generated, edge computing reduces the need for large, centralized data centers that consume vast amounts of energy. Instead, smaller, localized data centers handle tasks, reducing latency and optimizing energy use. The distributed data center model also plays a role in reducing energy consumption. Unlike traditional models that rely on a few large facilities, distributed data centers operate through multiple smaller locations, each optimized for energy efficiency.
Virtualization technology has also become a key factor in reducing the environmental impact of data centers. By virtualizing physical servers, data centers can run multiple applications on a single physical machine. This allows for better resource utilization, as fewer physical servers are needed to support a growing number of applications. The reduction in hardware leads to less energy consumption, fewer cooling requirements, and ultimately, a smaller carbon footprint.
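A rough sketch of the consolidation math, with entirely hypothetical figures: if each virtualized host can safely carry several workloads that previously ran on dedicated servers, the fleet's power draw falls even though the remaining hosts run hotter.

```python
# Hypothetical illustration of server consolidation through virtualization.
workloads = 400            # applications previously running on dedicated servers
vms_per_host = 10          # assumed safe consolidation ratio
watts_per_server = 200     # assumed average draw of a lightly loaded dedicated server
host_load_factor = 1.5     # assumed extra draw of a consolidated, busier host

before_kw = workloads * watts_per_server / 1000
hosts_needed = -(-workloads // vms_per_host)               # ceiling division
after_kw = hosts_needed * watts_per_server * host_load_factor / 1000

print(f"Estimated draw: {before_kw:.0f} kW on dedicated servers "
      f"vs {after_kw:.0f} kW on {hosts_needed} virtualized hosts")
```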
Renewable Energy Sources
Bronis R. de Supinski, Chief Technology Officer (CTO) for Livermore Computing at Lawrence Livermore National Laboratory (LLNL), recently noted that increased energy efficiency lets us run more and tackle bigger, more complex problems, which usually increases overall energy use. The energy source, he argued, is the real key to reducing the environmental impact of computing.
Using solar or wind power can help data centers reduce reliance on grid electricity generated from fossil fuels, decreasing carbon emissions. However, the intermittent nature of renewable energy requires substantial investment in energy storage solutions, such as batteries, to ensure a consistent power supply. Additionally, the geographic distribution of renewable energy resources may not align with the locations of data centers, necessitating further infrastructure development and investment.
To achieve this, several strategies are emerging, such as the use of small modular nuclear reactors (SMRs) to offer a stable, clean power supply, especially in isolated areas. Other players are investing in energy storage systems, such as high-capacity batteries, to manage fluctuations in consumption and guarantee increased energy resilience.
AI also plays a critical role in the energy sector, where it is used to optimize the operation of power plants, manage energy storage systems, and integrate renewable energy sources into the grid - so in a way AI gives back.
A Focus on Nuclear Power
Tech companies have announced multiple plans in recent months that hinge on nuclear power (Meta’s was supposedly halted because of a bumblebee).
Microsoft currently aims to bring the infamous Three Mile Island plant back online for its AI needs, while Amazon is funneling hundreds of millions of dollars into a partnership with Pennsylvania’s nuclear plant in Susquehanna.
Meta’s director of engineering for Generative AI, Sergey Edunov, has stated that two power plants would likely be enough to power humanity’s AI needs for a year.
Google has signed a “world first” deal to buy energy from a fleet of mini nuclear reactors to generate the power needed for the rise in use of artificial intelligence. The US tech corporation has ordered six or seven small modular reactors (SMRs) from California’s Kairos Power, with the first due to be completed by 2030 and the remainder by 2035.
Amazon Web Services bought Talen Energy’s 960-megawatt data center campus in Pennsylvania powered by the adjacent nuclear plant, Susquehanna Steam Electric Station.
Improved Cooling Systems
Developing energy-efficient systems like liquid cooling or advanced air conditioning technologies can help maintain optimal temperatures in data centers while reducing energy consumption.
Cooling remains a critical factor, with increased attention being paid to systems such as direct liquid cooling (DLC), which makes it possible to improve the management of high-density servers. In addition to this, immersion cooling solutions are emerging. They can be used to dissipate even more heat by submerging the servers in non-conductive liquids to achieve even higher densities.
Water is essential in many data centers, especially for cooling purposes. Building AI models demands significant water usage for data center cooling to prevent overheating. The growing usage of AI has increased the water footprint of major tech companies. For example, water consumption at Google's data centers has increased by 17% since 2023, due to the expansion of AI products and services. Similarly, the water usage effectiveness (WUE) of Meta grew from 0.24 in 2017 to 0.30 in 2020, though it came back down to 0.20 by 2022.
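WUE is another simple ratio: liters of water consumed on site per kilowatt-hour delivered to IT equipment, so Meta's 0.20 means roughly a fifth of a liter per kWh. A minimal sketch with hypothetical figures:

```python
def water_usage_effectiveness(site_water_liters: float, it_equipment_kwh: float) -> float:
    """WUE = annual site water use (liters) / IT equipment energy (kWh)."""
    return site_water_liters / it_equipment_kwh

# Hypothetical annual figures for one facility.
water_liters = 6_000_000    # liters of water consumed on site, mostly for cooling
it_kwh = 30_000_000         # kWh delivered to IT equipment

print(f"WUE = {water_usage_effectiveness(water_liters, it_kwh):.2f} L/kWh")   # 0.20
```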
Sustainable water management practices can play a critical role in reducing the environmental impact of these facilities. One way to conserve water is through closed-loop cooling systems, which reuse the same water multiple times before needing replacement.
Data centers can also implement rainwater harvesting and other water reclamation systems to source water naturally rather than relying solely on municipal supplies. Additionally, using graywater recycling—where wastewater from sinks and showers is repurposed for cooling or irrigation—can reduce reliance on potable water.
Some data centers also incorporate advanced monitoring systems that track water usage in real time, helping to identify leaks or inefficiencies quickly.
Geographical Location
Building data centers in cooler climates or locations with access to natural cooling resources, such as cold seawater, can help reduce energy requirements. The location of a data center can substantially impact its energy efficiency. Selecting a site near renewable energy sources, like solar farms or wind turbines, allows data centers to power their facilities sustainably.
Similarly, choosing cooler climates or areas with natural resources for cooling, such as nearby bodies of water, can further reduce the energy required for temperature regulation. For example, some data centers are strategically positioned near rivers or other sources of cold water, which can be used for cooling systems without the energy intensive processes typically required.
Data Center Design
Implementing improved airflow management, insulation, and energy-efficient technologies can minimize energy consumption and environmental impact.
Modular data center designs use pre-fabricated components, allowing companies to construct centers more quickly and with significantly less waste. One of the primary benefits of modularity is its ability to reduce construction waste by minimizing the need for on-site building materials and simplifying the assembly process.
Beyond reducing waste, modular designs can optimize energy usage. With more compact footprints and streamlined cooling systems, these centers often require less energy to operate.
Selecting low-carbon building materials for data centers is another crucial step toward sustainability. Traditional building materials like steel and concrete contribute significantly to carbon emissions. In contrast, innovative alternatives like engineered wood, low-carbon concrete, and recycled steel can greatly reduce the carbon footprint associated with data center construction. Many companies are also opting for materials that can be recycled or reclaimed after their initial use, further minimizing their environmental impact.
What are the Tech Companies Doing?
Microsoft
Microsoft is building its first data centers made with superstrong ultra-lightweight wood in a bid to slash the use of steel and concrete, which are among the most significant sources of carbon emissions. The hybrid mass timber, steel and concrete construction model is estimated to significantly reduce the embodied carbon footprint of two new datacenters by 35 percent compared to conventional steel construction, and 65 percent compared to typical precast concrete. Contract language is being updated to accelerate decarbonization by including low-carbon requirements for materials and equipment used in datacenter construction. Select high-volume suppliers will be required to use 100 percent carbon-free electricity by 2030.
Microsoft has launched a groundbreaking datacenter design that consumes zero water for cooling, a significant step toward sustainability amid rising water stress. The innovative design employs chip-level cooling technology that circulates water in a closed loop, eliminating the need for evaporation-based cooling. New zero-water designs are being piloted in Phoenix, Arizona, and Mt. Pleasant, Wisconsin, with operations expected by 2026. Starting August 2024, all new datacenters will adopt this technology, with broader implementation set for late 2027.
Amazon
AWS has developed novel mechanical cooling solutions providing configurable liquid-to-chip cooling in both its new and existing data centers. Because some AWS network and storage infrastructure does not require liquid cooling, the updated cooling systems will seamlessly integrate air and liquid cooling capabilities to serve the most powerful AI chipsets, like AWS Trainium2, and rack-scale AI supercomputing solutions like NVIDIA GB200 NVL72, as well as AWS’s network switches and storage servers.
AWS also announced plans to update its data center infrastructure to support innovation in AI, power, cooling, and hardware design geared toward increasing energy efficiency and decreasing the carbon footprint of its data centers. AWS said it would upgrade its cooling system to a more efficient one, estimated to cut mechanical energy consumption by up to 46% during peak cooling conditions compared to an older design.
AWS entered a multi-year partnership with Orbital Materials to boost data center sustainability and incorporate artificial intelligence to support related changes. Orbital, which uses a proprietary AI platform to develop climate technologies and new materials, will help AWS design, produce and test data center components that will allow for carbon removal, chip cooling and improved water use.
AWS is adopting practices to reduce the embodied carbon of the concrete in its data centers’ building structures by 35%, compared to an industry average. AWS aims to achieve this by using low-carbon steel and concrete, in addition to optimizing its structural design to rely on less steel overall. In October, Amazon announced it had invested in Paebbl, which looks to store carbon dioxide in building materials.
Google
According to Google’s 2024 Environmental Report, they have taken the following actions to reduce their footprint:
Ten of their grid regions achieved at least 90% carbon-free energy, for a global average of 64% carbon-free energy.
Built and launched a first-of-a-kind enhanced geothermal project now delivering carbon-free energy to the grid in Nevada.
Implemented a Google Renewable Energy Addendum that asks their largest hardware manufacturing suppliers, based on spend, to commit to achieving a 100% renewable energy match by 2029.
Launched a new, first-of-its-kind clean energy partnership with NV Energy, a subsidiary of Berkshire Hathaway Energy, to accelerate clean power.
Partnered with BlackRock’s Climate Infrastructure business to help support the development of a 1 gigawatt (GW) pipeline of new solar capacity in Taiwan.
The internet is 24/7 – carbon-free energy should be too - Google
A Need for Standards and Transparency
While the energy efficiency of computing hardware and data center cooling solutions has increased over the years, the exponential growth in the use of data center resources, driven by AI, continues to increase overall energy demand. One of the critical issues in evaluating the carbon footprint of big tech companies is the lack of comprehensive and accurate data.
Missing data points can significantly hinder the ability to perform accurate sustainability analysis and generate reliable reports. For example, big-tech companies do not usually provide detailed information on energy consumption at different stages of their manufacturing supply chains or on emissions from third-party services such as logistics and transportation, and data on the lifecycle emissions of products is often incomplete or unavailable. This lack of data can lead to underestimations or overestimations of the true carbon footprint, making it difficult to set realistic targets and track progress effectively. Standardized reporting frameworks and methodologies, such as the Greenhouse Gas Protocol, can help ensure consistency and comparability across companies.
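The Greenhouse Gas Protocol groups emissions into Scope 1 (direct), Scope 2 (purchased energy), and Scope 3 (value chain), and consistent reporting largely comes down to filling each bucket with verifiable data. A minimal sketch of such an inventory record; the categories and values are illustrative, not any company's actual figures:

```python
from dataclasses import dataclass, field

@dataclass
class EmissionsInventory:
    """Simplified GHG Protocol-style inventory; all values in metric tons of CO2e."""
    scope1_direct: float = 0.0                # on-site fuel combustion, company vehicles
    scope2_purchased_energy: float = 0.0      # purchased electricity, heat, and cooling
    scope3_value_chain: dict[str, float] = field(default_factory=dict)  # suppliers, logistics, product use

    def total(self) -> float:
        return (self.scope1_direct + self.scope2_purchased_energy
                + sum(self.scope3_value_chain.values()))

# Illustrative example: Scope 3 is typically where the data gaps live.
inventory = EmissionsInventory(
    scope1_direct=120_000,
    scope2_purchased_energy=3_400_000,
    scope3_value_chain={"hardware manufacturing": 5_100_000, "logistics": 800_000},
)
print(f"Total reported footprint: {inventory.total():,.0f} t CO2e")
```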
A more comprehensive approach would include the carbon footprint of energy sources and the lifecycle impact of hardware manufacturing and disposal. This shift in metrics would allow us to address sustainability more holistically while still enabling growth in AI and other computing capabilities. AWS joined Meta, Google and other tech giants calling for increased transparency about the lifecycle emissions for data center infrastructure earlier this year.