AI requires energy and water. While firms are working on reducing their carbon footprint, users, too, need to be discerning in how they use AI.
You sit in front of your computer and start posing questions to ChatGPT—a large language model-based AI (Artificial Intelligence) chatbot developed by OpenAI—about a work project. By the time you get your answers, and some solutions, your prompts and questions may have used more energy than a Google search query.
A few months ago, the Reddit group r/aipromptprogramming posed an interesting question to ChatGPT: How much energy does a single GPT query consume? ChatGPT’s own estimate put the energy consumption of a Google search query at 0.0003 kWh (1.08 kJ), and that of a ChatGPT-4 query at 0.001-0.01 kWh (3.6-36 kJ), depending on the model size and number of tokens processed.
That means a single GPT query can consume roughly 3 to 33 times as much energy as a Google search query, or around 15 times more on average. To put it in context, a 60W incandescent light bulb consumes 0.06 kWh in an hour.
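These figures are rough estimates, but the arithmetic behind the comparison is easy to check. A minimal back-of-envelope sketch in Python, using only the numbers quoted above:

```python
# Back-of-envelope check of the figures quoted above (all values are
# the article's estimates, not measurements).
GOOGLE_SEARCH_KWH = 0.0003      # ~1.08 kJ per Google search query
GPT_QUERY_KWH = (0.001, 0.01)   # ~3.6-36 kJ per ChatGPT-4 query
BULB_KWH_PER_HOUR = 0.06        # a 60W incandescent bulb lit for one hour

low, high = (x / GOOGLE_SEARCH_KWH for x in GPT_QUERY_KWH)
print(f"A GPT query uses ~{low:.0f}x to ~{high:.0f}x a Google search")

# At the high end, one GPT query equals about ten minutes of bulb time.
minutes = GPT_QUERY_KWH[1] / BULB_KWH_PER_HOUR * 60
print(f"One high-end GPT query = {minutes:.0f} minutes of a 60W bulb")
```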
The research backs this up: AI’s energy footprint is growing as more people use it, raising questions about its environmental impact.
The last two years have seen extensive AI adoption. OpenAI’s conversational ChatGPT chatbot set the ball rolling; now Google (Alphabet) and Microsoft have their own versions of chatbots, Bard and Bing Chat, respectively. According to a Reuters report, ChatGPT alone had more than 100 million monthly active users at the beginning of 2023. From creating AI-generated images of the “Balenciaga Pope” to Indianising tech billionaires, digital artists are using tools like Midjourney to push the boundaries of their imagination every day.
In a recent paper in the journal Joule, which looks at research, analysis and ideas on more sustainable energy, Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, Netherlands, and the founder of Digiconomist, a research company, said that in a few years, powering AI could use as much electricity as a small country.
While it’s complex to calculate the exact environmental impact of, say, ChatGPT, there’s enough evidence pointing to an urgent need for sustainability.
Think water, electricity
Using AI requires not just electricity but water as well.
Let’s take a few steps back. AI chatbots or services like ChatGPT are designed to replicate human intelligence with the help of algorithms and deep learning. At the foundation of all this are large language models, or LLMs, which crunch huge amounts of information and data.
OpenAI’s large language models, including the models that power ChatGPT, are developed, or trained, with three primary sources of information: information publicly available on the internet, information licensed from third parties, and information provided by users or human trainers.
This happens in physical data centres, where thousands upon thousands of computers process the data, consuming huge amounts of electricity.
In his paper, De Vries says that in 2021, Google’s total electricity consumption was 18.3 TWh (terawatt-hour, a unit of energy), with AI accounting for 10-15% of the total: “The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year), which is a significant increase compared to its historical AI-related energy consumption.”
According to estimates from the International Energy Agency, an autonomous intergovernmental organisation, data centres around the world consumed 220-330 TWh of electricity in 2021. In 2022, this figure was 240-340 TWh, or around 1-1.3% of global electricity demand. Most data centres still rely on grid electricity, sourced from fossil fuels, which contributes to greenhouse gas (GHG) emissions and rising global temperatures. Some estimates say data centres account for anywhere between 1-3% of energy-related GHG emissions globally.
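Taken together, these figures can be cross-checked with simple arithmetic. A sketch using only the numbers quoted above (the implied global-demand figure is derived from them, not stated separately):

```python
# Sanity-checking the electricity figures quoted above.
google_total_twh = 18.3        # Google's 2021 electricity consumption
ai_share = (0.10, 0.15)        # AI's estimated share of that total
ireland_twh = 29.3             # Ireland's annual consumption, per De Vries

ai_low, ai_high = (google_total_twh * s for s in ai_share)
print(f"Google's AI in 2021: ~{ai_low:.1f}-{ai_high:.1f} TWh "
      f"(vs {ireland_twh} TWh for Ireland in the worst case)")

# Data centres at 240-340 TWh and 1-1.3% of demand imply a global
# electricity demand of roughly 24,000-26,000 TWh.
dc_twh, share = (240, 340), (0.01, 0.013)
print(f"Implied global demand: ~{dc_twh[0]/share[0]:,.0f} to "
      f"{dc_twh[1]/share[1]:,.0f} TWh")
```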
Sajjad Moazeni, an assistant professor of electrical and computer engineering at the University of Washington (UW), US, who studies networking for AI and machine learning supercomputing, explained how much energy large data centres use in a recent interview with the UW office of news and information. “In terms of training a large language model, each processing unit can consume over 400 watts of power while operating. Typically, you need to consume a similar amount of power for cooling and power management as well,” Moazeni said. “Overall, this can lead to up to 10 gigawatt-hour (GWh) power consumption to train a single large language model like ChatGPT-3. This is on average roughly equivalent to the yearly electricity consumption of over 1,000 U.S. households.”
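Moazeni’s household comparison is straightforward to verify. A back-of-envelope sketch (the figure of roughly 10,000 kWh a year per household is an assumption close to the US average, not from the interview):

```python
# Converting Moazeni's 10 GWh training estimate into household-years.
TRAINING_GWH = 10
KWH_PER_GWH = 1_000_000
HOUSEHOLD_KWH_PER_YEAR = 10_000  # assumed round figure near the US average

households = TRAINING_GWH * KWH_PER_GWH / HOUSEHOLD_KWH_PER_YEAR
print(f"One training run = {households:,.0f} US household-years of electricity")

# Each processing unit draws ~400 W, with roughly as much again for
# cooling and power management: ~800 W per unit in total.
print(f"Effective draw per processing unit: ~{400 * 2} W")
```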
The Google Cloud data centre ahead of its ceremonial opening in Hanau, Germany, in October 2023. Microsoft, Google and OpenAI train AI models on thousands of chips inside servers in massive data centres across the globe. (Bloomberg)
Like our laptops and PCs, data centres generate heat. While some rely on air cooling, many require huge amounts of water to keep temperatures down.
In 2021, Google’s global data centres consumed approximately 4.3 billion gallons of water. But, as an official blog post explains, water-cooled data centres use about 10% less energy and thus produce roughly 10% fewer carbon emissions than air-cooled ones. As a result, water cooling helped Google reduce the energy-related carbon footprint of its data centres by roughly 300,000 tons of CO2 in 2021, making it what Google describes as “a climate-conscious approach to data centre cooling”.
In its 2022 Environmental Sustainability Report, Microsoft said its global water consumption went from 4.7 million cubic metres in 2021 to 6.4 million cubic metres in 2022. That’s nearly 1.7 billion gallons, or more than 2,500 Olympic-sized swimming pools. Outside researchers tied this increase to its AI research, an Associated Press report said.
“It’s fair to say the majority of the growth is due to AI,” including “its heavy investment in generative AI and partnership with OpenAI,” Shaolei Ren, a researcher at the University of California, Riverside, who has been trying to calculate the environmental impact of generative AI products such as ChatGPT, told AP. In a paper due to be published later this year, Ren’s team estimates that ChatGPT uses around 0.5 litres of water every time you ask it a series of 5-50 prompts or questions, the report said.
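Both water figures convert with standard factors. A quick sketch (the per-prompt range is derived here from Ren’s 0.5-litre estimate; it isn’t stated separately in the report):

```python
# Converting the water figures quoted above.
GALLONS_PER_M3 = 264.17      # US gallons per cubic metre
OLYMPIC_POOL_M3 = 2_500      # nominal Olympic swimming pool volume

microsoft_m3 = 6_400_000     # Microsoft's 2022 global water consumption
print(f"{microsoft_m3 * GALLONS_PER_M3 / 1e9:.1f} billion gallons, "
      f"or {microsoft_m3 / OLYMPIC_POOL_M3:,.0f} Olympic pools")

# Ren's estimate of ~0.5 litres per series of 5-50 prompts implies
# roughly 10-100 ml of water per individual prompt.
litres, prompts = 0.5, (5, 50)
print(f"~{litres/prompts[1]*1000:.0f}-{litres/prompts[0]*1000:.0f} ml per prompt")
```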
A rendering of the analogue IBM chip that promises greener AI. (IBM)
Can AI become greener?
Some big tech companies are working towards solutions.
In August, IBM announced it had created a new chip that emulates the human brain and the way our neural networks work, promising greener AI. According to a paper in the journal Nature Electronics, the analogue chip can handle natural-language AI tasks with an estimated 14 times greater energy efficiency than conventional chips.
Similarly, researchers at Northwestern University recently announced they had developed a nanoelectronic device that could potentially make AI 100-fold more energy efficient. The device, which could be incorporated into wearables, can crunch large amounts of data and perform AI tasks in real time without relying on the cloud, while using less energy than current technologies. “With its tiny footprint, ultra-low power consumption and lack of lag time to receive analyses, the device is ideal for direct incorporation into wearable electronics (like smart watches and fitness trackers) for real-time data processing and near-instant diagnostics,” a news release from the university explains.
Google is aiming to run its data centres on carbon-free energy by 2030. Microsoft, for its part, aims to be “water positive” by 2030, replenishing more water than it consumes across its global operations in water-stressed regions.
Big names like Hewlett Packard Enterprise and Amazon have also entered the AI cloud market. Research has shown that computing in the cloud is more energy-efficient than running workloads on-premises. A 2021 report by 451 Research, part of S&P Global Market Intelligence, found cloud computing was five times more energy-efficient than on-premises data centres in the Asia-Pacific region.
On-device AI, such as that on Google’s Pixel 8 and Pixel 8 Pro smartphones, is expected to be a key turning point: tasks processed locally on the phone don’t need a round trip to an energy-hungry data centre.
What can you do at an individual level? Use AI services more sensibly, say experts. In a recent article for the Harvard Business Review on how to make generative AI greener, Ajay Kumar, an associate professor of information systems and business analytics at the EMLYON Business School, France, and the American writer and academic Thomas H. Davenport asked users to be discerning in their use of generative AI. Machine learning can help predict disasters and be a great tool in fields like medicine, they write. “These are useful applications, but tools just for generating blog posts or creating amusing stories may not be the best use for these computation-heavy tools. They may be depleting the earth’s health more than they are helping its people.”