The Alarming Way Big Tech Wants to Power AI Chatbots

Photo Illustration by Erin O’Flynn/The Daily Beast/Getty Images

Think about how much you use electricity each day: every time you flip a light switch, use the microwave to heat up dinner, boot up your laptop for work, or power on your TV at night to watch some Netflix before bed. It adds up quickly, and you’d probably lose track after a while. By the end of the year, though, the average U.S. household will have used roughly 10.6 MWh of electricity.

That’s a lot of energy going into our day-to-day lives. But a Big Tech company operating one of the most powerful and popular chatbots in the world uses far more energy than that. These models depend on energy-intensive hardware like GPUs, the chips that perform the computations needed to actually train and run AI. These units are incredibly powerful, but they also require a lot of power to operate.

While artificial intelligence companies like OpenAI and Google are highly secretive about how much energy their models gobble up each day, some researchers estimate that merely training ChatGPT’s underlying model, GPT-3, required 1,287 MWh of energy. That’s the equivalent of what a little more than 120 U.S. households use in an entire year, just to create the model. One researcher from the University of Washington estimated that ChatGPT queries add up to about 1,000 MWh each day—which amounts to the annual energy usage of nearly 34,000 households.
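Those household comparisons are easy to sanity-check. A minimal back-of-envelope calculation, using the roughly 10.6 MWh annual household figure cited above (the underlying estimates vary by source):

```python
# Back-of-envelope check of the household equivalences cited above.
HOUSEHOLD_MWH_PER_YEAR = 10.6  # average annual U.S. household electricity use

# Training GPT-3: an estimated 1,287 MWh, a one-time cost.
training_mwh = 1287
print(round(training_mwh / HOUSEHOLD_MWH_PER_YEAR))  # → 121 households' annual usage

# Serving ChatGPT queries: an estimated 1,000 MWh per day, ongoing.
daily_query_mwh = 1000
print(round(daily_query_mwh * 365 / HOUSEHOLD_MWH_PER_YEAR))  # → 34434 households' annual usage
```

The striking part is the second number: the one-time training cost is dwarfed by the ongoing cost of answering queries at scale.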

This energy usage doesn’t come without costs, both monetary and otherwise. In a paper published in the Journal of Machine Learning Research in June 2023, the authors estimated that training GPT-3 produced roughly 500 metric tons of carbon emissions—the equivalent of driving a million miles by car. Keeping the model running to answer every single query, every day, results in far more emissions still.

And energy consumption is arguably only a small portion of the impact.

“It’s not so much just about energy consumption,” co-author Sasha Luccioni, the climate lead at Hugging Face and founding member of Climate Change AI, told The Daily Beast. “It’s also the size of the model, how long it took to train, and what kind of hardware was used.”

Perhaps that’s why it’s not too surprising that the likes of OpenAI and Microsoft have begun to look into new ways to power their current and future AI projects. However, the precise method they’re exploring might raise a few eyebrows: nuclear power. This week, Microsoft posted a job listing for a principal program manager of nuclear technology who will “lead project initiatives for all aspects of nuclear energy infrastructure” that its AI models and cloud infrastructure will reside on.

Specifically, the project manager will be in charge of implementing a framework for small modular reactors (SMRs). These are nuclear reactors that are designed to be a fraction of the size of typical reactors. In theory, this will allow them to be easier and cheaper to build and manage—all while creating a relatively cleaner form of energy. (While the U.S. Nuclear Regulatory Commission approved an SMR design earlier this year, the technology has yet to be actually deployed in the field.)

Microsoft isn’t the only AI company doing so, either. OpenAI CEO Sam Altman has a few irons in the proverbial fire when it comes to alternative energy, including Oklo, a nuclear energy startup he took public via a merger in July 2023, and Helion Energy, a nuclear fusion startup in which he’s invested $375 million. Microsoft also inked a deal to start purchasing energy from the latter company in 2028.

It’s still too early to tell how these investments will ultimately shake out, but it’s clear that these companies are signaling to the world that they know how much energy and how many resources their AI models need to run. To make sure they’re best positioned for the future, then, they’re starting to make big bets on alternative forms of energy—even if some of those bets, like nuclear power, are a little controversial.

“Nuclear energy is really debatable in terms of sustainability,” Luccioni explained. “Nuclear is not renewable. There’s also a whole big can of worms when it comes to dealing with nuclear waste. If you really want to be sustainable or beneficial for the planet, you can heavily invest into solar or wind.”

That isn’t to say that there aren’t undeniable benefits when it comes to nuclear energy. However, when coupled with the thought of private companies having powerful reactors at their disposal, it does become very alarming very quickly.

Of course, there are ways to address the growing energy needs of AI without resorting to nuclear energy. “If you reduce your energy consumption, then your carbon impact is going to reduce—there’s no way around it,” Mosharaf Chowdhury, an associate professor of computer science and engineering at the University of Michigan, told The Daily Beast.

Chowdhury helped develop software called Zeus that analyzes the energy efficiency of AI training. In a paper describing the tool, he and his co-authors found that it could potentially help reduce the energy consumption of these models by as much as 75 percent.
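Zeus itself isn’t reproduced here, but the basic accounting such tools perform, integrating sampled GPU power draw over time into total energy, is straightforward. A minimal sketch; the function name and sample format below are illustrative, not Zeus’s actual API:

```python
def energy_kwh(samples):
    """Integrate (timestamp_seconds, watts) power samples into
    kilowatt-hours using the trapezoidal rule."""
    joules = 0.0
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        joules += (p0 + p1) / 2 * (t1 - t0)
    return joules / 3.6e6  # 1 kWh = 3.6 million joules

# A GPU drawing a steady 300 W for one hour, sampled once a minute:
steady = [(t, 300.0) for t in range(0, 3601, 60)]
print(energy_kwh(steady))  # → 0.3
```

In practice the power readings would come from a hardware interface such as NVIDIA’s NVML; once a training job’s energy is measured this way, software can trade off GPU power limits and batch sizes against training time to cut consumption.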

“Ever since the beginning of deep learning, people have realized that the larger and deeper they make neural networks, the better it performs,” co-author Jae-Won Chung, a computer science and engineering PhD candidate at UM, told The Daily Beast. “So people have been making them larger and larger.”

As models become larger, the need for GPUs increases. Extrapolate this across the growing number of tech companies looking to build bigger and badder AI models, and it becomes an unwieldy environmental nightmare. “This is what motivates my research,” Chung said. “We hope to make [AI training] more efficient.”

One of the best and simplest ways to ensure that these companies and startups are using energy as efficiently as possible, though, is through transparency, according to Luccioni. By recording and reporting the energy consumption data, it gives both users and regulators a clear picture of what models are eating up the most energy and what should be done about it.

Maintaining transparency is something these AI companies once did a relatively decent job of—though that all changed with the launch of ChatGPT. Companies that once championed transparency and open-source code, like OpenAI and Google, did away with their former practices in favor of hiding trade secrets from competitors.

Doing so doesn’t just pose a risk to the environment. It also has the potential to cause other forms of real-world harm.

“I look at energy consumption but there’s also people who look at bias,” Luccioni explained. “There are people who do audits and robustness. There are people who do important work with these models but now, with this rat race, you can’t do this work anymore.”

“You don’t know how big it is,” she added. “Now it’s become not only black boxes, but black boxes inside of black boxes. That’s what’s really frustrating.”

Read more at The Daily Beast.
