AI represents a pressing threat to humanity, but not in the way you think.
AI promises to be the most transformative technology in human history. But what transformation are we talking about?
AI proponents promise the models will change how we think, act, and work. But AI might change our lives in other, much more profound ways.
Two existential crises dominate our generation: artificial intelligence and climate change. People are genuinely terrified that what we get from AI is Blade Runner or Black Mirror’s horrifying robot dogs. But there is an immediate and pressing concern lurking in the background. Artificial intelligence and climate change are on a collision course.
Market forces.
I researched this article using GPT-4 (irony alert!). I collected recent articles and blog posts about the build-out of AI infrastructure, the investment in AI companies, and the development of new AI models.
There are unstoppable market forces at work. Nvidia’s recent financial results generated sentiment bordering on irrational exuberance. Microsoft’s $10B investment in OpenAI and Amazon’s $4B investment in Anthropic created palpable giddiness. Venture firms are placing tens of billions of dollars in bets on the roulette wheel, crossing their fingers that at least one pays off. For the winners, these are career-making guesses. General partners and founders will become billionaires.
The industry is caught in the momentum of excitement, transformation, and disruption, motivated by a little altruism and a lot of avarice. Very few are pausing to think about the real impacts of this amazing new technology.
One quote in particular struck me:
“For every $1 spent on a GPU, roughly $1 needs to be spent on energy costs to run the GPU in a data center.”
The author’s basic thesis was that we are overbuilding AI infrastructure, and that’s a good thing. There’s money to be made, damn the consequences. The most pressing threat, the author continued, is that companies will fail to match this $200B of GPU capital expenditure to true customer value.
The author isn’t alone in his sentiment.
Hugging Face CEO Clem Delangue believes in the wild proliferation of AI models.
“All companies will ultimately want to build AI themselves (based on open-source AI) versus outsource to third-party APIs and there will be as many models as code repos today. AI is a foundational technology to build tech and the same way you don't outsource or have the same code-base as your competitors, you won't want to outsource your AI development or use the same models as your competitors.”
I am not sure whether he really means this. GitHub alone has 375 million repos. I didn’t analyze how much infrastructure you’d need to train and run 400 million models, and I doubt anyone has.
To summarize, the industry’s collective proposal is to add roughly twenty million GPUs to train and run hundreds of millions of AI models.
We shouldn’t be worried about whether companies can build enough (open-source) AI models or uncover enough customer value to justify $200B in GPU spend. We should worry about the climate impact of executing this bold vision.
Real impacts
“For every $1 spent on a GPU, roughly $1 needs to be spent on energy costs to run the GPU in a data center.”
This rather bland assertion skips over the fact that this spending on energy costs has a real, profound impact on climate. Let’s dig in a bit, shall we?
The semiconductor business is horrible for the environment. Manufacturing a single 300mm silicon wafer consumes thousands of liters of water, along with staggering amounts of other materials and natural resources. The output of that manufacturing process ends up in data centers: a typical facility can contain as much as 500 tons of rare earth metals, 40 tons of silicon, and 1,500 tons of plastics.
Companies are building ever more powerful supercomputers running tens of thousands of GPUs and hundreds of thousands of CPUs. These supercomputers consume massive amounts of power and water. A single supercomputer stuffed with 10,000 Nvidia H100s consumes 26 GWh per year, and Google’s A3, with its 26,000 GPUs, gobbles an additional 68 GWh per year.
The average hyperscale data center already consumes 730 million gallons of water per year. That’s enough water to grow over 1 million pounds of rice.
That same data center consumes 500 GWh of energy, enough to power almost 48,000 homes and roughly 400 times the energy required to send Marty McFly back to the future.
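These figures are easy to sanity-check. Here is a minimal back-of-envelope sketch; the inputs come from the numbers above, and the 8,760-hour year is the only added assumption:

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

# Supercomputer figure: 10,000 H100 GPUs consuming 26 GWh per year.
gpus = 10_000
supercomputer_gwh = 26
avg_watts_per_gpu = supercomputer_gwh * 1e9 / (gpus * HOURS_PER_YEAR)

# Hyperscale data center figure: 500 GWh powering almost 48,000 homes.
datacenter_gwh = 500
homes = 48_000
mwh_per_home = datacenter_gwh * 1e3 / homes

print(f"implied average draw: {avg_watts_per_gpu:.0f} W per GPU")
print(f"implied household use: {mwh_per_home:.1f} MWh per year")
```

The roughly 297 W implied per GPU is well under the H100’s rated draw of about 700 W, which suggests the 26 GWh figure reflects GPU-only consumption at partial utilization, not full-system power. The roughly 10.4 MWh per household is in line with average US residential usage, so the homes comparison holds up.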
There are more than 500 hyperscale data centers globally, about half in the US.
Remember, the industry expects to add 20,000,000 GPUs to an already massive AI computing infrastructure.
When AI supercomputers proliferate across data centers, they’ll easily consume up to 8% of all energy produced on California’s power grid per year. The state’s power grid is already teetering. California has experienced 99 unplanned outages over the past five years, second only to Texas. Utilities only avoided additional outages through severe but voluntary usage restrictions. Residents in rural parts of the state experienced up to six weeks without power in 2021.
The Colorado River basin, which serves the water needs of 40 million people, has lost more than 10 trillion gallons of water. Climate change-driven drought accelerates the depletion of water resources, hitting the western United States particularly hard.
Yes, you say, but the GPUs will be faster, cheaper, and more efficient. Which means we will run MORE of them. We’ll use them to train ever larger and more complex models. The consumption and environmental impact will continue to increase.
Can hyperscale be carbon-neutral?
Yes, some companies are claiming they run carbon-neutral data centers and offices. Google likes to tout its carbon neutrality, mostly achieved by purchasing carbon offsets and through power purchase agreements. In 2021, Google’s total carbon emissions exceeded 12 million metric tons.
Despite that, parent company Alphabet managed to make the Carbon Disclosure Project’s Climate Change A List, a register of the top 330 global companies.
Microsoft made this list as well, performing similarly to Google, emitting roughly 15 million metric tons, buying significant carbon offsets, and making power purchase agreements. Amazon did not. When reached for comment, Jeff Bezos’ spokesperson said, “I can’t hear you. The super yacht’s engines are too loud!”
Google is clearly trying to do the right thing, but their top-level boast, “we’ve been carbon neutral since 2007,” hides the underlying reality. Here is a quote from their own report to CDP, an organization that doesn’t count carbon offset purchases in its calculations of carbon neutrality:
“In 2021, we matched 100% of our annual global electricity consumption with renewable energy for the fifth consecutive year, but on an hourly basis, only 66% of our data center electricity use was matched with regional carbon-free sources.”
Google isn’t lying, per se, but they also aren’t exactly telling the truth. Here’s the marketing spin:
“We are the first major company to make a commitment to operate on 24/7 carbon-free energy in all our data centers and campuses worldwide. This is far more challenging than the traditional approach of matching energy usage with renewable energy, but we’re working to get this done by 2030.”
Here, Google says they are taking some action more challenging than the traditional approach of matching energy usage with renewables. But are they?
By all means, Google, get your data centers and offices on carbon-free energy by 2030. But in the meantime, be transparent. And consider the environmental cost of investing almost $10B in data center and office expansion in the US alone, and of cramming those data centers with A3 supercomputers.
However you slice it, these companies’ consumption of resources is intense.
What do we do?
I’m a pragmatist who spends a lot of time analyzing actions and consequences. Optimists tend to call this cynicism.
We love how technology improves our lives. There’s plenty of evidence that despite floods of unrelentingly negative headlines, we are actually much better off than at any other time in human history. This is undeniably true; at the same time, it’s true that our past actions to expand our opportunities and improve human existence had undesirable consequences (See: manifest destiny and industrialization).
I’m not a pessimist and am enamored of AI’s possibilities.
I used GPT-4 and DALL-E to produce this article, so I am contributing to the madness. Not using AI is like trying to go to a store and buy something not packaged in plastic. It’s possible, but you’ll end up brushing your teeth with your finger and a smear of baking soda.
To solve these big, existential problems, we have to begin with a single step.
Start by building less software.
Question every product decision through a climate impact lens. Do I really need to add this feature? If I do, how much will it increase my product’s resource consumption? Are there things I can do in my product to take better advantage of autoscaling, sizing capacity up and down according to need, so I avoid overprovisioning infrastructure?
For the software you build, develop metrics to measure your teams on energy consumption and power efficiency. Focus on a couple of areas: energy-efficient algorithms (there’s a ton of research on this, including practical examples) and creating products that reduce the consumption of network resources.
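One hedged sketch of such a metric, assuming a flat per-core power figure (the 15 W constant and both de-dup functions are illustrations, not measurements of any real fleet): convert measured CPU time into a rough energy estimate, then use it to compare two implementations of the same task.

```python
import time

ASSUMED_CORE_WATTS = 15.0  # assumption: average per-core draw; calibrate for your fleet

def estimate_energy(fn, *args):
    """Rough energy estimate: CPU seconds consumed times assumed per-core power."""
    start = time.process_time()
    result = fn(*args)
    cpu_seconds = time.process_time() - start
    return result, cpu_seconds * ASSUMED_CORE_WATTS  # joules

def dedup_quadratic(items):
    out = []
    for x in items:
        if x not in out:  # O(n) membership scan per item -> O(n^2) overall
            out.append(x)
    return out

def dedup_linear(items):
    return list(dict.fromkeys(items))  # O(n), order-preserving

data = list(range(2_000)) * 2
_, joules_quadratic = estimate_energy(dedup_quadratic, data)
_, joules_linear = estimate_energy(dedup_linear, data)
print(f"quadratic: {joules_quadratic:.4f} J, linear: {joules_linear:.4f} J")
```

The absolute numbers are only as good as the wattage assumption, but the ratio between implementations is what a team metric actually needs: the same output for a fraction of the energy.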
Train developers and product managers on green coding practices. Which programming languages are more energy efficient? Are the open-source libraries your product incorporates developed to be power and resource-efficient?
A deliberate climate-first approach means prioritizing sustainability over other things that might drive short-term customer acquisition and revenue. Depending on the size and stage of your company, prioritizing climate-friendly work over revenue might come at a significant cost.
What’s the cost if we don’t make these changes?
Resources
Colorado River Basin Water Crisis