
AI, Energy, and the Cost of LLM Queries

  • Aaryan Doshi
  • Sep 13
  • 2 min read




The introduction of mainstream LLMs to society in 2022 changed the way we get work done: instant answers, convenience, and in some cases even companionship. But as big tech companies aggressively pursue the construction of data centers, what impact will this have on the cost of energy? Can our GPUs become more energy efficient? And is this surge in electricity demand sustainable?


Today, I want to break down precisely those three questions.


A Data Center Spree in the United States


The numbers speak for themselves. In the first half of 2024, 78 data center projects broke ground, representing $9 billion in construction costs and 12 million square feet of land, per the Dodge Construction Network. And it’s not just the cost or the space. The power capacity under construction tells an equally indicative story: capacity under construction in primary markets hit a record 6,350 MW at the end of 2024, roughly a two-fold increase from the 3,077.8 MW at the end of 2023, according to CBRE’s North American Data Center Trends report. And with companies such as Meta, Amazon, and Microsoft hungrily pursuing more compute for their products, demand is likely to skyrocket even further. So the market is there and the demand is there. But what exactly is at stake?
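As a quick sanity check on the CBRE figures cited above, here is a minimal sketch verifying that the capacity under construction did in fact roughly double year over year:

```python
# Capacity under construction in primary markets, per the CBRE figures above.
under_construction_mw = {2023: 3077.8, 2024: 6350.0}

# Year-over-year growth ratio.
growth = under_construction_mw[2024] / under_construction_mw[2023]
print(f"{growth:.2f}x year-over-year growth")  # ~2.06x, i.e. roughly two-fold
```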


The Scale of Energy Impact


To put this into perspective, MIT Technology Review’s analysis revealed that a single GPT conversation uses ~2.9 kilowatt-hours of electricity, roughly enough to ride an e-bike about 100 miles. When you multiply that across billions (soon to be trillions) of AI interactions, the impact becomes staggering. But you knew that already. More importantly: is this issue fixable? Are corporations actively searching for a solution to this problem? Is there even a need for one?
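To see how quickly this adds up, here is a back-of-envelope sketch using the ~2.9 kWh per-conversation figure cited above. The query volume is a purely illustrative assumption, not a sourced number:

```python
# Per-conversation electricity figure from the MIT Technology Review
# analysis cited above (kilowatt-hours).
KWH_PER_CONVERSATION = 2.9

def total_energy_gwh(conversations: float) -> float:
    """Total electricity, in gigawatt-hours, for a given number of conversations."""
    return conversations * KWH_PER_CONVERSATION / 1e6  # kWh -> GWh

# Hypothetical volume: one billion conversations.
print(f"{total_energy_gwh(1e9):,.0f} GWh")  # 1e9 * 2.9 kWh = 2,900 GWh
```

At a hypothetical billion conversations, that is 2,900 GWh, on the order of the annual electricity use of a small country, which is why the "billions, soon trillions" framing matters.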


Industry Response to Electricity Surge


A recent analysis by consulting firm Deloitte notes that energy demand has grown 50–60% per quarter since the beginning of 2023. And we still haven’t reached the peak. With tools like Cursor and Claude Code, queries to these LLMs are fast becoming standard practice. Shrinking GPU availability is prompting companies to search for solutions to this energy crunch. For one, OpenAI recently announced a $300 billion partnership with Oracle to use its cloud computing capacity. But can we possibly create a solution that makes GPU usage more efficient, reducing energy consumption and thereby helping the climate?


We’ll continue this thread in the next article of CirFin Times!



© 2021-2025 Cirfin. 501(c)3 Non-profit.
