OpenAI’s O3 Model Devours 5 Gas Tanks Worth of Energy Per Task


In a groundbreaking revelation about artificial intelligence's environmental impact, OpenAI's latest and most powerful model, O3, has been found to consume a staggering 1,785 kWh of energy per task—equivalent to two months of an average American household's electricity usage.
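
The household comparison follows from simple division. As a rough sanity check, the sketch below uses an assumed average of about 10,500 kWh per year for a U.S. home (roughly the EIA's published figure); that average is an assumption of this illustration, not a number from the report:

```python
# Back-of-envelope check of the "two months of household electricity" claim.
# Assumption (not from the article): ~10,500 kWh/year average U.S. household use.

ENERGY_PER_TASK_KWH = 1_785                  # reported energy for one O3 task
AVG_HOUSEHOLD_KWH_PER_MONTH = 10_500 / 12    # assumed U.S. average, ~875 kWh/month

months_equivalent = ENERGY_PER_TASK_KWH / AVG_HOUSEHOLD_KWH_PER_MONTH
print(f"One O3 task ~ {months_equivalent:.1f} months of household electricity")
# -> One O3 task ~ 2.0 months of household electricity
```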


According to research by Salesforce’s AI sustainability lead, Boris Gamazaychikov, each O3 task generates approximately 684 kilograms of CO₂ equivalent emissions, comparable to burning through more than five full tanks of gasoline. These findings raise serious concerns about the environmental sustainability of advanced AI systems.
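
The gasoline comparison can be reconstructed the same way. In the sketch below, the emission factor (about 8.9 kg CO₂ per gallon of gasoline, the EPA's standard figure) and the 15-gallon tank size are assumptions of this illustration, not values taken from Gamazaychikov's analysis:

```python
# Rough reconstruction of the "more than five tanks of gasoline" comparison.
# Assumptions (not from the article): ~8.9 kg CO2 per gallon burned and a
# ~15-gallon passenger-car fuel tank.

CO2_PER_TASK_KG = 684        # reported emissions per O3 task
KG_CO2_PER_GALLON = 8.9      # assumed gasoline emission factor
GALLONS_PER_TANK = 15        # assumed tank size

gallons = CO2_PER_TASK_KG / KG_CO2_PER_GALLON
tanks = gallons / GALLONS_PER_TANK
print(f"{gallons:.0f} gallons ~ {tanks:.1f} tanks of gasoline")
# -> 77 gallons ~ 5.1 tanks of gasoline
```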


The True Environmental Cost of OpenAI's O3

  • Energy consumption: 1,785 kWh per task
  • CO₂ emissions: 684 kg CO₂e per task
  • Equivalent to: 2 months of household electricity
  • Power usage: 11-12 kW for an HGX server with 8 Nvidia H100s

Industry experts have pointed out that these calculations might be conservative. Kasper Groes Albin Ludvigsen, a green AI advocate, notes that the actual power draw of an HGX server with eight Nvidia H100s is significantly higher than initial estimates assumed: roughly 11-12 kW per server, rather than the 0.7 kW per GPU (about 5.6 kW for the full eight-GPU machine) used in earlier projections.
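
To see why the gap matters, the sketch below works through the arithmetic. The per-GPU rating and the 11-12 kW range come from the figures above; treating the 1,785 kWh task as if it ran on a single server is purely illustrative, since in practice the work would be spread across many machines in parallel:

```python
# Comparing the GPU-only power estimate with the measured whole-server draw.
# Earlier estimates counted only the GPUs; the full server (CPUs, memory,
# fans, networking) reportedly draws 11-12 kW.

GPUS_PER_SERVER = 8
RATED_KW_PER_GPU = 0.7          # per-GPU figure used in earlier estimates
MEASURED_SERVER_KW = 11.5       # midpoint of the reported 11-12 kW range
ENERGY_PER_TASK_KWH = 1_785     # reported O3 energy per task

gpu_only_kw = GPUS_PER_SERVER * RATED_KW_PER_GPU
print(f"GPU-only estimate: {gpu_only_kw:.1f} kW vs. measured ~{MEASURED_SERVER_KW} kW")

# Hypothetical illustration: hours one HGX server would need to use 1,785 kWh.
server_hours = ENERGY_PER_TASK_KWH / MEASURED_SERVER_KW
print(f"Equivalent to ~{server_hours:.0f} hours on a single HGX server")
# -> GPU-only estimate: 5.6 kW vs. measured ~11.5 kW
# -> Equivalent to ~155 hours on a single HGX server
```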

The environmental concerns extend beyond energy consumption. Earlier studies revealed that ChatGPT uses approximately 10% of an average person’s daily drinking water consumption per chat session—a seemingly small amount that scales to significant levels with millions of daily users.
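
A rough sense of that scaling is sketched below. Every input is an assumption chosen for illustration only: about 2 liters as a typical daily drinking-water intake (so 10% is roughly 0.2 liters per session) and a hypothetical 10 million chat sessions per day:

```python
# Illustrative scaling of the per-session water figure. All inputs are
# assumptions, not measurements from the cited studies.

DAILY_DRINKING_WATER_L = 2.0      # assumed typical daily drinking-water intake
WATER_SHARE_PER_SESSION = 0.10    # ~10% per chat session, as cited above
SESSIONS_PER_DAY = 10_000_000     # hypothetical daily usage volume

water_per_session_l = DAILY_DRINKING_WATER_L * WATER_SHARE_PER_SESSION
total_liters_per_day = water_per_session_l * SESSIONS_PER_DAY
print(f"~{water_per_session_l:.1f} L per session -> ~{total_liters_per_day:,.0f} L per day")
# -> ~0.2 L per session -> ~2,000,000 L per day
```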

Sustainability Challenges and Solutions

  • Companies like Synaptics and embedUR are developing edge AI solutions
  • Focus on reducing data center dependence
  • Emphasis on real-time, device-level decision making
  • Growing need for balanced innovation and sustainability

Kathy Baxter, Principal Architect for Responsible AI at Salesforce, warns of potential Jevons Paradox implications, where efficiency improvements might lead to increased resource consumption in other areas. “There can be efficiency tradeoffs where less energy is required, but more water is used,” she explains.

As AI technology continues to advance, the industry faces mounting pressure to address these environmental concerns while maintaining technological progress. The findings underscore the critical need for sustainable AI development practices and more efficient computing solutions in the rapidly evolving field of artificial intelligence.
