The show opens by asking ChatGPT how much energy it consumes per query; its inconsistent answers highlight how difficult it is to calculate AI's energy footprint. The discussion then broadens to AI's overall energy demands and its potential to become a major electricity consumer by 2030.
ChatGPT's energy consumption per query varies significantly, even within short timeframes.
One widely cited study estimates that training a large language model like GPT-3 consumed roughly 1,300 megawatt-hours of electricity.
Independent estimates suggest a single ChatGPT query uses as much electricity as an LED light bulb for 20 minutes.
AI is projected to account for 8-18% of total U.S. electricity demand by 2030.
Just one query to ChatGPT takes as much energy as running an LED lightbulb for some 20 minutes. As AI use balloons, so does its appetite for electricity, and that is pushing tech companies to lean more heavily on fossil fuels.
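The lightbulb comparison can be checked with quick back-of-the-envelope arithmetic. The sketch below assumes a per-query estimate of about 3 watt-hours and a typical 10 W LED bulb; both figures are assumptions for illustration, not numbers from the show:

```python
# Back-of-the-envelope check of the "LED bulb for ~20 minutes" comparison.
# Assumed figures (illustrative, not from the show):
QUERY_ENERGY_WH = 3.0  # rough independent estimate of energy per ChatGPT query, in watt-hours
LED_POWER_W = 10.0     # typical LED bulb power draw, in watts

# Energy (Wh) / power (W) gives hours; multiply by 60 for minutes.
minutes = QUERY_ENERGY_WH / LED_POWER_W * 60
print(f"One query = running a {LED_POWER_W:.0f} W LED for about {minutes:.0f} minutes")
```

At 3 Wh per query this works out to roughly 18 minutes, consistent with the "some 20 minutes" figure quoted above.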