Training today's large language models takes tens of thousands of graphics processors, and the amount of energy those clusters consume is staggering. Climate researchers worry about the long-term impact. But the Center for Data Innovation, a group backed by tech heavyweights including Intel, Microsoft, Google, Meta, and AMD, argues that much of the anxiety about AI's energy use is overblown. In fact, the group contends that AI could end up being good for the climate by making processes more efficient. It points out, for example, that generating a page of text with an AI model can produce far less CO2 than a typical American writing the same page on a standard laptop, and that the power used to train an AI system is small compared to what is consumed once the system is deployed and serving users. The Center also argues that AI makes other things more efficient, from forecasting grid demand to improving farming and tracking methane emissions. Even so, the group concedes there is a long way to go before we can accurately measure the power consumption and carbon emissions of AI systems.
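The comparison between training and deployment comes down to basic energy accounting. As a rough illustration of how such estimates are built (every number below is a hypothetical placeholder, not a figure from the report), total emissions scale with GPU count, average power draw, runtime, datacenter overhead, and grid carbon intensity:

```python
# Back-of-the-envelope estimate of training emissions.
# All inputs are illustrative assumptions, not measured values.

def training_emissions_kg(
    num_gpus: int,
    avg_power_kw: float,           # average draw per GPU, in kilowatts
    hours: float,                  # wall-clock training time
    pue: float = 1.2,              # datacenter power usage effectiveness
    grid_kg_per_kwh: float = 0.4,  # grid carbon intensity (kg CO2 per kWh)
) -> float:
    energy_kwh = num_gpus * avg_power_kw * hours * pue
    return energy_kwh * grid_kg_per_kwh

# Hypothetical run: 10,000 GPUs averaging 0.3 kW for 30 days.
print(training_emissions_kg(10_000, 0.3, 24 * 30))
```

The same formula applies to inference; the report's point is that the inference side accumulates far more GPU-hours over a model's lifetime, so that term eventually dominates.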
For some context, none of this is new. People have panicked over technology's energy consumption before: years ago, forecasters predicted that the digital economy would consume half the electric grid's capacity within a decade. Today it accounts for barely 1 to 1.5 percent of the world's energy use. The trouble is that estimates of how much energy it takes to train or run an AI system are all over the place, and past attempts have aged poorly, undermined by flawed assumptions, bad measurements, and a failure to keep pace with hardware and software innovation. And while AI hardware keeps improving and models keep getting faster, that doesn't necessarily translate into lower overall power consumption.
The report also offers recommendations for getting a handle on the energy question. The first step is to actually measure how much energy AI systems consume, both during training and during inference. The next is to encourage companies to voluntarily report those figures. The report cautions, however, that regulating the industry could get messy: requirements intended to make AI models safer, for instance, could also make them more power-hungry. The takeaway, as the report frames it, is to invest in AI rather than constrain it.
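The first recommendation, actually measuring energy use, usually means polling hardware power sensors and integrating the samples over time. Here is a minimal sketch under the assumption that you already have (seconds, watts) samples, for instance from periodically polling `nvidia-smi --query-gpu=power.draw --format=csv,noheader`:

```python
# Minimal sketch: turn periodic GPU power samples into an energy total.
# Assumes samples are (seconds_since_start, watts) pairs collected by
# polling a hardware sensor at a fixed or variable interval.

def energy_kwh(samples: list[tuple[float, float]]) -> float:
    """Trapezoidal integration of power (W) over time (s), returned as kWh."""
    total_joules = 0.0
    for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
        total_joules += (w0 + w1) / 2 * (t1 - t0)
    return total_joules / 3.6e6  # 1 kWh = 3.6 million joules

# One hour at a steady 300 W is 0.3 kWh.
readings = [(0.0, 300.0), (1800.0, 300.0), (3600.0, 300.0)]
print(energy_kwh(readings))
```

Multiply the result by the datacenter's PUE and the grid's carbon intensity to get an emissions figure; the hard part in practice, as the report notes, is agreeing on what to sample and how to report it.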
And who stands to benefit from all this? The companies that make the hardware and chips powering AI, naturally. Chief among them is Nvidia, whose business has taken off as demand for AI hardware has exploded.