Relativization of energy requirements…
At the AI Impact Summit he once again faced such questions and reached for a somewhat daring comparison: anyone bothered by the energy cost of AI training should ask how much it takes to “train” a human. The electricity and water consumption of AI is certainly high, he argued, but he does not consider the magnitude relevant compared to raising and feeding a human being. At first glance the argument may sound plausible – a person does consume a great deal of energy in their first 20 years of life (and beyond) – but it remains a strange comparison.
…with a comparison that limps
For AI, the discussion centers on clearly measurable electricity and water consumption; in his comparison, by contrast, all of a human’s living costs are suddenly lumped together – including things that have only a very indirect connection to cognitive “training.” Clearly measurable processes are thus set against a fuzzy, unquantifiable figure. There is also the simple fact that humans do not suddenly cause aggregate energy demand to explode (unless they use AI services!) – whereas AI, to a large extent, does exactly that.
The question should rather be: Are there long-term efficiency gains?
Ultimately, the comparison serves less as a robust analysis than as a rhetorical change of perspective. The crucial question is not whether intelligence (biological or artificial) requires energy, but how much additional demand AI actually creates, where it occurs, and how it can be met. If clear long-term efficiency gains justified the use of resources, there would hardly be any debate – but that is precisely what most of the criticism targets: the unclear outlook on whether the investments are sustainable.

