PDA

View Full Version : Time to Go Nuclear?



ClydeR
12-18-2023, 10:58 AM
Alphabet’s chairman indicated in February 2023 that interacting with an LLM could “likely cost 10 times more than a standard keyword search.” [6] As a standard Google search reportedly uses 0.3 Wh of electricity [9], this suggests an electricity consumption of approximately 3 Wh per LLM interaction. This figure aligns with SemiAnalysis’ assessment of ChatGPT’s operating costs in early 2023, which estimated that ChatGPT responds to 195 million requests per day, requiring an estimated average electricity consumption of 564 MWh per day, or, at most, 2.9 Wh per request. Figure 1 compares the various estimates for the electricity consumption of interacting with an LLM alongside that of a standard Google search.
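The per-request figure quoted above can be sanity-checked directly from the SemiAnalysis numbers (195 million requests/day, 564 MWh/day) — a quick sketch, using only values from the excerpt:

```python
# Sanity check of the per-request energy estimates quoted above.
requests_per_day = 195e6          # ChatGPT requests/day (SemiAnalysis estimate)
daily_energy_mwh = 564            # MWh/day (SemiAnalysis estimate)

wh_per_request = daily_energy_mwh * 1e6 / requests_per_day  # MWh -> Wh, per request
print(round(wh_per_request, 2))   # ~2.89 Wh, matching the "at most 2.9 Wh" figure

google_search_wh = 0.3            # reported energy of a standard Google search
print(wh_per_request / google_search_wh)  # roughly 10x a standard search
```

The two estimates are consistent: 564 MWh spread over 195 million requests lands within a rounding error of the "10 times a 0.3 Wh search" figure.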

These scenarios highlight the potential impact on Google’s total electricity consumption if every standard Google search became an LLM interaction, based on current models and technology. In 2021, Google’s total electricity consumption was 18.3 TWh, with AI accounting for 10%–15% of this total [2]. The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year), which is a significant increase compared to its historical AI-related energy consumption. However, this scenario assumes full-scale AI adoption utilizing current hardware and software, which is unlikely to happen rapidly. Even though Google Search has a global reach with billions of users, such a steep adoption curve is unlikely. Moreover, NVIDIA does not have the production capacity to promptly deliver 512,821 A100 HGX servers, and, even if it did, the total investment for these servers alone would cost Google approximately USD 100 billion [5].
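The Ireland comparison can be roughly reproduced from the server count in the excerpt. The per-server power draw below is an assumption on my part (6.5 kW, a round figure in line with a fully loaded A100 HGX box), not a number from the article:

```python
# Rough check of the worst-case scenario: 512,821 A100 HGX servers
# running continuously, compared against Ireland's ~29.3 TWh/year.
servers = 512_821
kw_per_server = 6.5        # ASSUMED continuous draw per server; not from the article
hours_per_year = 8760

twh_per_year = servers * kw_per_server * hours_per_year / 1e9  # kWh -> TWh
print(round(twh_per_year, 1))   # ~29.2 TWh, close to Ireland's 29.3 TWh/year

usd_per_server = 100e9 / servers
print(round(usd_per_server))    # ~USD 195,000 per server implied by the USD 100B total
```

Under that assumed power draw, the server count and the country-scale consumption figure are mutually consistent.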

More... (https://doi.org/10.1016/j.joule.2023.09.004)


LLM = Large Language Model
Wh = watt-hour (1 watt drawn for 1 hour)
MWh = 1 million Wh
TWh = 1 trillion Wh

A normal 60-watt-equivalent LED light bulb draws 9 or 10 watts, so it uses 9 or 10 Wh per hour. I was shocked to hear that an LLM interaction uses the equivalent of burning a light bulb for 20 minutes.
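The 20-minute figure follows directly from the numbers above — a one-liner's worth of arithmetic:

```python
# How long a 9 W LED bulb runs on one LLM interaction's ~3 Wh.
llm_wh = 3.0       # estimated energy per LLM interaction
bulb_watts = 9.0   # draw of a 60-watt-equivalent LED bulb

minutes = llm_wh / bulb_watts * 60
print(minutes)     # 20.0 minutes of light per LLM interaction
```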

The CHIPS Act was a good move. But we also need to focus on electricity production and infrastructure. Don't forget that electric cars are just around the corner. Roughly, it appears that the average EV uses 250 to 350 Wh per mile, which is a lot less than I would have expected.
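For scale, putting the EV figure next to the LLM figure (using the midpoint of the quoted 250–350 Wh/mile range):

```python
# One ~3 Wh LLM interaction expressed as EV driving distance.
ev_wh_per_mile = 300   # midpoint of the 250-350 Wh/mile range quoted above
llm_wh = 3.0           # estimated energy per LLM interaction

miles = llm_wh / ev_wh_per_mile
print(miles)                 # 0.01 mile per LLM interaction
print(round(miles * 5280))   # ~53 feet
```

So one LLM interaction is about a hundredth of a mile of EV driving — the per-query numbers only matter because of the billions of queries.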

The presidential candidates need to lay out their plans for electricity production.


Will improved efficiency in AI training and operation help the problem?


However, this perspective overlooks Jevons’ Paradox, which was formulated in 1865 and occurs when increasing efficiency results in increased demand, leading to a net increase in resource use. This effect has long been observed in the history of technological change and automation [12], with recent examples in AI applications [13]. In fact, the sudden surge in interest in generative AI during 2022 and 2023, during which demand began to outstrip supply, could be part of such a rebound effect. Moreover, the improvements in model efficiency now allow single consumer-level GPUs to train AI models. This implies that the growth in AI-related electricity consumption will originate not only from new high-performance GPUs such as NVIDIA’s A100 and H100 GPUs but also from more generic GPUs.

Suppa Hobbit Mage
12-18-2023, 01:38 PM
nah

Methais
12-18-2023, 03:52 PM
Time to go nuclear?



https://www.youtube.com/watch?v=iCbz6BxUMuY

Anebriated
12-18-2023, 05:29 PM
Just install more windmills and solar panels. They’re green!