- cross-posted to:
- generative_ai@mander.xyz
One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
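The 4-5x claim can be turned into rough arithmetic. As a sketch: the ~0.3 Wh figure for a conventional web search is an assumed, often-cited ballpark (it is not in the post), and the one-billion-searches-a-day volume is a hypothetical round number for scale; only the 4-5x multiplier comes from the text above.

```python
# Back-of-envelope: energy of an AI-driven search vs a conventional one.
# ASSUMPTION: ~0.3 Wh per conventional search (often-cited ballpark,
# not from the post). The 4-5x multiplier is the range quoted above.
conventional_wh = 0.3          # assumed Wh per conventional search
multipliers = (4, 5)           # range quoted for generative-AI search

for m in multipliers:
    ai_wh = conventional_wh * m
    print(f"{m}x -> {ai_wh:.1f} Wh per AI search")

# At a hypothetical one billion such searches a day, the *extra* energy
# (AI search minus conventional search, at the 5x end) is:
searches_per_day = 1e9         # assumed volume, for scale only
extra_wh = conventional_wh * (5 - 1) * searches_per_day
print(f"extra ~{extra_wh / 1e6:.0f} MWh/day at the 5x end")
```

Even with generous assumptions, the per-search numbers are tiny; it is the volume that makes the total add up.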
And nobody seems to give a shit. Even people who would normally give a shit about this sort of thing. Even people who denounce Bitcoin mining’s waste of energy (and I agree) aren’t talking about the energy and water waste from AI systems.
That article says that OpenAI uses 6% of Des Moines’ water.
Meanwhile-
According to Colorado State University research, nearly half of the 204 freshwater basins they studied in the United States may not be able to meet the monthly water demand by 2071.
https://abcnews.go.com/US/parts-america-water-crisis/story?id=98484121
And nobody seems to give a shit.
I guess it depends on how you use chatbots. If you’re just too lazy to click on the first Google result, it’s wasteful to bother ChatGPT with your question. On the other hand, for complex topics, a single answer may save you quite a lot of googling and following links.
Tbf, talking about the environmental costs of generative AI is just framing.
The issue is the environmental cost of electricity, no matter what it is used for.
If we want this to be considered in consumption, then it needs to be part of the electricity price. And of course all other power sources, like combustion motors, need to price in external costs too.

It should be considered. An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI: the AI server has higher power requirements (maybe not as wide a margin as my first comparison, but there is one). Now multiply that AI server by hundreds more and you start seeing a considerable uptick in power usage.
an ultra-light laptop, or a raspberry pi consume WAY less power than a full on gaming rig, and the same can be said between a data server that is used for e-commerce and a server running AI
And if external costs are priced into the cost of electricity then that will be reflected in the cost of operating these devices.
Also, there are far more data servers than servers running AI, which increases the total effect they have.
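The device comparison running through the thread can also be sketched numerically. Every wattage below is an assumed ballpark figure for illustration, not a measurement from the post, and the 500-server fleet is a hypothetical round number.

```python
# Rough power-draw comparison of the devices mentioned in the thread.
# ASSUMPTION: all wattages are illustrative ballpark figures, not
# measurements from the post.
watts = {
    "raspberry_pi": 5,         # assumed: single-board computer
    "ultralight_laptop": 15,   # assumed: light office workload
    "gaming_rig": 500,         # assumed: high-end GPU under load
    "ecommerce_server": 350,   # assumed: typical web/app server
    "ai_server": 5000,         # assumed: multi-GPU training/inference node
}

baseline = watts["ecommerce_server"]
for name, w in watts.items():
    print(f"{name:18s} {w:5d} W  ({w / baseline:5.2f}x an e-commerce server)")

# "Multiply that AI server and add hundreds more":
fleet_w = 500 * watts["ai_server"]   # hypothetical fleet of 500 AI servers
print(f"500 AI servers draw ~{fleet_w / 1e6:.1f} MW continuously")
```

The point the thread is making survives any reasonable choice of numbers: per-device gaps of one to three orders of magnitude, multiplied by fleet size, dominate the total.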