cross-posted from: https://lemmy.world/post/37715538
As you can compute for yourself, AI datacenter water use is not a substantial environmental problem. This long read spells out the argument numerically.
If you'd like a science educator's attempt at making the headline claim digestible, see here.
Expanding on this: even if we accept the industry's absurd projections of LLM growth, the current and projected freshwater use of AI datacenters is still small compared to other obviously wasteful uses. This is especially true if you restrict attention to inference, rather than training. Once a company has already trained one of these monster models, using it to respond to a content-free work email, cheat on homework, look up a recipe, or help you write a silly html web page usually comes out ahead on freshwater, because you shower and flush the toilet surprisingly often compared to the cooling needs of a computer.
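To make that comparison concrete, here is a minimal back-of-envelope sketch in Python. None of these figures come from the linked article; the per-query water cost in particular is an assumption on my part (published estimates range from a fraction of a millilitre to a few tens of millilitres), so swap in whatever numbers you trust.

```python
# Back-of-envelope comparison of household water use vs. LLM inference cooling water.
# All constants are illustrative assumptions, not measurements.

WATER_PER_LLM_QUERY_L = 0.015  # assumed: ~15 mL of cooling water per inference query
WATER_PER_SHOWER_L = 65.0      # assumed: ~8 L/min showerhead for ~8 minutes
WATER_PER_FLUSH_L = 6.0        # assumed: one modern toilet flush

queries_per_shower = WATER_PER_SHOWER_L / WATER_PER_LLM_QUERY_L
queries_per_flush = WATER_PER_FLUSH_L / WATER_PER_LLM_QUERY_L

print(f"One shower is roughly {queries_per_shower:,.0f} queries' worth of cooling water")
print(f"One toilet flush is roughly {queries_per_flush:,.0f} queries' worth of cooling water")
```

Even if the assumed per-query figure is off by an order of magnitude in the pessimistic direction, a single shower still covers hundreds of queries, which is the scale the argument rests on.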
I will acknowledge the nuances I'm aware of:
- we don't know the specific tech of the newest models. It is theoretically possible they've made inference require burning several forests down. I think this is extremely unlikely, given how similarly they behave to relatively benign mixture-of-experts models.
- some of the numbers in the linked long read are based on old projections. I still think they were chosen generously, and I'm not aware of a serious discrepancy in favor of 'AI water use is a serious problem'. Please do correct me if you have data.
- there is a difference between freshwater and potable water. Except that I can't find anyone who cares about this difference outside of one commenter. As I currently understand it, all freshwater can be made potable with a relatively modest upfront investment.
(Please note this opinion is not about total energy use. Those concerns make much more sense to me.)
Practical guidance queries should be compared against searching for practical guidance, yes? So if you would otherwise be searching 4-5 times, the AI has cut that time from the process. Especially so if the guide you find lacks one extra bit of context: the AI lets you ask the follow-up and get an answer in the same format and context, while search would require re-reading what you already know and cross-checking it. If the knowledge you need isn't written down in a convenient place or format, and you would otherwise have asked a person, then the AI has cut the number of humans in the loop in half.
I know fewer people who do this intentionally (see involuntary AI use below). Those who do are using it for stack-exchange-style questions, where the information is highly context-specific, probably only present in a few forums, and would require a lot of effort to get a precise search result (lots of ANDs and NOTs and site filtering). I think these difficult searches are probably not what 'seeking information' usually means, and I would agree this use is not great.
This one depends a lot on book maintenance, construction, and availability. Note that libraries, bookstores, and ebook hosting take labor, power, and water too.
Most generated fiction is in niche genres, I think, so the cost of getting a human to write it would be astronomically worse. And while I am just as happy reading the original Dracula instead of an ultra-specific Undertale fanfic, I have a hard time telling someone else that they are literally interchangeable.
Yeah, I do endorse these uses as efficient. They are bad/stupid/silly. I've disabled them where possible, and I welcome others to do the same. That said, this water waste is likely small compared to other (equally terrible) industrial practices: we don't need to triple-wash every carrot, and power-washing various vehicles and surfaces is often neither efficient nor needed.
Yeah, I've just had a terrible time finding actual humans providing recipes on the internet. I am entirely prepared to believe this is a skill issue on my part. YouTube has helped somewhat, but now we're comparing an LLM to video hosting and processing plus ~5 minutes of taking careful notes along the way.