I won't call your point a strawman, but you're ignoring the parts of LLMs that actually have high resource costs in order to push a narrative that doesn't reflect the full picture. These discussions need to include the up-front costs of gathering the dataset and, most importantly, of training the model.
Sure, post-training (inference) energy costs aren't worth worrying about, but I don't think the people who understand how LLMs work were ever worried about that part.
That framing also ignores the absurd fucking AI datacenters being built with more methane turbines than they were approved for, and without any of the legally required pollution-capture technology on the stacks. At least one of these datacenters is already measurably causing illness in the surrounding area.
These aren't abstract environmental damages from energy use that could, in principle, come from green power sources, and they aren't "fraction of a toast" energy costs incurred only by people running queries either. Querying the LLM has never been where the dangerous energy costs lie; it's the cost of training the model in the first place.
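To make the scale gap concrete, here's a quick back-of-envelope sketch. All three numbers are rough, illustrative assumptions (a commonly cited per-query ballpark, a published GPT-3-scale training estimate, and a guess at toaster energy), not measurements of any particular system:

```python
# Back-of-envelope comparison of per-query vs. training energy.
# Illustrative assumptions only:
#   - ~0.3 Wh per chat query (a commonly cited ballpark)
#   - ~1,300 MWh for one GPT-3-scale training run (a published 2021 estimate)
#   - ~30 Wh to make toast (~1 kW toaster running ~2 minutes)
QUERY_WH = 0.3          # assumed energy per query, watt-hours
TRAINING_MWH = 1_300    # assumed energy per training run, megawatt-hours
TOAST_WH = 30           # assumed energy per round of toast, watt-hours

training_wh = TRAINING_MWH * 1_000_000  # convert MWh to Wh

print(f"one query ~ {QUERY_WH / TOAST_WH:.0%} of a toast")
print(f"one training run ~ {training_wh / QUERY_WH:,.0f} queries")
print(f"one training run ~ {training_wh / TOAST_WH:,.0f} toasts")
```

Under those assumptions a single query really is a couple percent of a toast, while a single training run is on the order of billions of queries, which is exactly why the per-query framing misses where the cost actually sits.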