Okay, so I'm not a big AI guy. It kind of sucks at everything we try to do with it, and it's basically a huge waste of resources right now.
But... Sometimes it's fun to play devil's advocate.
AI consumes shitloads of electricity and water, and produces nothing but slop. Even if they're not using evaporative cooling, that water use impacts the availability of usable water downstream of the data center. Also, it's a huge money pit - last I saw, AI companies weren't really turning a profit.
The article addresses electricity (Altman specifically called out a pivot to nuclear, wind, and solar), but doesn't say a ton about the other issues... Which could all be addressed with coastal data centers.
Don't worry - I'm not about to suggest heating the ocean up to cool data centers. Instead, why not pivot back to evaporative cooling, but with seawater?
Build the data center, and put some cooling pools around it - twelve seems like a good number. Make the pools big enough that the center can be cooled without using all of the pools at once (this is important). Heat sinks are made of metal, and saltwater is bad for most metals, so slap on a few sacrificial anodes, like on a metal-hulled boat. Boom - the data center is now cooled using non-potable water without warming the ocean.
Now, as water evaporates, salt deposits will form in the cooling pools. When a pool gets too salty, it can be drained (or allowed to fully evaporate), and the salt can be knocked off and collected. Boom - losses reduced, data center is now a salt farm. Salt's not really worth much, but it could probably be marked up and sold to tech bros as fancy "AI powered sea salt."
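For fun, here's a toy back-of-the-envelope model of that rotation scheme. Every number in it is a made-up assumption for illustration - pool size, daily evaporation rate, the ~35 g/L salinity of seawater, and the brine concentration at which a pool gets drained - so treat it as a sketch of the idea, not an engineering estimate.

```python
# Toy model of the rotating seawater cooling pools described above.
# All constants are illustrative assumptions, not real engineering numbers.
SEAWATER_G_PER_L = 35.0     # typical ocean salinity, grams of salt per liter
HARVEST_AT_G_PER_L = 250.0  # drain a pool once its brine gets this salty
POOL_LITERS = 100_000.0     # capacity of one pool (made up)
EVAP_FRACTION = 0.05        # fraction of water evaporated per day (made up)

def simulate(days, num_pools=12):
    """Return total salt harvested (kg) over `days` of operation."""
    # Each pool tracks [liters of water, grams of dissolved salt].
    pools = [[POOL_LITERS, POOL_LITERS * SEAWATER_G_PER_L]
             for _ in range(num_pools)]
    harvested_g = 0.0
    for _ in range(days):
        for pool in pools:
            pool[0] *= 1 - EVAP_FRACTION   # water evaporates; salt stays behind
            topup = POOL_LITERS - pool[0]  # top the pool back up with seawater
            pool[0] += topup
            pool[1] += topup * SEAWATER_G_PER_L
            if pool[1] / pool[0] >= HARVEST_AT_G_PER_L:
                harvested_g += pool[1]     # drain it, knock the salt off
                pool[0] = POOL_LITERS      # refill fresh from the ocean
                pool[1] = POOL_LITERS * SEAWATER_G_PER_L
    return harvested_g / 1000.0

print(f"{simulate(365):,.0f} kg of 'AI powered sea salt' per year")
```

Because each top-up brings in more salt while evaporation only removes water, every pool's salinity climbs steadily until it crosses the harvest threshold - which is the whole reason the data center needs more pools than it strictly requires for cooling: a couple can always be offline being scraped.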
And then, once we've done that, we can train the AI to do something useful, like... Uh... Clean its own salt pools with a little robot, I guess; it kind of sucks at everything important.