Datacentres also rely on water, to help cool the humming banks of hardware. In the UK the Environment Agency, which was already warning about a future water shortfall for homes and farming, recently conceded the rapid expansion of AI had made it impossible to forecast future demand.
Research carried out by Google found that fulfilling a typical prompt entered into its AI assistant Gemini consumed the equivalent of five drops of water – as well as energy equivalent to watching nine seconds of TV.
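Taking those figures at face value, a back-of-envelope scaling exercise (a sketch; the per-drop volume and TV power draw are my assumptions, not from the article):

```python
# Scale Google's per-prompt figures up to a billion prompts.
# Assumptions (not from the article): one drop ~ 0.05 mL, a TV draws ~ 100 W.
DROP_ML = 0.05        # assumed volume of one water drop, in millilitres
TV_WATTS = 100        # assumed TV power draw, in watts

water_per_prompt_ml = 5 * DROP_ML           # ~0.25 mL per prompt
energy_per_prompt_wh = TV_WATTS * 9 / 3600  # ~0.25 Wh per prompt

prompts = 1_000_000_000
water_m3 = water_per_prompt_ml * prompts / 1e6   # 1 m^3 = 1,000,000 mL
energy_mwh = energy_per_prompt_wh * prompts / 1e6  # Wh -> MWh

print(f"~{water_m3:.0f} m^3 of water, ~{energy_mwh:.0f} MWh of energy")
```

Under those assumptions, a billion prompts comes to roughly 250 cubic metres of water and 250 MWh, which is small next to either agriculture or domestic use.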
I mean, the UK could say "we won't host parallel-processing datacenters". If there genuinely are hard caps on water or energy specific to the UK that cannot be dealt with, that might make sense.
But I strongly suspect that virtually all applications can be done at a greater distance. Something like an LLM chatbot is comparatively latency-tolerant for most uses (it doesn't matter whether it's some milliseconds further away) and does not have high bandwidth requirements to the user. If the datacenters aren't placed in the UK, my assumption is that they'll be placed somewhere else. Mainland Europe, maybe.
Also, my guess is that water is probably not an issue, at least if one considers the UK as a whole. I had a comment a bit back pointing out that the River Tay (Scotland as a whole, in fact) doesn't have a ton of datacenters near it the way London does, and has a smaller population around it than the Thames does. If it became necessary, even at greater cost, it should be possible to dissipate waste heat by evaporating seawater rather than freshwater; as an archipelago, nearly all portions of the UK are not far from an effectively-unlimited supply of seawater.
And while the infrastructure for it doesn't widely exist today, it's also possible to make constructive use of the heat, such as district heating driven off waste heat; if you already have a city that is (undesirably) bleeding heat into the environment like a radiator, having a source of heat to insert into it can be useful:
https://www.weforum.org/stories/2025/06/sustainable-data-centre-heating/
Data centres, the essential backbone of our increasingly generative AI world, consume vast amounts of electricity and produce significant amounts of heat.
Countries, especially in Europe, are pioneering the reuse of this waste heat to power homes and businesses in the local area.
As the chart above shows, the United States has by far the most data centres in the world. So many, in fact, the US Energy Information Administration recently announced that these facilities will push the country’s electricity consumption to record highs this year and next. The US is not, however, at the forefront of waste heat adoption. Europe, and particularly the Nordic countries, are instead blazing a trail.
That may be a more useful strategy in Europe, where a greater proportion of energy is presently expended on heating than on air conditioning, unlike in the United States (presently, as I don't know what will be the case in a warming world). That being said, one also requires sufficient residential population density to make effective use of district heating. And in the UK, there are probably few places that would make use of year-round heating, so only part of the waste heat is utilized.
*looks further*

The page I linked to mentions something that London (which has many datacenters) is apparently already doing:
And in the UK, the Mayor of London recently announced plans for a new district heat network in the west of the city, expected to heat over 9,000 homes via local data centres.
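Taking that 9,000-homes figure at face value, a quick sketch of the heat load it implies (the per-household demand figure is my assumption of a typical UK value, not from the article):

```python
# What heating 9,000 homes implies about the waste-heat source.
HOMES = 9_000
KWH_PER_HOME_PER_YEAR = 12_000  # assumed typical UK household heat demand
HOURS_PER_YEAR = 8_760

annual_heat_gwh = HOMES * KWH_PER_HOME_PER_YEAR / 1e6
avg_power_mw = HOMES * KWH_PER_HOME_PER_YEAR / HOURS_PER_YEAR / 1000

print(f"~{annual_heat_gwh:.0f} GWh/yr, average ~{avg_power_mw:.1f} MW")
```

Roughly 108 GWh a year, or an average draw on the order of 12 MW, so a cluster of datacenters could supply it from waste heat alone, with room to spare for peak winter demand being the harder problem.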