
Temperature-lowering tests can quickly determine whether commercial buildings can connect to heat networks without having to upgrade their emitters
The UK government has identified low-temperature heat networks as one of the most cost-effective routes to decarbonising heat. Heat networks currently meet just 3% of total heating demand, and it is estimated that this figure will need to grow to 20% over the next 25 years.
For new-build developments, integrating with low-temperature networks is relatively straightforward. For existing buildings, however – particularly commercial ones – connecting to these systems presents technical and logistical challenges.
During the coldest months of the year, the operating temperatures of two buildings were lowered to replicate the average emitter conditions of a low-temperature network. The goal was to assess whether the existing emitters were oversized, and to explore how a building’s age might influence performance under these conditions.
New legislation and zoning
The Energy Act 2023 puts in place the regulatory framework for heat networks in the UK. This includes minimum performance standards, consumer protection measures and, importantly, the introduction of heat network zoning, set to be rolled out in 2025.
Most existing buildings within designated heat network zones will have to connect to a local low-temperature network. These zones will be defined by government, based on the lowest-cost pathway to decarbonising heat in each area. However, building operators may be given as little as 18 months' notice to make their buildings connection-ready – a short turnaround, given the level of investigation and potential upgrades required.
Technical barriers
One of the biggest hurdles to heat network readiness in existing commercial buildings in the UK is the incompatibility of traditional heating systems with the demands of modern low-temperature networks.
Most conventional heating systems in commercial buildings are designed for high-temperature operation of 82°C flow and 71°C return. In contrast, heat networks operate more efficiently, with a wider temperature differential between flow and return and significantly lower temperatures overall, such as 75/50°C. This demands that internal heating systems operate effectively with much lower emitter temperatures.
Lowering the average emitter temperature has a substantial impact on radiator performance, because output falls faster than linearly with the difference between mean water and room temperatures. Moving from 82/71°C to 75/50°C drops the average emitter temperature by 14K, from 76.5°C to 62.5°C, which reduces heating output by roughly a third if emitter sizes remain unchanged.
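As a rough illustration – taking the temperatures above, an assumed 20°C room, and the typical radiator exponent of n = 1.3 used in BS EN 442 testing – the output penalty can be estimated as follows:

```python
# Rough estimate of radiator output at reduced water temperatures, using the
# standard emitter characteristic Q = Q0 * (dT/dT0)^n with n = 1.3 (a typical
# value for radiators tested to BS EN 442). A 20°C room is assumed.

N = 1.3  # radiator exponent

def mean_water_to_air_dt(flow, ret, room=20.0):
    """Arithmetic mean water temperature minus room temperature, in K."""
    return (flow + ret) / 2 - room

dt_design = mean_water_to_air_dt(82, 71)   # 56.5K at conventional temperatures
dt_network = mean_water_to_air_dt(75, 50)  # 42.5K at 75/50°C network temperatures

ratio = (dt_network / dt_design) ** N
print(f"Output at 75/50°C is {ratio:.0%} of the 82/71°C design output")
# -> about 69%, i.e. roughly a third of the output is lost
```

This is why emitter oversizing matters: a radiator selected with a comfortable margin at 82/71°C may still cover the room's heat loss at 75/50°C, while one sized tightly to the original design will not.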
There are three key impacts of replacing emitters: increased capital cost; disruption to building operation during upgrades; and risk of not recovering costs if building occupancy or usage changes. These factors can deter building owners from committing to heat network connections without clarity on the true extent of required upgrades.
A typical way to assess whether a building can perform adequately at lower temperatures is through heat-loss modelling: comparing calculated heat losses with emitter capacities at reduced temperatures. This method is fraught with practical issues, however.
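Before turning to those issues, it is worth seeing what such a desktop check involves. The sketch below derates each emitter using the same radiator characteristic as above; the room data are entirely hypothetical:

```python
# Desktop heat-loss check: compare each room's calculated heat loss with its
# emitter output derated to low-temperature operation, using the emitter
# characteristic Q = Q0 * (dT/dT0)^1.3. All room data are hypothetical.

N = 1.3
DT_DESIGN = 56.5   # K: mean water temperature at 82/71°C, minus a 20°C room
DT_NETWORK = 42.5  # K: the same at 75/50°C

rooms = [
    # (room, calculated heat loss / W, emitter rated output at design dT / W)
    ("Office 1", 1400, 2400),      # generously oversized emitter
    ("Office 2", 1900, 2100),      # sized tightly to design conditions
    ("Meeting room", 900, 1600),
]

for name, heat_loss, rated_output in rooms:
    derated = rated_output * (DT_NETWORK / DT_DESIGN) ** N
    verdict = "OK at 75/50°C" if derated >= heat_loss else "possible upgrade"
    print(f"{name}: loss {heat_loss}W vs derated output {derated:.0f}W -> {verdict}")
```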
Many buildings lack detailed ‘as-built’ fabric information, especially older assets. Using estimated U-values based on historical regulations can also inflate calculated heat losses, leading to overestimation of upgrade needs. This underlines the importance of empirical data to supplement – or even replace – purely theoretical models. One way to generate this data is through temperature-lowering tests.
A recent case study explored the potential of temperature-lowering tests across two buildings on a wider estate. The objective was to simulate the impact of connecting to a 75/50°C heat network, with an average operating temperature of 62.5°C, using existing systems, and to determine whether the buildings could maintain occupant comfort with existing emitters.
To replicate this, the boiler flow setpoints in both buildings were lowered to 65°C, and room temperatures were assessed against a required comfort setpoint of 19-20°C.
The newer building operated with an average emitter temperature of 61°C during the test, with most rooms remaining above the comfort setpoint (Figure 1). However, two rooms struggled to meet the target. On comparison with baseline data, it was evident that these rooms had issues prior to the test, suggesting the reduced flow temperature did not cause a decline in performance. This indicates that minimal or no emitter upgrades are necessary.

In contrast, seven rooms in the older building failed to meet the comfort setpoint (Figure 2). Poor performance was seen during occupied hours, and the data indicated that many of these rooms had been underperforming before the test. This suggests both pre-existing deficiencies and further deterioration caused by the lower flow temperatures. In one room, ambient temperature dropped from 15°C to 12°C under identical external conditions after the setpoint was lowered, indicating a measurable performance decline as a direct result of the test.
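This kind of baseline-versus-test screening lends itself to simple automation where logged room temperatures are available. The sketch below is illustrative only – the column names, occupied hours and sample data are all hypothetical:

```python
# Screen logged room temperatures: flag rooms whose occupied-hours mean falls
# below the comfort setpoint during the test, then check the pre-test baseline
# to separate pre-existing issues from declines caused by the lower temperatures.
import pandas as pd

COMFORT_SETPOINT = 19.0  # °C, lower bound of the 19-20°C comfort band

def occupied_means(df):
    """Mean temperature per room over assumed occupied hours (08:00-18:00).
    Expects a DatetimeIndex and 'room' / 'temp_c' columns."""
    occupied = df.between_time("08:00", "18:00")
    return occupied.groupby("room")["temp_c"].mean()

def screen_rooms(baseline, test):
    base, trial = occupied_means(baseline), occupied_means(test)
    for room, temp in trial.items():
        if temp >= COMFORT_SETPOINT:
            continue
        pre_existing = room in base.index and base[room] < COMFORT_SETPOINT
        cause = "pre-existing issue" if pre_existing else "decline during test"
        print(f"{room}: {temp:.1f}°C occupied-hours mean ({cause})")

# Hypothetical half-hourly logs for a single room, before and during the test
idx = pd.date_range("2024-01-15 00:00", periods=48, freq="30min")
baseline = pd.DataFrame({"room": "Room 7", "temp_c": 18.6}, index=idx)
test = pd.DataFrame({"room": "Room 7", "temp_c": 18.4}, index=idx)
screen_rooms(baseline, test)
# -> Room 7: 18.4°C occupied-hours mean (pre-existing issue)
```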

The contrasting performance of the two buildings offers some clear insights. The newer building's issues are localised and largely pre-existing, so it could connect to a low-temperature network with little or no emitter work. The older building, by contrast, will probably require substantial emitter upgrades for successful heat network integration.
Temperature-lowering tests provide real-world, building-specific data – helping to reduce the likelihood of unnecessary upgrades – and improve cost certainty by clearly identifying where emitter replacements are genuinely needed. They also boost confidence among building operators by reducing the perceived risk of connecting to low-temperature heat networks.
There are some challenges to this approach, however. Chief among them is the cost of instrumentation: ambient temperature sensors are needed in each monitored space, and accurate assessment also requires flow and return temperature sensors on each heating circuit, which can be costly to retrofit. Despite these limitations, temperature-lowering tests are a valuable, evidence-based approach to informing retrofit decisions and avoiding over-engineering.
About the author
Simran Chaggar is a senior engineer at FairHeat