A recent article in The Telegraph, ‘Energy scandal: misleading efficiency claims leading to huge bills for homeowners’,1 caused quite a stir in the building physics community by stating that modellers are making misleading efficiency assertions.
The article was based on a University of Bath paper published in CIBSE's academic journal BSERT.2 The study set out to test the judgement of energy modellers on the relative significance of modelling input parameters. A sample of modellers was sent a questionnaire describing a real house and asked to rank a set of specified changes to input parameters by the influence each was likely to have on the building's annual heating demand. This task was to be carried out without modelling software.
It is unusual for modellers to be asked questions in this way. The benefit of modelling software is that it allows us to test efficiently, in a virtual environment, how multiple building variables interact dynamically. Such interactions are not necessarily straightforward because of the complex interplay of all the parameters. For example, the influence of wall insulation on energy consumption will vary, depending on the level of internal gains and the effectiveness of ventilation.
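This interaction can be illustrated with a deliberately crude steady-state sketch (not the study's model, and all figures are hypothetical): once a gain-utilisation factor links internal gains to heat losses, the same wall insulation upgrade yields different savings in different contexts.

```python
# A minimal steady-state heat-balance sketch. All inputs are hypothetical
# and for illustration only; real dynamic simulation is far richer.

def annual_heating_kwh(u_wall, ach, internal_gains_w,
                       wall_area=100.0, volume=250.0,
                       degree_hours=60_000.0, season_hours=5_000.0):
    """Crude annual heating demand (kWh) for a notional dwelling.

    u_wall: wall U-value (W/m2.K); ach: air changes per hour;
    internal_gains_w: average internal gains over the heating season (W).
    """
    # Fabric plus ventilation heat loss, weighted by annual degree-hours
    losses = (u_wall * wall_area + 0.33 * ach * volume) * degree_hours / 1000.0
    gains = internal_gains_w * season_hours / 1000.0
    # Crude gain-utilisation factor: the more gains dominate losses, the
    # smaller the fraction that usefully offsets heating demand
    utilisation = losses / (losses + gains)
    return losses - utilisation * gains

# The same insulation upgrade (U-value 1.5 -> 0.3 W/m2.K) in two contexts:
leaky_saving = (annual_heating_kwh(1.5, ach=1.5, internal_gains_w=400.0)
                - annual_heating_kwh(0.3, ach=1.5, internal_gains_w=400.0))
tight_saving = (annual_heating_kwh(1.5, ach=0.3, internal_gains_w=800.0)
                - annual_heating_kwh(0.3, ach=0.3, internal_gains_w=800.0))
print(round(leaky_saving), round(tight_saving))
```

Even in this toy model, the identical fabric measure saves a noticeably different amount of energy depending on airtightness and internal gains; ranking such effects by intuition alone, without a model, is exactly the difficulty the study's participants faced.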
This task requires a lot of information, and for modellers – or any construction professional – to predict the often subtle relative changes in outcomes without these tools is a tough ask. This view is borne out by the results: the accuracy of responses was generally poor, and did not improve with higher levels of qualification or experience.
Having good instincts for the outcome of models is a valuable skill – it enables the development of more efficient models, and helps errors to be picked up and corrected in the process. But we are not convinced this test represents a fair way to assess that skill. The Telegraph article makes the leap that poor results in this test suggest responsibility for the performance gap in new homes – even likening it to the VW emissions scandal. This seems an unfair conclusion to reach, given the nature of the study carried out.
Dynamic thermal modelling for energy use prediction is rarely performed commercially on new homes that require SAP modelling for Building Regulations Part L compliance. The article seems to point the finger at modellers for a performance gap related to the discrepancy between compliance calculations (for Part L) and in-use energy consumption. As most practitioners are aware, the two are not well correlated, because compliance calculations are not intended to predict in-use energy. For example, they omit unregulated building loads and assume normalised building use. For this reason, the industry has adopted CIBSE TM54 for in-use energy prediction; this guides modellers in producing reliable energy prediction models, identifying the sensitivity of the results to changes in the input parameters.
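The gap between the two figures is partly definitional, as a back-of-envelope sum makes clear. The figures below are invented for illustration (they come from no real dwelling or SAP output): a compliance calculation covers only regulated loads, while a TM54-style in-use estimate must also account for unregulated loads and realistic occupancy.

```python
# Illustrative, made-up annual figures (kWh/yr) for a notional home.
# A Part L compliance calculation considers regulated loads only; a
# TM54-style in-use estimate also includes unregulated loads.

regulated_kwh = {          # covered by the compliance calculation
    "space_heating": 4000,
    "hot_water": 2000,
    "fixed_lighting": 500,
    "fans_and_pumps": 300,
}
unregulated_kwh = {        # omitted from compliance, present in real bills
    "appliances": 2500,
    "cooking": 700,
    "small_power": 800,
}

compliance_total = sum(regulated_kwh.values())
in_use_total = compliance_total + sum(unregulated_kwh.values())
print(compliance_total, in_use_total)
```

With these invented numbers the in-use figure is more than half as large again as the compliance figure before any modelling error enters the picture – the two calculations simply answer different questions.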
The study suggests that modellers should have better instincts – and, with that, we'd largely agree. The modelling community does need to take responsibility for its models, working closely with the design team to validate input assumptions, questioning values that appear unrealistic, and being confident that it can justify its findings. We do our profession a disservice and invite criticism if we are not able to account for our work clearly. Modelling reports must give a thorough account of the input assumptions used, and clearly explain the methodology followed and what the results imply. It is unprofessional not to give this information, and without it discrepancies in the assumptions cannot be picked up. Isolated results do not make sense without the details on which they are based.
We take great issue with the blanket criticism meted out to the modelling community by this study, but agree that the industry needs to address the performance gap by having an open and honest discussion about the ways we work together. To point the finger at any one discipline is to disregard the influence of all the other players in this team sport of building design and construction. In particular, designers – including modellers – need to complete the circle of building design by understanding how our buildings perform in the real world, and use this feedback to create better designs, more appropriate models, and improved ways of working together.
1. Energy scandal: Misleading efficiency claims leading to huge bills for homeowners, The Telegraph, 2 May 2017.
2. The building performance gap: are modellers literate? Salah Imam, David A Coley and Ian Walker, BSERT, January 2017.
- Susie Diamond is founding partner at Inkling
- Dr Claire das Bhaumik FCIBSE is a partner at Inkling and a Level 3, 4 and 5 EPC assessor, and CIBSE Low Carbon Consultant – Building Design and Simulation