How do panellists assess the advantages and limitations of parametric cost modelling? How do you ensure good quality data? And do common data environments have an inhibitive effect on innovation? During our recent webinar with Linesight, you asked us. Here, our expert panel responds.

Kay Pitman

World Built Environment Forum Manager, RICS

Our expert panel:

Alan Muse, FRICS, Head of Building and Construction Standards, RICS

Nigel Barnes, Head of Life Sciences, Linesight

Steve Townsend, Project Controls Director – Global Capital Projects Group, GSK

 

A parametric approach to cost estimation produces an estimate from a calculation or algorithm applied to historical data. How do the panellists assess the advantages and limitations of parametric cost modelling?

Nigel Barnes: By breaking any project into its component parts, one can usually find equivalents to benchmark the elements against. If there are unique parts, or a new technology, then these must be costed separately. The information is only a benchmark to be used as a guide; it is not a replacement for a cost estimate.

Alan Muse: The accuracy of this technique is largely determined by the volume and depth of the data. It is a tool that can be useful when coupled with simulation and sensitivity analyses. However, as with all estimating techniques, the skills and experience of the professional will be important in critically challenging and interpreting the results.
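To make the idea concrete, here is a minimal sketch of the kind of calculation the panellists describe: a parametric model calibrated from historical benchmarks, followed by a simple sensitivity check. The power-law cost-capacity relationship, the figures and all names are illustrative assumptions, not real benchmark data or any panellist's actual method.

```python
# Sketch of parametric cost estimation: fit cost = a * area**b to
# hypothetical historical benchmarks, then test the estimate's sensitivity.
import math

# Hypothetical benchmarks: (floor area in m2, outturn cost in GBP)
history = [
    (1_000, 2.1e6),
    (2_500, 4.6e6),
    (5_000, 8.0e6),
    (10_000, 14.5e6),
]

# Calibrate by ordinary least squares on logs:
# log(cost) = log(a) + b * log(area)
xs = [math.log(area) for area, _ in history]
ys = [math.log(cost) for _, cost in history]
n = len(history)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
a = math.exp(y_bar - b * x_bar)

def parametric_estimate(area_m2: float) -> float:
    """Benchmark-level figure only: a guide, not a full cost estimate."""
    return a * area_m2 ** b

# Simple sensitivity analysis: how does the estimate move if the key
# input parameter (area) varies by +/-10%?
for factor in (0.9, 1.0, 1.1):
    est = parametric_estimate(7_500 * factor)
    print(f"area x{factor:.1f}: GBP {est:,.0f}")
```

The sketch also shows the limitation Nigel Barnes raises: the fitted curve can only reflect the projects in the historical set, so unique elements or new technology fall outside it and must be costed separately.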

 

We’re all familiar with the equation that bad data in = bad data out, but what steps can be taken to ensure that good data inputs always lead to good data outputs?

Alan Muse: Estimating is a process that turns inputs into outputs. Therefore, standards, adequate training and skills, assurance and automated checking can all be invaluable in ensuring a good process. This is not necessarily a closed loop: discussion, dissemination and analysis of the external environment are equally important in ensuring that good data outputs are obtained.

Steve Townsend: By using individuals who spend their days delivering projects to analyse the data, we have a good chance of understanding its background. With that understanding, these experienced individuals also help ensure the data is added to the overall database correctly.

Nigel Barnes: Good data in should at least originate from a standardised approach or Cost Breakdown System. It should also be checked or assured as part of the benchmarking process.
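As a rough illustration of the automated checking and standardised approach the panellists mention, the sketch below validates an incoming benchmark record against a standard breakdown before it enters the database. The element codes, field names and plausibility thresholds are all hypothetical assumptions for illustration only.

```python
# Sketch of automated data assurance: reject benchmark records that do not
# conform to a standardised cost breakdown or fail basic plausibility checks.

# Hypothetical standard element codes from an elemental cost breakdown
STANDARD_ELEMENTS = {"SUBSTRUCTURE", "SUPERSTRUCTURE", "SERVICES", "FINISHES"}

def validate_record(record: dict) -> list[str]:
    """Return a list of issues; an empty list means the record passes."""
    issues = []
    for field in ("project_id", "element", "cost_gbp", "area_m2"):
        if field not in record:
            issues.append(f"missing field: {field}")
    if record.get("element") not in STANDARD_ELEMENTS:
        issues.append(f"non-standard element code: {record.get('element')!r}")
    cost, area = record.get("cost_gbp"), record.get("area_m2")
    if isinstance(cost, (int, float)) and isinstance(area, (int, float)) and area > 0:
        rate = cost / area
        # Plausibility band on the unit rate (illustrative threshold)
        if not 50 <= rate <= 10_000:
            issues.append(f"unit rate {rate:,.0f} GBP/m2 outside plausible range")
    return issues

# Usage: only records with no issues are added to the benchmark database
record = {"project_id": "P-001", "element": "SERVICES",
          "cost_gbp": 3.2e6, "area_m2": 5_000}
problems = validate_record(record)
print("accepted" if not problems else problems)
```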

 

Is there any danger that common data environments could have an inhibitive effect on innovation? After all, if we’re all reading the same data, won’t we all reach the same conclusions?  

Alan Muse: Possibly, but the fact is that the vast majority of decision-making problems stem from poor data, inadequate analysis and bias. The first step is to achieve more standardised data. Once you have it, you can creatively assess the options it provides. That way, innovation is not curtailed, but mistakes are eliminated.

Steve Townsend: You could look at this the other way around. You could say that improvements and innovation come from a common baseline. Such a baseline provides the common foundation from which we understand the improvements required.