How can video game technology be applied to real-world applications? Here are the highlights from our webinar panel discussion.
‘Architects are not just in the business of creating buildings, but also of creating experiences. Gaming engines allow us to be part of that experience early in the design process, even from concept stages,’ says Martha Tsigkari, Senior Partner – Head of Applied R+D at Foster + Partners. The ability to render in real time and quickly incorporate different designs and models is a key advantage of gaming engines over traditional visualisation tools, she explains.
Gaming engines give clients a first-person experience, allowing them to move around and interact with the environment and better understand the construction process, says Terence Caulkins PhD, Associate – Advanced Digital Engineering at Arup. ‘When you are constructing a building, you only get one shot at it and so to have a virtual environment that allows you to simulate and get feedback from the end user is a very powerful proposition,’ he says.
Martha Tsigkari agrees. At Foster + Partners, gaming engines are also being used as a design collaboration tool, helping to manage client expectations, aspirations and changes. This allows people to participate in design reviews wherever they are in the world through PCs, tablets and VR headsets. Glaucon, Foster + Partners’ own proprietary tool, can be used at the design stage and also allows virtual mock-up inspections and walk-throughs in real time of buildings under construction.
Visualisation and sonification allow planners and architects to build empathy for a broader population and better understand how others experience places and the daily challenges they face. For example, Arup is using a gaming engine to simulate the experience of neurodivergent people who might find nearby sounds disruptive, uncomfortable and stressful when sitting at a work desk.
Foster + Partners has been working with UCL and City University on an app that emulates different visual impairments, showing how people who live with them perceive space, so that design decisions can be made more inclusive.
Using gaming engines can help engage communities in consultation processes and facilitate better understanding of projects, for example, how a new building may affect a neighbourhood. ‘The best way to help them understand is to put them in that experience, allow them to be part of it virtually and then make a decision,’ says Martha Tsigkari.
COVID-19 made public consultation complicated, as town hall meetings were no longer being held in person. Tools such as Arup’s Virtual Engage aim to overcome such difficulties and provide greater overall public engagement. Using renderings created in a gaming engine or 360-degree photos, the tool allows the public to virtually visit and move around a building or public space and provide feedback online.
For the expansion of London Heathrow Airport, Arup built booths and used VR headsets to recreate the auditory and visual experience of living under a flight path, emulating the sound of aeroplanes and the visual impact of being overflown by different types of aircraft.
One of Arup’s first uses of gaming engines was in 2005, on a courthouse design in Jackson, Mississippi. Traditionally, before building a courthouse a full-scale mock-up would be made. The judge would go in, people would sit in the jury box and the design would be tested. The ‘aha moment’, says Terence Caulkins, was being able to virtually recreate the courthouse in a gaming engine, simulating the acoustics and lighting in the space, saving time and money for the client and reducing waste.
Interoperability is central to any data-based application. Otherwise, explains Martha Tsigkari, you can end up with assets sitting on different software platforms that don’t speak to each other, which is a problem ‘because the entire power of using a game engine is to be able to have a real-time understanding of the effects of your design decisions in terms of the experience,’ she says. Foster + Partners has its own application for seamless interoperability, called Hermes. In parallel, the practice has been investigating software like NVIDIA’s Omniverse, which harnesses the power of Pixar’s Universal Scene Description to provide a common language and schemas between software.
Introducing haptics into the virtual world is the next sensory step that needs to be fully realised, says Terence Caulkins. Physical touch is important, and there are currently limitations to what can be achieved with gaming engines.
Pushing computation into the cloud will become more common, letting those with less powerful devices experience simulations at high frame rates through pixel streaming. The ability to virtually walk through something, interact with it and give feedback will become more widespread, he adds.
Gaming engines will improve digital planning and transform planning applications, says Martha Tsigkari. There is a big discussion around the metaverse and what it would entail for the digital space of architecture, where artefacts are no longer just physical but also digital, she says. Ultimately, it all comes down to real-time experiences. ‘And I think this idea of real-time experiences throughout the design process is something that is increasingly being incorporated into workflows and facilitated through gaming engines,’ she concludes.
In this webinar, a panel of experts explores the potential of the latest gaming engines to help architects, urban planners and others visualise and simulate concepts and projects, as well as carry out tasks such as building measurement and virtual site visits.