Imagination Creative Director Tony Currie and Digital Production Director, EMEA, Paul Marsden take us behind the scenes of our latest ground-breaking work with Epic Games’ Unreal Engine. Below, they talk us through the creative process of bringing a story to life using real-time virtual production.
Imagination has a long history of unseen technical innovation. From designing and devising solutions to aid our creativity and our clients' messaging, to intelligently gluing together divergent technologies, to creating powerful brand stories and digital and physical experiences; it’s a strong part of our success.
Imagination has leveraged the power of Unreal for live experiences, producing location-based VR experiences for Land Rover, photo-real CGI animals for interactive installations, and immersive product deep dives. Given this existing relationship with Epic Games and Unreal Engine, virtual production was a logical technology to explore next.
How to tell a story using real-time virtual production
A small team was assembled from three core specialisms (creative, technical and production) and together we went on a journey to learn as much as possible. Using our test lab at Store Street as a proving ground, we leveraged existing real-time assets we had created, adapting those environments for a virtual production pipeline. We tested studio lighting, cameras, object tracking, LED and OLED screens. Then we got under the hood of the nDisplay pipeline, exposing ourselves to genlock, framelock, screen sync and the interrelationships of networking, content creation and hardware that go into delivering final pixels to camera. At the end of the process, we had our first 10 seconds of fully virtual production test footage.
The shoot would take place in the Epic Innovation Lab in London, and we identified six considerations and opportunities to steer the creative process further:
Epic’s XR stage was 5m x 5m and not set up for a commercial shoot.
Triumph offered to lend us their new Trident 660 motorcycle.
Most virtual production examples used desert scenes.
The light cast by the LED screens is a huge advantage over greenscreen.
In studio-based filmmaking, reflections are difficult and typically rely on post-production.
The advantage of a games engine is world-building and interactivity.
So with these key considerations in mind, we set off to create a narrative that would tell a unique story of virtual production and show off some of the real advantages of using this technology. We wanted the story to be punctuated by several powerful moments:
Go big via a small XR stage and create a punchline in breaking the fourth wall.
Create a cinematic opening that the audience would mistake for the start of a feature-length film, all whilst on a tiny set.
Build a unique world set in the future and flood it with both natural and neon light, to demonstrate how adaptive and responsive the technology is.
Show the Triumph Motorcycles bike as a physical prop but use a digital twin to give depth.
Show reflective surfaces such as sunglasses, textiles, puddles and metallics to make a strong case for the advantage of needing little to no post-production.
With a love of sci-fi and the tech-noir worlds imagined by directors such as Ridley Scott and Zack Snyder, we were inspired to build a near-future dystopian world, where we could transition between day and night to explore the playfulness of light. Out of this world the protagonist, Phoenix Jaxon, was created. Born with the unique ability to experience the emotions and senses of sentient beings, Phoenix uses these powers to help solve crimes. She works for the downtrodden, forgotten fringes of society, hoping to balance the injustices of a privatised and capitalist world, where only the rich can afford justice.
An agile creative process
From the outset, we knew we needed to define a workflow enabling us to prototype our virtual set quickly, place it in the physical set, test the limitations of our hardware, and then refine and learn from the outcome right up to the point of actual shooting. Basically: build, test, learn, refine.
Initially, we started with tech visualisation, reviewing exactly how Epic’s small stage would influence and determine what we could film. We generated a millimetre-accurate Unreal mock-up of the 5x5m stage, whilst the Director of Photography (DoP) and Director fleshed out the script and shot list. Once we had an initial creative vision we started to pre-visualise it in our Unreal mock-up of the studio, exposing how creative ambition and technical fulfilment could, or couldn’t, work in the space.
From day one we had a working mock-up of our final edit, something we were able to share, review and discuss with everyone in the team.
This then created a feedback loop: where we found we couldn’t achieve a shot, we adapted our treatment and updated our pre-visualisation with our refined approach. It gave us complete transparency throughout the process; a means to ‘see’ what we were going to shoot as live action before we were on set, completely derisking the shoot.
As well as our virtual set, we generated accurate digital twins of our physical props to test optimal set arrangement, giving the DoP, AD and Director, as well as the Unreal Engine artists, a realistic view ‘through the lens’. The ability to go quickly and accurately from storyboard to pre-vis allowed the entire project team to understand what was being proposed. This created a culture of confidence that was echoed after the shoot: there were no surprises and none of the pivot moments you sometimes get when you can’t accurately predict what the output will be.
With the narrative and futuristic tech-noir aesthetic set, everything else fell into place. We selected alt-pop artist July Jones to play Phoenix Jaxon, and she immediately identified with the character.
Producing in real-time
Despite all our test shoots, the film crew were entirely new to virtual production. We knew we needed a new workflow and new comms on set to keep the team moving. There were different systems in play; this was no longer just lights, camera, action. We had to check tracking and the scale of virtual and physical objects, align the nodal points of the cameras, and blend lighting across the practical and virtual worlds even before we could frame up a shot.
Nailing the camera tracking in the space is key to virtual production. From the requirement to have tracking markers to lock onto, through to the reliability of the software and the size of the equipment you need to bolt to your already heavy camera, it really is the crux of where these two previously separate disciplines, film and games, connect.
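To give a flavour of what that alignment involves: the tracker bolted to the camera reports its own pose, not the lens’s, so a calibrated tracker-to-nodal-point offset has to be rotated into world space and added on before the virtual camera can be placed. The sketch below is a deliberately simplified, yaw-only illustration of that idea; the function name, the offset model and the values are our own assumptions, not the actual on-set calibration pipeline.

```python
import math

def virtual_camera_pose(tracker_pos, tracker_yaw_deg, nodal_offset):
    """Place the virtual camera from a tracked pose.

    tracker_pos: (x, y, z) of the tracking hardware in world space (metres).
    tracker_yaw_deg: heading of the tracker about the up axis.
    nodal_offset: (x, y, z) from the tracker to the lens's nodal point,
                  expressed in the tracker's local frame.
    """
    yaw = math.radians(tracker_yaw_deg)
    ox, oy, oz = nodal_offset
    # Rotate the local offset into world space (yaw-only for illustration).
    wx = ox * math.cos(yaw) - oy * math.sin(yaw)
    wy = ox * math.sin(yaw) + oy * math.cos(yaw)
    tx, ty, tz = tracker_pos
    return (tx + wx, ty + wy, tz + oz)

# A tracker 1.5 m up, turned 90 degrees, with the nodal point 0.2 m ahead:
# the offset swings round with the camera rather than staying on one axis.
print(virtual_camera_pose((0.0, 0.0, 1.5), 90.0, (0.2, 0.0, 0.0)))
```

If that offset is even a few millimetres out, the virtual background slides against the physical foreground as the camera moves, which is why so much on-set time goes into calibrating it.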
A future of endless creative possibilities with virtual production
Working in real-time is extremely exciting. We’ve emerged from this workstream with a much deeper understanding and an internally defined approach to all the idiosyncrasies of working with virtual production. Few of these workflows are documented anywhere, so this is key in-house knowledge we can leverage for our clients.
This breadth of content possibilities is hugely efficient and carbon-friendly, and we believe it will produce more meaningful content. In many cases, brands will no longer need to ship products around the world for multiple location shoots. From the client’s perspective, Triumph Motorcycles was very impressed with how the Trident 660 looked against, and as part of, our virtual background content, and can see how the technology provides opportunities to reuse assets across their marketing pipeline.
Virtual Production is not only a new way of creating content, it’s a new way of thinking about content altogether.