Imagination prides itself on the quality and craft of its virtual production processes, and once again the team has gone the extra mile with a new campaign in partnership with Epic Games.
Powered by a MegaGrant from Epic Games, Imagination employed Unreal Engine to create a sci-fi dystopian world, pushing the boundaries of real-time virtual production. The use of the engine enabled Imagination to truly showcase the power of Unreal in creating photorealistic sets in CGI, and the result is the spectacular spot which you can view below.
Today we go Behind the Idea with Tony Currie, Creative Director at Imagination, to learn more about a campaign with great production value.
What was the brief?
After receiving an Epic MegaGrant from Epic Games, we fully embraced real-time virtual production, proving that epic results can be sustainably achieved with little space and manpower. With our expertise, brands can use virtual production to unlock a new world of creative possibilities, offering their audiences an enriched experience across all touchpoints.
We maximised Unreal Engine's capabilities to blend the physical and digital worlds, combining live-action footage and computer graphics in real time and enabling brands to create cinematic live, or as-live, content indistinguishable from the real world. The result was a revolution in content production which will meet brands' growing demand for richer and more engaging content and experiences, while removing barriers around quality, cost and turnaround time.
As a demonstration of our virtual production and brand storytelling capabilities, we created a cinema-quality teaser featuring the new Trident 660 from Triumph Motorcycles.
How did the initial pitch/brainstorming phase go?
From the outset, we knew we needed to define a workflow enabling us to prototype our virtual set quickly, place it in the physical space of the set, test the limitations of our hardware, and then refine and learn from the outcome until the point of actual shooting. Basically: build, test, learn, refine.
Initially, we started with tech visualisation, reviewing exactly how Epic's small stage would influence and determine what we could film. We generated a millimetre-accurate Unreal mock-up of the 5x5m stage, whilst the Director of Photography (DoP) and Director fleshed out the script and shot list. Once we had an initial creative vision, we started to pre-visualise it in our Unreal mock-up of the studio, exposing how creative ambition and technical fulfilment could, or couldn't, work in the space.
This created a feedback loop: where we found we couldn't achieve a shot, we adapted our treatment and updated our pre-visualisation with the refined approach. From day one we had a working mock-up of our final edit, something we could share, review and discuss with everyone in the team. It gave us complete transparency throughout the process and a means to 'see' what we were going to shoot as live-action before we were on set, completely de-risking the shoot.
As well as our virtual set, we generated accurate digital twins of our physical props to test optimal set arrangement, giving the DoP, AD and Director, as well as the Unreal Engine artists, a realistic view 'through the lens'. The ability to go quickly and accurately from storyboard to pre-vis allowed the entire project team to understand what was being proposed. This created a culture of confidence which was echoed after the shoot: there were no surprises and none of the pivot moments you sometimes get when you can't accurately predict what the output will be.
With the narrative and futuristic tech-noir aesthetic set, everything else fell into place. We selected alt-pop artist July Jones to play Phoenix Jaxson, and she immediately identified with the character.
Despite all our test shoots, the film crew were entirely new to virtual production, so we knew we needed a new workflow and comms on set to keep the team moving. There were different systems in play; this was no longer simply lights, camera, action. We had to check tracking and the scale of virtual and physical objects, align the nodal points of cameras, and blend lighting across the practical and virtual worlds before we could even frame up a shot.
Nailing the camera tracking in the space is key to VP: from the requirement to have tracking markers to lock onto, through to the reliability of the software and the size of the equipment you need to bolt onto your already heavy camera. It really is the crux of where these two previously separate disciplines, film and games, connect.
Tell us more about the concept. How did it come to life, and why was it the right choice?
Viewers follow the story of a female protagonist Phoenix Jaxson—played by alt-pop artist July Jones—as she solves crime and delivers justice in a dystopian world. The video highlights the creative possibilities of virtual production by taking the audience on an action-packed journey through photorealistic urban environments, spanning day to night, which finally breaks the fourth wall by revealing the set and film crew behind the production.
We recreated a highly complex cyberpunk world with scenes resembling the cinematic landscapes of film directors Zack Snyder and Ridley Scott. In fact, the video's opening scene is a nod to Blade Runner and highlights the cinematic possibilities.
The intricate urban landscapes, reimagined from day to night, demonstrate the creative control and in-the-moment adjustments virtual production allows, solving the common challenge of chasing the light in traditional production.
The team has been able to deliver ambitious results with commonly tricky objects, such as the reflective surfaces of Phoenix's sunglasses and her Triumph motorcycle, seamlessly blending these with the virtual world for a photorealistic and visually stunning end result.
What was the production process like? What was the biggest challenge?
A small team was assembled from three core specialisms (creative, technical and production) and together we went on a journey to learn as much as possible. Using our test lab at Store Street as a proving ground, we leveraged existing real-time assets we had created, adapting those environments for a virtual production pipeline. We tested studio lighting, cameras, object tracking, and LED and OLED screens. Then we got under the hood of the nDisplay pipeline, exposing ourselves to genlock, framelock, screen sync, and the interrelationships of networking, content creation and hardware that go into delivering final pixels to camera. At the end of the process, we had our first 10 seconds of fully VP test footage.
We then identified six considerations and opportunities to steer the creative process further. The shoot would take place in the Epic Innovation Lab in London:
- Epic’s XR stage was 5m x 5m and not set up for a commercial shoot.
- Triumph offered to lend us their new Trident 660 Motorcycle.
- Most VP examples used desert scenes.
- The light from the LED wall is a huge advantage over greenscreen.
- In studio-based filmmaking, reflections are difficult and typically rely on post-production.
- The advantage of a games engine is world-building and interactivity.
So with these key considerations in mind, we set off to create a narrative that would tell a unique story of virtual production and show off some of the real advantages of using this technology. We wanted the story to be punctuated by several powerful moments:
- Go big via a small XR stage and create a punchline in breaking the 4th wall.
- Create a cinematic opening that the audience would mistake for the start of a feature-length film, all whilst on a tiny set.
- Build a unique world set in the future and flood it with both natural and neon light, to demonstrate how adaptive and responsive the technology is.
- Show the Triumph Motorcycles bike as a physical prop but use a digital twin to give depth.
- Show reflective surfaces such as sunglasses, textiles, puddles and metallics to make a strong case in point for the advantage of little to no post-production.
What is one funny or notable thing that happened during the production of the campaign?
Calling over the technical artist operating the Unreal engine and asking for a bridge to be moved back a few blocks, the sun to be brought closer to the horizon and a puddle to be made more reflective. The crew had a chuckle at the realisation of how flexibly and quickly changes in creative direction can be implemented. In no other medium could you behave that way; only a few years ago the thought of making such a request on set would have been crazy.
What’s the main message of the campaign and why does it matter?
The way we create and craft content is changing. There is a new tool in the toolbox and it's not just for the few. Yes, a huge hangar packed with LED will give you the breadth of a Hollywood studio, but you can also achieve scale and cinematic quality in a much smaller setting. This isn't a silver bullet, but it offers a truly sustainable, post-production-saving alternative for many types of shoots. As a Creative Director you have an enormous amount of freedom - to literally build new worlds where anything is possible, or to replicate existing scenarios that can be reused an infinite number of times. The technique will bring you closer to other disciplines as you work hand in hand when crafting the content. Stakeholders will have a much clearer idea of what they are signing off before a single shot is captured, offering full transparency across the entire pipeline.
What is one unique aspect of the campaign?
The teaser video was filmed in a 5m x 5m studio in London with a crew of only 13 people, demonstrating that epic cinematic results can be achieved with little space and manpower.
Using real-time virtual production negates the need for multi-location shoots and allows you to continue shooting if a product doesn't exist or is delayed, saving time and money and reducing the shoot's carbon footprint. Plus there's less risk of exposing sensitive information or embargoed products on location. The content can also be used to create multiple assets across different media and customer journeys, such as AR, VR, online and even print.
How long did it take from inception to delivery?
Running this entire project during the global pandemic was challenging. We faced unpredictable timings and had to allow for all the typical face-to-face meetings to happen behind screens. That said, we probably spent about 12 weeks on the project, with a large chunk of that time allocated to learning and discovering - which you wouldn't get on a traditional client shoot.
What do you hope it achieves for the brand?
Working in real time is extremely exciting. We've emerged from this workstream with a much deeper understanding and an internally defined approach for all the idiosyncrasies of working with virtual production. None of these workflows are really documented anywhere, so this is key in-house knowledge we can leverage for efficiency for our clients.
The breadth of content possibilities will be hugely efficient and carbon-friendly, and we believe will produce more meaningful content. In many cases, brands will no longer need to ship products around the world for multiple location shoots. From the client's perspective, Triumph Motorcycles were very impressed with how the Trident 660 looked against, and as part of, our virtual background content, and have seen how the technology provides opportunities to reuse assets across their marketing pipeline.
Virtual Production is not only a new way of creating content, it’s a new way of thinking about content altogether.
Credit list for the campaign?
Opening Title Logos
- Territory Studio
- Epic MegaGrant badge
- Tony Currie - Creative Director
- July Jones
- Paul Marsden - Executive Producer
- Greg Hobden - Live Action Producer
Director of Photography
- James Medcraft
Unreal pipeline and XR stage management
- Simon Levitt - Creative Technology Director
- Allandt Bik Elliott - Technical Lead
- Jamie Shilvock - Snr Technical Artist
- David Bailey - Technical Artist (pre-vis)
Hardware & Networking
- Tom Pitt Chambers - Head of Production Technology
Scenic Production Design
- Anthony Neale - Art Director
- David Spiers - Production Director
- Tom Ribot - Art Director’s Assistant 1
- Fingal Green - Art Director’s Assistant 2
- Michael Appleby - Master Carpenter
- Martin Swann - Editor
- Navide Apicella - Producer
- Ella Cade - Production Manager
- Nick Butler - Set Crew
- Matthew Bennett - Set Crew
- Paul Barton - Cameraman (Behind The Scenes)
- Morgan Spencer - Focus Puller
- Leopole Naessens - Gaffer
- Jac Hopkins - Grip
- Jeff Celis - Electrician
- Tyler Sinclair - Electrician
- Finn Sheriton - Digital Imaging Technician
Hair and Makeup
- Ellie Bond - Hair Stylist
- Charlotte Fitzjohn - Make Up Artist
- Crispian Covell - Sound Design, Mix and Foley
- Realtime scenery by Territory Studio
- Realtime Trident 660 by In2Real
- Rae Ashurst
- Time Based Arts
- Somebody Like Me written by Warcub
- Ross Wheeler - Global Business Director
- Cassandra Harris - Client Services Director
- Anton Christodoulou - Chief Technology Officer
- Helen Lawrence - Global Head of Marketing & PR
- Simon Beddoe - Head of Business Development
- Clare Johnston - Business Director
- Triumph Motorcycles
- Epic Innovation Lab London
- Static Lights
- Dan Burgess
- Tom Burford
- Mark Williamson
- Glyn Williams
- Jayne Robinson
- Jo Holley