Extended Reality Group Optimizes Content and Unreal Engine Workflow for Historic Resorts World Las Vegas Campaign

In April, the new US $4.3 billion, 3,500-room Resorts World Las Vegas luxury destination launched its “Stay Fabulous” campaign with an immersive, groundbreaking short film featuring Celine Dion, Carrie Underwood, Katy Perry, Luke Bryan, Tiësto, and Zedd. Created by Hooray Agency and Psyop to build excitement for the resort’s recent grand opening and convey the entertainment in store for guests, the film relied on pioneering production workflows from the agency and its production partners. Among the production’s key technology vendors was Extended Reality Group (ERG), a Las Vegas-based virtual production and creative solutions studio.


Launched last year by founders Evan Glantz, Zack Kingdon, Keith Anderson, and Patrick Beery, ERG brings together a roster of artists, integrators, and production experts who pioneer new applications for XR, MR, AR, and VR, extending reality in new dimensions. While most of ERG’s first collaborations remain confidential, its reputation is growing quickly in the world of cutting-edge virtual production.

In the campaign’s credits, ERG lists Glantz, Kingdon, CG Supervisor Darrion Granieri, and Senior Unreal Artist Patrick Beery under Unreal Engine (UE) Workflow Optimization and Creative Services. The project team also included Virtual Lighting Director Alvaro Turino Grosso, Senior Technical Artist Warrell Andrew, Project Manager Andrea Frey, and a group of highly skilled animators spread around the globe.

Building Virtually Tracked Content

Supporting the opulent vision of Psyop’s Director and Founding Partner Marco Spier and his colleagues, ERG began its assignment by optimizing the geometry and textures of models provided by Psyop for LED volume production. “In direct collaboration with Marco, we were tasked with doing all the scene building, lighting, and final touches,” Glantz began. “We were responsible for optimizing, and in some cases creating, the content appearing in the volume, which is the space where the virtually tracked content is presented for motion capture. In other words, the volume comprises the LED screens on which Unreal Engine renders all the complex visual effects shots in real time during the live-action shoot.”

Drilling further into a workflow widely hailed as the future of production, ERG’s artists began with models created in Autodesk Maya, Maxon’s Cinema 4D, SideFX’s Houdini, and Adobe’s Substance Designer. Those assets were then integrated into the UE volume environment, where optimization largely focused on geometry.

According to Beery, “The high number of animation and lighting elements interacting in this project required a tight polygon budget to render everything at an acceptable frame rate. Our workflow allowed us to optimize and re-topologize these models to work well within Unreal, and required us to model each component and environment accordingly, to meet the industry standard for motion capture.”
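For teams following a similar approach, this kind of budget check can be scripted in the Unreal Editor. The snippet below is a minimal, hypothetical sketch using the editor’s Python scripting (it assumes the Python and Editor Scripting Utilities plugins are enabled); the content path and vertex budget are placeholders, not values from this production.

```python
# Hypothetical polygon-budget audit for static meshes in an Unreal project.
# Assumes the Unreal Editor Python plugin and Editor Scripting Utilities are enabled;
# the path and budget below are illustrative placeholders only.
import unreal

CONTENT_PATH = "/Game/Environments"   # placeholder content folder
VERT_BUDGET = 50000                   # placeholder per-mesh vertex budget for LOD 0

asset_paths = unreal.EditorAssetLibrary.list_assets(CONTENT_PATH, recursive=True)

for asset_path in asset_paths:
    asset = unreal.EditorAssetLibrary.load_asset(asset_path)
    if not isinstance(asset, unreal.StaticMesh):
        continue
    # Vertex count of the highest-detail LOD (LOD 0)
    verts = unreal.EditorStaticMeshLibrary.get_number_verts(asset, 0)
    if verts > VERT_BUDGET:
        unreal.log_warning(f"{asset_path}: {verts} verts exceeds budget of {VERT_BUDGET}")
    else:
        unreal.log(f"{asset_path}: {verts} verts within budget")
```

A report like this simply flags which assets need re-topology; the modeling work itself still happens back in the DCC tools mentioned above.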

This artistry resulted in the vibrant worlds where the campaign’s talent seamlessly appears in the finished brand film, but there is more to know about how the content was made interactive to serve the filmmakers’ needs during production.


Hybrid UE Workflow for Real-Time Visual Effects

Following Psyop’s aim to leverage the talent and technologies behind the award-winning Disney+ series “The Mandalorian,” the campaign’s physical production took place at LA’s Nant Studios and involved acclaimed director of photography Matthew Jensen, ASC.

As part of its pre-production services, and understanding how strongly preparation shapes in-camera virtual production, ERG created and programmed several different “baked” lighting scenarios for instant activation on set. These covered factors such as varying sun positions and effects such as “lazy God rays” with appropriate shadows.

“This approach of preparing baked lighting workflows saves a huge amount of time on set,” Turino Grosso explained. “In settings like this, it allows us to give the director and DP content that is consistent across all playthroughs, which can be invaluable for blocking and rehearsals.”
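As a rough illustration of what “instant activation” of pre-built lighting can look like in practice, the sketch below models a hypothetical cue sheet that maps named scenarios (sun position, god-ray intensity, shadow behavior) to a fixed set of parameters a technician could recall on set. The scenario names and values are invented for illustration and are not taken from the production.

```python
# Hypothetical on-set cue sheet for pre-baked lighting scenarios.
# Scenario names and parameter values are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class LightingScenario:
    sun_pitch_deg: float      # sun elevation above the horizon
    sun_yaw_deg: float        # sun heading
    god_ray_intensity: float  # 0.0 disables the effect
    long_shadows: bool        # whether to cast long, low-angle shadows

CUE_SHEET = {
    "morning_sun": LightingScenario(15.0, 95.0, 0.6, True),
    "high_noon":   LightingScenario(75.0, 180.0, 0.1, False),
    "golden_hour": LightingScenario(8.0, 265.0, 1.0, True),
}

def activate(cue_name: str) -> LightingScenario:
    """Look up a pre-built scenario so it can be pushed to the engine as a single unit."""
    scenario = CUE_SHEET[cue_name]
    print(f"Activating '{cue_name}': {scenario}")
    return scenario

if __name__ == "__main__":
    activate("golden_hour")
```

Keeping every parameter in one named bundle is what makes each playthrough repeatable, which is exactly the consistency Turino Grosso describes for blocking and rehearsals.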

Anticipating on-set demands for dynamic, interactive lighting, Turino Grosso programmed full cue-to-cue and in-engine control via DMX. This allowed him to run both UE environmental lighting and the physical lighting on set from a grandMA3 lighting console. The results included on-the-fly color adjustments and control over the speed of the pinball’s roll, among many other effects.
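Under the hood, a console such as the grandMA3 typically transmits DMX universes over the network using a protocol like Art-Net or sACN, and the engine maps incoming channel values to parameters such as light color or animation speed. The sketch below is a generic, standalone Art-Net listener written purely for illustration; it is not ERG’s setup or Unreal Engine’s DMX plugin, and the channel assignments are invented.

```python
# Minimal, illustrative Art-Net (ArtDMX) listener: receives DMX channel data over UDP
# and maps the first three channels to an RGB color. A generic protocol sketch only,
# not the production's actual pipeline or Unreal Engine's DMX plugin.
import socket
import struct

ARTNET_PORT = 6454
ARTNET_HEADER = b"Art-Net\x00"
OP_DMX = 0x5000  # ArtDMX opcode, stored little-endian in the packet

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", ARTNET_PORT))

while True:
    packet, _addr = sock.recvfrom(1024)
    if not packet.startswith(ARTNET_HEADER):
        continue
    opcode = struct.unpack_from("<H", packet, 8)[0]
    if opcode != OP_DMX:
        continue
    universe = struct.unpack_from("<H", packet, 14)[0]
    length = struct.unpack_from(">H", packet, 16)[0]
    dmx = packet[18:18 + length]
    if len(dmx) >= 3:
        # Hypothetical mapping: channels 1-3 drive a light's RGB color (0-255 each).
        r, g, b = dmx[0], dmx[1], dmx[2]
        print(f"Universe {universe}: color = ({r}, {g}, {b})")
```

The same channel data that drives virtual lights in the engine can be patched to physical fixtures, which is what lets one console run both sides of the set in lockstep.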

For ERG, Turino Grosso, Beery, and Granieri actively participated in the production, providing quality control during the rehearsals and shoot, in concert with executives from Nant Studios and Epic Games.

“Along with Virtual Production Supervisor Lawrence Jones, Marco gave us an amazing set of references and concept art that helped us understand the goal of telling a dynamic story blending classic Las Vegas with the newest technology,” Kingdon added. “In this case, the use of bold physical set pieces to bend the line between photoreal and surreal elements – what we call photo-surrealism – worked like a charm. We were honored to be part of this epic, groundbreaking campaign, alongside the world’s premier talents.”

