2022 is On its Way in Real Time

An xR stage at Savannah College of Art and Design shows a scene running on Unreal Engine. Photo by Vickie Claiborne

Hi friends! I’m easing back into writing again after our columns were put on hiatus in March 2020, and narrowing things down to just one topic is no small task! So, to get started, I decided to make a short list of the new features and advancements that have come to the top media servers over the last year and a half. For most of them, the biggest item at the top of that list is the same: render engines and real-time generated content.

While 2020 brought the complete shutdown of the live events industry, the broadcast and film industries continued to charge on and even thrive. For many projects, virtual production became the only viable way to continue, and many directors of photography and cinematographers turned to gaming engines like Unreal Engine and Unity to provide highly realistic backgrounds on virtual production stages. I’m happy to say that as the live events world slowly recovers in 2021, these real-time content creation engines are being incorporated into many live events as well, so their popularity continues to grow.

What are Render Engines?

Unreal Engine (unrealengine.com) and Unity (unity.com) are software-based gaming engines that require serious processing power to run at industry-standard frame rates without dropping frames. Both offer visual scripting environments that let you connect ‘nodes’ containing objects such as effectors and animators to build scenes. Using them proficiently does require some training, but fortunately, mastering the basics is made easier by free e-learning courses and videos available on their websites as well as on YouTube and other social media platforms.

A scene built in a render engine really only requires a computer to be played back. However, the drive to adopt these types of scenes in live events, film, and broadcast has pushed the development of software tools and workflows that allow the scenes to be controlled by third-party hardware and/or software, creating new opportunities for integrating real-time content into live playback.

It’s All About Control

There are several ways a render engine scene can be configured to be controlled by a media server, DMX console, OSC device, or other type of remote controller. Here is a quick look at just a few of the exciting ways of programming with real-time content:

Exposed Parameters

While building a scene in a render engine such as Unity or Unreal takes some time to learn, controlling that scene from a media server is actually very similar to controlling the attributes of a piece of pre-rendered content. The scene can be configured so that the operator controls attributes like brightness and color directly from the media server, or from another remote controller via OSC or DMX. Individual attributes of an object such as a prop or a camera can be exposed and assigned to a specific ‘stream,’ which makes them controllable from within the remote controller. If that controller is a media server such as disguise, the exposed attribute can be keyframed; if it is a DMX console, a lighting programmer controls the object with a channel.
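To make the ‘stream’ idea concrete, here is a minimal sketch of what a remote controller actually puts on the wire when it sets an exposed parameter over OSC. The address path `/scene/sun/brightness` is purely hypothetical; the real path depends on how the attribute was exposed and named in the project.

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one float argument.

    Per the OSC spec, strings are NUL-terminated and padded to a
    4-byte boundary, and floats are 32-bit big-endian.
    """
    def pad(s: bytes) -> bytes:
        return s + b"\x00" * (4 - len(s) % 4)  # always at least one NUL

    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Hypothetical exposed-parameter address: set a light's brightness to 75%.
packet = osc_message("/scene/sun/brightness", 0.75)
# The packet would then be sent via UDP to the render-engine host, e.g.:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, port))
```

The same exposed attribute looks different from a DMX console, of course; there it is simply a patched channel, with the engine (or an intermediary plugin) translating channel values into parameter changes.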

Sequenced Animations

It is also possible to create an animation within the render engine scene and then control that animation remotely. For instance, within Unreal, an attribute of an ‘actor,’ such as its position, can be assigned to a timeline and sequenced; from within a media server, that sequence can then be controlled the same way as a pre-rendered piece of content, using keyframes or DMX values. Unity has a similar feature known as Time Control that can be configured to work in much the same way.
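The mapping behind this is straightforward: the controller’s value range is scaled onto the sequence’s duration, so a fader effectively becomes a scrub bar. A minimal sketch, assuming a single 8-bit DMX channel driving the playhead:

```python
def dmx_to_playhead(dmx_value: int, duration_s: float, fps: float = 60.0):
    """Map an 8-bit DMX value (0-255) onto a sequence playhead.

    Returns (time_in_seconds, nearest_frame). A value of 0 parks the
    sequence at its start and 255 at its end; values in between scrub
    proportionally, which is how a console fader can 'play' a sequence.
    """
    t = (dmx_value / 255.0) * duration_s
    return t, round(t * fps)

# e.g. a 10-second sequence driven by one DMX channel:
start = dmx_to_playhead(0, 10.0)    # (0.0, 0)
end = dmx_to_playhead(255, 10.0)    # (10.0, 600)
```

For finer resolution than 256 steps, the same idea extends to a 16-bit pair of DMX channels, exactly as with pan/tilt on a moving light.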

Levels

Scenes created in a render engine can also contain multiple ‘levels,’ with individual objects in the scene assigned to separate levels. Once assigned, you can select between levels from a remote controller, making it possible to switch between looks without loading another scene.
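From a console, level selection is typically just another channel, with the 0–255 range split into bands, one per level. A quick sketch of that mapping (the level names here are invented for illustration; real projects name their levels in the engine’s editor):

```python
# Hypothetical level names standing in for levels defined in the scene.
LEVELS = ["Warehouse", "Forest", "Skyline"]

def level_for_dmx(dmx_value: int, levels=LEVELS) -> str:
    """Split the 0-255 DMX range into equal bands, one per level."""
    band = 256 // len(levels)                       # 85 for three levels
    return levels[min(dmx_value // band, len(levels) - 1)]
```

This is the same banding convention consoles already use for gobo or macro channels, so the approach feels familiar to a lighting programmer.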

Virtual Lighting

Lighting is another element that can be added to a render engine scene and then controlled from a remote controller such as a lighting console. This is common on virtual production stages, where LED lighting is added both in the virtual scene and on the physical stage to reinforce the appearance of a seamless environment between the real and virtual worlds. Carbon (carbonforunreal.com), a plugin for Unreal created by Imaginary Labs, takes this concept a step further as the first lighting visualizer built on a render engine. These may be the early days of the next generation of visualizers, bringing the worlds of lighting, video, scenery, automation, and effects even closer together while moving toward a single integrated pre-visualization tool.
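When a console drives virtual fixtures, the data usually travels as DMX-over-Ethernet; Art-Net is a common choice. As a sketch of what actually reaches the render-engine host, here is the standard ArtDmx packet layout. The idea that channel 1 maps to a virtual key light’s intensity is an assumption for illustration; the actual patch depends on how the scene’s fixtures were configured.

```python
import struct

def artnet_dmx(universe: int, channels: bytes) -> bytes:
    """Build an Art-Net ArtDmx packet carrying one universe of DMX data."""
    data = channels.ljust(2, b"\x00")      # ArtDmx requires an even length >= 2
    return (
        b"Art-Net\x00"                     # packet ID
        + struct.pack("<H", 0x5000)        # OpDmx opcode, little-endian
        + struct.pack(">H", 14)            # protocol version, big-endian
        + bytes([0, 0])                    # sequence, physical
        + struct.pack("<H", universe)      # universe, little-endian
        + struct.pack(">H", len(data))     # data length, big-endian
        + data
    )

# Hypothetical patch: channel 1 = virtual key light intensity, at full.
packet = artnet_dmx(universe=0, channels=bytes([255]))
# The packet would be sent via UDP on port 6454 to the render-engine host.
```

On the engine side, a DMX plugin (Unreal ships one) listens for these packets and routes channel values to the virtual fixtures.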

Multi-User Editing

In situations where multiple programmers need to work on the same project simultaneously, it is possible to configure a session in which multiple instances of a render engine project are running, enabling several users to edit the project at once. The concept is similar to the way lighting programmers work on a network of consoles connected to a single show file: one user can work in one area of the scene while another works elsewhere, making changes and updating while the session is running.

Content rendered in real time can be used in all the same ways as pre-rendered content: for scenery and set backgrounds (backplates), lighting effects such as pixel mapping, and virtual elements such as AR and xR (front plates and virtual set extensions), with the added benefit of being uniquely rendered during playback. Gaming engine technology has improved real-time rendering to the point where playback can sustain 60 fps and higher on suitably equipped machines, opening the door to richly realistic worlds ready to be incorporated into all aspects of entertainment. This technology has already made a huge impact on the film and broadcast industry, and it is certain to do the same for live events. So get ready to ride the real-time wave into 2022 as live events continue to recover!