Realtime Effects Dev

[Image: HighresScreenshot00008.png]

Effects tend to be the most processing-heavy and render-heavy aspects of VFX: things like fire, water, destruction, and magic. These are often very complex systems with very complex solutions, and in non-realtime rendering there is very little limitation on what you can do or how you can do it. The quality of the final image is the absolute priority, and studios are willing to tolerate massive files and massive render times to come out on the other end with a beautiful picture. Working in film for the past decade, I amassed a huge amount of experience in dismissing efficiency as the least of my concerns. So as I began to tackle creating effects for VR - things that need to run in real time and be renderable from two angles simultaneously - I knew I would not ultimately be able to work in the way I was used to.

It made the most sense to me to approach developing the look of the effects without restrictions in Houdini, and then attempt to figure out how best to approximate that look in Unreal. This resulted in plenty of false starts, but it taught me a lot and ultimately drove the concept of the effects in a new direction, at the convergence of my film-based process and the reality of what is achievable.

THE VOID

Our biggest effects focus is “The Void” - a disembodied, eerie environment that the viewer returns to throughout the story. The concept started with reference to Thomas Wilfred’s “Lumia,” photographs of light refracted through pieces of colored glass that build structured, nebular shapes.

[Images: ThomasWilfred01.jpg, ThomasWilfred03.png, ThomasWilfred02.jpg]

To begin with, I built delicate volumes out of lines advected through a simple smoke simulation, so I could get complicated organic shapes while keeping a crisp, structured feel in the end result.
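Conceptually, the advection step is simple. Here is a minimal Python/NumPy sketch of the idea, outside of Houdini: `sample_velocity` is a hypothetical stand-in for the smoke sim’s vel volume, and the integration is plain forward Euler, so treat it as an illustration rather than the actual setup.

```python
import numpy as np

def sample_velocity(positions, t):
    """Hypothetical stand-in for sampling the smoke sim's vel volume.
    A real setup would trilinearly sample the simulated field here."""
    x, y, z = positions[:, 0], positions[:, 1], positions[:, 2]
    # A simple swirl with a slow upward drift, for illustration only.
    return np.stack([-z, np.full_like(y, 0.1), x], axis=1) * 0.5

def advect_points(points, t0, t1, dt=1.0 / 24.0):
    """Step points through the velocity field (forward Euler),
    recording the trail each point leaves behind."""
    trails = [points.copy()]
    t = t0
    while t < t1:
        points = points + sample_velocity(points, t) * dt
        trails.append(points.copy())
        t += dt
    return np.stack(trails)  # (steps, num_points, 3)

# Seed a straight line of points; each one traces out a curve.
seed = np.stack([np.linspace(-1, 1, 200),
                 np.zeros(200),
                 np.zeros(200)], axis=1)
curves = advect_points(seed, 0.0, 2.0)
```

Connecting each point’s successive positions into a polyline gives the delicate, smoke-shaped strands - and because the result is curves rather than raw density, it keeps that crisp structure.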

Visually, we were happy with the result, but it pretty much goes without saying that getting these volumes into Unreal turned out to be a huge challenge. Ryan Brucks is an Epic Games developer who has done some amazing work creating a volumetric ray-marching material in Unreal. The Houdini GameDev toolset has a Volume Slice tool which interfaces with Ryan’s plugin (it essentially cuts the volume into tons of thin slices along one axis and bakes them into a texture that the UE material uses to rebuild the volume). While this workflow was successful to some extent, I found it impossible at this stage to get the volume resolution I needed for the effect to hold up, and on top of that, the paradigm of how these volumes get rendered in Unreal falls over as soon as the viewer enters the bounds of the volume. (I was recently told that this problem is solved if you reverse the normals (?) This confuses me BUT is exciting if true. I haven’t had a chance to try it out yet.)
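To make the slicing idea concrete, here is a rough Python/NumPy sketch of the bake, assuming the volume has already been sampled into a density grid. The real Volume Slice tool handles resolution, channel packing, and writing the texture that Ryan’s material expects; this only shows the tiling layout.

```python
import numpy as np

def bake_volume_to_atlas(density, tiles_x):
    """Tile the Z-slices of an (X, Y, Z) density grid into a single
    2D atlas - the 'pseudo-volume' layout a ray-marching material
    can reassemble by mapping depth to a tile index."""
    sx, sy, sz = density.shape
    tiles_y = -(-sz // tiles_x)  # ceil division: rows of tiles needed
    atlas = np.zeros((tiles_y * sy, tiles_x * sx), dtype=density.dtype)
    for z in range(sz):
        row, col = divmod(z, tiles_x)
        atlas[row * sy:(row + 1) * sy,
              col * sx:(col + 1) * sx] = density[:, :, z].T
    return atlas

# A 64^3 volume packed as an 8x8 grid of slices -> one 512x512 texture.
density = np.random.rand(64, 64, 64).astype(np.float32)
atlas = bake_volume_to_atlas(density, tiles_x=8)
```

The ray marcher then steps through the volume’s bounding box, converts each sample’s depth into a tile (blending between adjacent slices), and accumulates density. This is also why resolution hurts: doubling the volume resolution on all three axes means eight times as many texels in the atlas.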

At this point, it made the most sense to use geometry to build the shapes and then focus on rendering them in a way that gives the illusion of being volumetric. Again through the Houdini GameDev toolset, I was able to export soft-body vertex animation of these shapes moving in a looping sine-wave deformation. Vertex animation is a process by which the positions of the vertices are baked into a texture, frame by frame, and an Unreal material can use that texture to look those positions up over time. As far as effects go, this is a cheap one, because texture lookups in Unreal are relatively efficient. I’ve begun to realize that in order to do anything cool, vert anim is the most reliable option.
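As a sketch of what the bake amounts to: assuming per-frame world-space positions have already been sampled, the positions are remapped into a 0-1 range and written into a texture with one row per frame and one column per vertex. (The actual GameDev tool stores offsets from the rest pose and packs the data more carefully; this is just the core idea.)

```python
import numpy as np

def bake_vertex_animation(frames):
    """frames: (num_frames, num_verts, 3) positions per frame.
    Returns a float 'texture' with one row per frame and one column
    per vertex, plus the bounds the material needs to un-remap."""
    lo = frames.min(axis=(0, 1))  # per-axis minimum over all frames
    hi = frames.max(axis=(0, 1))
    tex = (frames - lo) / np.maximum(hi - lo, 1e-8)  # remap to 0-1
    return tex.astype(np.float32), lo, hi
```

On the Unreal side, each vertex is given a UV pointing at its column, the current frame selects the row, and the sampled value is un-remapped with the stored bounds and applied as a world position offset in the vertex shader - one texture fetch per vertex, which is why it is so cheap.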

As I began rendering these shapes with transparency and emission, trying to get that Thomas Wilfred light quality, I saw that both of those things can really flatten out an object in VR. Somehow, even though I was seeing them in a 3D space, I wasn’t believing that they were three-dimensional, because they didn’t have lighting that was physically based in any way. I have developed tons of magical effects in my career that are both transparent and emissive, without realizing that part of why they were successful was that the image was 2D, where the lack of realism in the lighting is somehow less noticeable.

These particular void effects were a lot more foreboding and concrete when they were rendered with an opaque shader and lit without emitting any light themselves. This led us down a pretty different path from the one we had originally intended, but we are really happy with where it ended up. I started creating stringier forms that feel organic, and that pushed our concept toward feeling more like the inside of a womb, which resonates with the themes of the film.

One of the biggest challenges of doing effects in realtime is sifting through all of the philosophies out there about best practices. Many artists come at realtime effects from a rigid games perspective, where their approach begins with the technique and the visual is limited by it – or, conversely, from an expansive VFX perspective, where their approach begins with the image and they then have to severely compromise their vision in order to wedge it into the game engine.

Both approaches are valid in certain contexts, but for something like cinematic VR, it seems to me that the games approach to effects – which is often based on sprites (2D images on cards) – is never going to get the 3D look that rendering from two eyes in VR requires. In a certain way, I find my ignorance about game effects useful: it allows me to begin by operating with no restrictions. And my deep knowledge of effects allows me to use many different tactics to approach a target that isn’t necessarily fixed - we are just trying to figure out what is achievable and make it as beautiful as possible.

Stay tuned for more vert anim…

Sonya Teich