Local team tries its hand at big-budget visual effects

Charlie Christensen, the director of the film research project, hopes to see whether virtual reality can be a viable option for independent films shooting special effects.

After Disney and Industrial Light & Magic made waves with the Disney+ show The Mandalorian, Lethbridge College instructors and students decided to see if the same technology could be used by independent filmmakers.

The team behind the research project consists of instructors and alumni of the Digital Communications and Media and Multimedia Production programs from Lethbridge College.

Christensen, a graduate of the MMP program, has been excited to try this new style of filmmaking ever since he discovered how the latest installment in the Star Wars universe was filmed.

“Realizing what kind of door that’s opening to filmmakers to not have to travel the world to go to different locations, where everything can be done in one place and you can, you know, place your actors in any kind of situation that you want, I thought that was pretty incredible,” said Christensen.

The research has revolved around filming a short film the traditional way, then shooting it again with the newer method, which creates backgrounds in real time using free software called Unreal Engine, a real-time 3D game engine used to create popular video games.

DCM students Devin Carroll (right), Rochelle Sciorino (left) and Rhett Ripplinger (holding camera) get ready to shoot a scene with actor Day Chase (middle).

Traditionally, visual effects, or VFX, are done by filming actors in front of a monochromatic screen, usually blue or green, so that a new background and other characters can be added later in post-production.

This type of filmmaking is effective but requires many hours of post-production work to make the environments look believable. It also gives the actors nothing to react to, because they can see nothing but the screen.

During the filming of the 2019 version of The Lion King, director Jon Favreau and his production team embraced the idea of using video game software to create virtual backgrounds they could film in real time.

This essentially moved VFX from post-production to pre-production: all environments had to be created before filming so they could be displayed on a screen while the actors were filmed.

This idea was then further refined on the production of The Mandalorian. An entire set was built with LED video screens for walls and ceiling so a character could be placed in a virtual environment in real time. This set was dubbed The Volume.

For The Mandalorian, this meant they could, for example, reflect the environments from the LED screens on the character Mando’s silver armour, something that would be much harder to achieve traditionally with a green screen in post-production.

It also meant they could set up an ideal virtual environment and lighting at any time of day or night, regardless of the weather.

“I mean if you understand what kind of environment you want to shoot in, you know it only takes a little bit of scene prep and building a little bit of practicals and then you’re not worrying about clouds and rain and wind and all of that stuff,” said Christensen.

Unfortunately, working out how to get the new system running has been challenging.

“It’s a steep learning curve. It’s very tech heavy and if the tech is working, it’s great,” said Christensen.

Allyson Cikor, an instructor in the MMP program, has a background in video game design and was able to offer insight into the technical side of making the project work.

“My background is primarily in game development, mostly as an environment artist, so I make a lot of the props and the scenes that you would see in the world of video games as well as in virtual reality,” said Cikor.

Getting things to work has taken a lot of trial and error, as the technique is still new and few people, at least in southern Alberta, know how to make the different pieces of gear work together.

“A game engine is for making games and VR hardware is for playing around in VR and we’re really trying to merge all of this stuff and make it do things it wasn’t necessarily designed to do,” said Cikor. “It’s been a struggle.”

As the technology becomes more powerful and cost-effective, it should give smaller, lower-budget productions access to more Hollywood-style VFX.

“Well, the process itself I think needs to be streamlined and I think it’ll move that way as the industry adapts the technology. The use case is going to push the improvement of the software and the hardware and everything in between and I think it’s really proven itself as a technology, at least at the very high level. Enough that now we’re seeing Unreal, directly speaking to the virtual production community, developing that into their engine, creating a whole support team for larger studios around it. So, I think it’s definitely going to grow and improve,” said Cikor.

Due to COVID-19 and ever-present technical issues, the film is still in production, but it should be completed by June.
