Laura Frank is the proprietor of Luminous FX, a company that specializes in getting properly formatted media to the right projectors or LED walls at an event. In these days of a dozen or more pixel-covered surfaces on one stage, she has carved out a niche for herself. While she is not the only person in the entertainment business tasked with this job, it is the unique way her company operates that sets her apart from the rest.
When we met years ago, you were a highly sought-after lighting programmer. How did you get started in this field?
Laura Frank: I got the theater bug while in high school down in Texas. But I paid for college by working in the theater at the small Marlboro College in Vermont. I wanted to study Physics and Theater, but the funds weren’t there for me to stay at a private school. When I found out the TD was leaving, I approached the school. I told them I’d run the theater, build all the sets and manage the lights if they would give me credits and classes. They went for it and used the money they saved to buy an ETC Microvision console, my first lighting desk.
How did your love for physics tie into all this?
Lighting was the easiest thing in theater to tie to physics. Electricity and the energy sources that make light are fascinating. I wrote this crazy thesis on Infrared Reflective Film Technology. I also considered using dance as a bridge between theater and physics. I ran a small dance company made up of students. For many of our shows, I set up all the lighting and ran cues during the show, as well as dancing in a few pieces.
Eventually you landed in NYC, looking for a gig.
I walked to the Joyce Theater (a small dance theater in the Chelsea area) one day and ran into someone I knew from an internship at the Jacob’s Pillow summer dance festival in the Berkshires [Becket, MA]. I managed to get some freelance work there for a while. Eventually a colleague introduced me to someone at Vari-Lite; they were just opening a NYC office around 1993. I got an interview and spent about 10 years doing freelance work setting up Vari-Lite rigs and running lights.
Somewhere along the line you moved from programming lighting into the video field. How did that happen?
I was really motivated by the arrival of the Icon M fixture from Light and Sound Design. It was the first moving light that was a moving projector. This morphed into the M-Box media server eventually. I thought to myself, if this is the direction I want to go in my career, this digital lighting stuff, I had better teach myself the language of graphics and video.
So when did you first start working with video elements?
It was 2000. I had a Catalyst system and projectors with DMX-controlled mirror attachments. I was in Grand Central Station with WorldStage and I was programming moving images on the ceiling for a holiday display.
So you don’t run lights anymore, but you still use a grandMA2 console to control media servers?
I do. The last time I ran lights I was controlling a bunch of Jarag fixtures. I pretty much bitmapped them for the show and treated them as video surfaces.
So what exactly do you call yourself these days?
For lack of a better term, I am calling myself a Screens Technical Producer. My job is to come in and engineer the workflow between content creation, server programming and signal transmission. I make sure everything media-wise on an event goes smoothly between the teams responsible for getting pictures onto whatever video displays are on the set. On some jobs, I come in as a Screens Producer and also manage content creation.
What does that job entail?
The job entails a variety of tasks. Initially, I review the scenic design to analyze all the video surfaces. From there, I have two goals. First, I need to create a content production workflow to produce the delivery files I need for video playback. Second, I need to define what the video signal paths will look like, in discussion with the video engineering team, so they have a clear understanding of the gear necessary to process all the signals going to the set. I create a full video delivery specification with templates and reference images so that all teams are clear on the process. Depending on the number of video surfaces, the complexity of the set and the number of teams delivering content for the show, this can take a week or more to set up properly. But I have found that if the spec is well thought out and communicated, my job of programming screens on site becomes quite easy.
So one of your responsibilities is to collect all the media files ahead of time and make sure they are up to your standards?
Absolutely. There is nothing more time-consuming than having to re-render or reconfigure files on a job site when time is critical. To help prevent errors on delivery, I have to think creatively about what the simplest and most logical delivery file looks like to the content designer. I don’t want the design teams concerned with the minutiae of the sizes and shapes of the individual screens on the set. I want them painting the set with media, using a toolset that lets them think about the creative and not the technical details of delivery.
You typically know what media server you are using ahead of time. So do you connect with these people early on and make sure they are providing content in a format that is acceptable to you?
Everything has to be to my spec. Of course, I can do some things to improve the imagery to suit the surface it’s played on, but if it comes to me in the wrong frame rate or codec, it goes back in the hopper. They have to send me the corrected files. This is one of the reasons people hire me prior to arriving on site. Sometimes I have 23 acts and only two and a half days of rehearsals before we go live to air. I will have received five to seven large delivery files by then. I would never sleep if I had to re-render content on site.
How much actual content do you get before one of these shows?
I base my calculations on the size of an HD frame of video. Each frame is 1920 x 1080 pixels, or roughly 2.1 million pixels. I can calculate the render load of any show as a multiple of an HD frame. So if I know it takes 12 million pixels to cover a given set, I can tell the content designer each rendered frame is roughly 5.8 x HD.
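To put that arithmetic in concrete terms, here is a rough sketch in TypeScript. The helper function and the 12-million-pixel set are illustrative only, not taken from Frank’s actual toolset.

```typescript
// Render load expressed as a multiple of an HD frame.
const HD_PIXELS = 1920 * 1080; // 2,073,600 pixels, roughly 2.1 million

function renderLoadInHdFrames(setPixelCount: number): number {
  return setPixelCount / HD_PIXELS;
}

// Example: a set that needs 12 million pixels per frame.
console.log(renderLoadInHdFrames(12_000_000).toFixed(1)); // "5.8"
```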
How do you crunch all these numbers?
Spreadsheets! I calculate based on the native size of a show’s displays as well as what I call the “unified” display size. Often a set will have a combination of projection and multiple types of LED. LED display products come in different pixel pitches, from 50mm down to 3mm, depending on the product. To keep all content looking the same size across all surfaces, I pick the highest-resolution display in the system and unify all the surfaces to that pixel pitch. Let’s say I have a set with 5mm, 9mm and 25mm LED. I have a list of native display sizes; that’s the size I’ll want all the content delivered at. But I will also have a list of the unified display sizes, as if every surface were the 5mm product; that’s the size I want content designed at.
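A minimal sketch of that spreadsheet math follows; the surface dimensions and names are invented for illustration, and the assumption is simply that the finest pitch in the system (5mm here) defines the unified resolution.

```typescript
// One surface on the set: physical size plus the LED product's pixel pitch.
interface Surface {
  name: string;
  widthMm: number;
  heightMm: number;
  pitchMm: number; // distance between pixels on this LED product
}

// The finest pitch in the system sets the "unified" resolution.
const finestPitchMm = 5;

// Native resolution: what the delivery files are rendered at.
function nativeSize(s: Surface): [number, number] {
  return [Math.round(s.widthMm / s.pitchMm), Math.round(s.heightMm / s.pitchMm)];
}

// Unified resolution: what the content is designed at, as if every
// surface were built from the 5mm product.
function unifiedSize(s: Surface): [number, number] {
  return [Math.round(s.widthMm / finestPitchMm), Math.round(s.heightMm / finestPitchMm)];
}

// Example: a 5m x 3m wall of 25mm LED.
const upstageWall: Surface = { name: "upstage wall", widthMm: 5000, heightMm: 3000, pitchMm: 25 };
console.log(nativeSize(upstageWall));  // [200, 120]  -> delivery size
console.log(unifiedSize(upstageWall)); // [1000, 600] -> design size
```

Designing at the unified size and delivering at each surface’s native size is what keeps imagery looking the same apparent scale across the 5mm, 9mm and 25mm walls.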
In her home studio, Laura uses Cinema 4D software to build a simulated stage on her computer display. All of the screens and pixel surfaces display live streaming video. I’ll let her explain the process she goes through prior to an event —ed.
It’s a four- to five-day process, right there, to set all this up perfectly to spec. I take the set designer’s Vectorworks or other 3D file and import it directly into Cinema 4D. I spend a day or so cleaning up the scenic model so it’s really lightweight in terms of the number of polygons and vertices in the 3D drawing. This allows me to use the model in my media server as well as port it to the web. I also spend time UV mapping this scenic model to the video files that will be delivered.
UV mapping is a 3D modeling process that applies a 2D texture onto the surface of a 3D object. The texture in this case is the media content itself —ed.
That UV-mapped model, in conjunction with the delivery spec I’ve defined, gives content designers a way to previz their video content. It’s one of the many previz options I supply. Another of the services Luminous FX provides its clients is a real-time view of their stage set with their chosen media playing back on the scenic screens, sharable via a web browser.
How do you communicate what you are seeing in your office with your clients?
One of the things I’ve branched out into is WebGL (Web Graphics Library). WebGL is an open standard for rendering interactive 3D graphics in a web browser. With this tool, I can send a link to anyone, and in their web browser they can have full 3D interaction with a scenic model I’ve prepared. Their content will be displayed and sized just as it will be, live, during the event.
How does this aid them as well as you?
The wonderful thing about WebGL is that it puts the power of 3D interactivity into a piece of software everyone already has on their computer: the web browser. If you can open a PDF file, you can explore the link I send you with my delivery specifications. I find 3D to be an information-rich experience that everyone should have access to before we are in the stress of production. You shouldn’t have to be a rocket scientist with expensive software to benefit from previewing a set in 3D. WebGL democratizes 3D for everyone’s use.
For example, I can use WebGL to set up camera presets in the 3D scene. If it’s a televised shoot, I find out ahead of time where the various camera shots will be coming from. The client can then watch the media being displayed on the set while they punch through different camera presets. They can see if the proposed video content works for the camera at that angle, ahead of time. They may decide there is a set piece in the way that needs to move a few feet, or a performer needs to stand in a better position for the camera shots to all work best with the media that is playing.
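The camera-preset idea is easy to picture in code. The sketch below is purely illustrative, built on the open-source three.js WebGL library rather than anything from Luminous FX’s actual tooling; the preset names and coordinates are invented.

```typescript
import * as THREE from "three";

// (Scene setup, the renderer, and loading the glTF model of the set are omitted.)

interface CameraPreset {
  position: THREE.Vector3; // where the broadcast camera sits
  target: THREE.Vector3;   // what it is pointed at
}

// Presets matching a hypothetical shot list for a televised show.
const presets: Record<string, CameraPreset> = {
  wideHouse: { position: new THREE.Vector3(0, 6, 30), target: new THREE.Vector3(0, 3, 0) },
  stageLeftDolly: { position: new THREE.Vector3(-12, 4, 18), target: new THREE.Vector3(2, 3, 0) },
  jibHigh: { position: new THREE.Vector3(8, 12, 22), target: new THREE.Vector3(0, 2, 0) },
};

// Snap the viewer's camera to a named preset so the client can "punch
// through" the shots and judge the content from each angle.
function applyPreset(camera: THREE.PerspectiveCamera, name: string): void {
  const { position, target } = presets[name];
  camera.position.copy(position);
  camera.lookAt(target);
}
```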
Can lighting designers sign in and see what the video content will be for a specific song and how it will look on all the elements ahead of time? Can you turn lights on in this program?
We can’t turn lights on... yet. But as far as lighting designers signing in to see what media will be displayed ahead of their programming, that is certainly an option. It’s a simple, great tool for figuring out color schemes and setting the tone for lighting a segment of any show.
Have clients been contacting you and requesting these video previz services?
Absolutely. I incorporate it into my specs on all the shows I work on now. I want people to get used to the idea that this tool is even available.
So in the end, using Luminous FX on a job accomplishes two things. First, the client never has to worry about media arriving on site formatted or sized incorrectly. Second, the client can log into your site at any point and see what their footage looks like playing on their set.
That is correct. My job is to make sure no one ever has to say, “I didn’t know it was going to look like that!” I want to strike those words from the production vocabulary. And I also want to make a really good-looking show.
For more information, visit www.luminousfx.com.