An Interview with Matthew Ardine, Gaffer of Everything Everywhere All at Once

Directed by Dan Kwan and Daniel Scheinert (collectively known as “Daniels”), Everything Everywhere All at Once is a new action comedy from A24. The film occupies the rare intersection where an art house drama and a high-action blockbuster can flawlessly converge, and is probably the closest A24 will ever get to making a Marvel movie. The multiverse is in full force: an inter-dimensional rupture unravels reality, and our unlikely hero (played by Michelle Yeoh) must channel her newfound powers to bring it back into equilibrium.

PRG sat down with Gaffer Matthew Ardine to discuss his workflow for the stunning, fast-paced portal jumping scenes throughout the film, and his use of Mbox Studio+.

Can you tell us a bit about your overall experience in the film industry?
I have worked as a gaffer and lighting designer since 2005, after graduating from Emerson College. I joined IATSE Local 728 in 2006. Since then, I have worked on a little bit of everything, including hundreds of commercials, music videos, and broadcast concerts, as well as episodic TV and feature films. Sometimes I come onto a film to work as the LD for specific concert or musical scenes. Other times, I’ll do the whole movie as the gaffer, as I did on Everything Everywhere All at Once.

Please tell me about your role as gaffer on Everything Everywhere All At Once and what that entailed.
My job was to deliver the vision of the cinematographer, my longtime colleague and Emerson classmate Larkin Seiple. Larkin did a great job describing to me the aesthetic of the various multiverses. We created a long list of all the lighting effects that we had to achieve. One of my unique jobs for this film was to come up with the plan for how to accomplish those lighting effects.

You posted a behind the scenes video and mentioned in the caption that it was shot in Paris but the setup was in LA. Can you tell me more about that workflow and how Mbox was used?
Anytime there is a screen on set, we use Mbox attached to the lighting console so that we can treat it as a light source we can manipulate. Some of the simpler uses were streaming NDI to the cell phones, so we could not only see them on camera and execute cues remotely, but also use them as a controllable lighting source. In the first elevator scene, we needed the Alphaverse device to scan Michelle’s face. I used a little 3,000-lumen projector and Mbox to map the video onto her face. In the Alphaverse van, we used two Mbox units to output to the eight discrete screens. Whenever a screen wasn’t on camera, we bumped up its brightness to light the actors’ faces.

Why was Mbox the right choice for this specific film/scene you showcased in your video?
After we came back from the Covid shutdown, we had about four days of additional photography to finish up the movie. One of the shots we needed was Evelyn (Yeoh) traveling through several locations in four seconds. To do this, we decided that building a tunnel out of LED wall and playing the actual location footage on it would give us the best environmental lighting to cast on her face.

But lead actress Michelle Yeoh was only needed for a couple of short scenes, so she decided to stay in Paris. The directors, [DP] Larkin, and I were in LA filming the rest of the days. So instead of traveling, we decided to do a remote shoot. I sent the diagrams to a Paris company and they built the LED tunnel. I shipped them a Just Networking theBRIDGE and had one with me in Los Angeles. In LA, I had the [MA Lighting] grandMA3 Light and Mbox. In Paris, they had a grandMA3 and a laptop receiving the video output from my Mbox, which they fed into the LED processor. We also received the camera output in LA, in a control room we set up at the post house that was editing the film. So we were sending MA-Net and video from LA to Paris with no discernible latency.

What’s the biggest benefit for you in using Mbox?
I chose Mbox because it allows us to program everything from the lighting console without going back and forth to the media server UI. Plus, we can drive a large number of screens and pixel-mapped LEDs for a very competitive price.

What was the biggest challenge for you with this project specifically?
When we did the Daniels’ previous film, Swiss Army Man, the budget was only $5 million, so expectations were pretty low for the size of the lighting setups we could accomplish. I’ve also done additional photography on Spider-Man: Far From Home with Larkin, where the budget was huge and so were the lighting setups. Everything Everywhere had a budget of $15 million, which allows you to get the equipment for decent-sized setups but not a fully staffed department. So the biggest challenge was achieving the lighting setups the film needed with a minimal crew. Luckily, my best boy, Mike Beckman, is a jack of all trades and kept the department afloat every day, making sure that the current and upcoming sets were ready to go.

How long have you been using Mbox and what other projects have you used it on?
I’ve been using Mbox since the introduction of Mbox Studio, so around eight years. I have used it on tons of concerts and several commercials, where I have pixel-mapped hundreds of universes of pixels. It’s an awesome tool that has allowed me to achieve lighting setups I could never have imagined were possible 10 years ago.

Follow @Mattardine on Instagram to see more of his work. PRG shared this interview with PLSN. Learn more about PRG’s TV and film work.

Some behind-the-scenes photos:

Matthew Ardine (fourth from the left) with the lighting crew for Everything Everywhere All at Once