The Concept
Noting that Joel Zimmerman at William Morris was looking to transform Avicii into a “proper rock star,” lighting and production designer Alex Reardon started working on a unique concept for a DJ booth: a 17.5-foot-high “head.”
The cranium is actually a tricked-out gondola equipped with jump seats, drink holders, a built-in beverage cooler, fans, lights and, of course, all of Avicii’s DJ gear. As Reardon says, “we made sure that if our spec is to make him a rock star, then his DJ booth had to be the MTV Pimp My Ride experience.”
Automation and fog effects lit by 12 Impressions are activated when the head “lifts off,” and three POV I-Mag cams are also rigged to the gondola — two beside Avicii and one underneath — so as he hovers above the audience, images of the audience can be projected on the walls.
The gondola is also equipped with lighting fixtures — Color Kinetics ColorBlasts and two Vari*Lite VL3000 Spots, which work with two Martin MAC 700s positioned upstage to keep the DJ lit for the cameras as he moves. There’s no need for followspots, and the fixtures are positioned to eliminate any spill that might interfere with the projections.
The Head
The head was modeled and fabricated by Calgary-based Heavy Industries to the precise specifications of the engineers at PRG Scenic in Las Vegas.
“We hired Heavy Industries to do a 3D model of the head, and to do the cutting and make all the parts,” notes Paul Kirkpatrick, the lead engineer from PRG Scenic on the project. “They cut it on a 5-axis CNC router, and we told them the details of how we had to have the armature built. Then it all came here and we put it all together in our rehearsal space.”
Coated with Screen Goo to optimize projections, the head was not modeled on Avicii or anyone else. “It’s completely random,” Reardon notes. “People have asked me, ‘Why don’t you put eyes in there,’ but I’m not that keen on that idea, because this head has to be ‘Everyman,’ and as soon as you put eyes in a head, then it’s going to be its own character.”
Reardon wasn’t satisfied with having a tall 3D object onstage by itself, which, from a distance, might just appear as “a lump in the middle of the stage.”
The answer was to “go wide; I wanted the big letterbox thing. So then I thought about leaves, but I wanted to mess with people’s sense of perspective. We’re so used to looking at a TV screen that tells us where the edge is. We use that edge to sense distance, but that hardness of the edge of the screen is an end-stop. So I thought, ‘I wonder if you can get a front projection screen with holes in it.’ Lots of people have done the low-res projection in front of a high-res backdrop to get that perspective, but I wanted to do projection.”
Reardon consulted with the PRG Scenic team, including project manager Peter Wessel and Bill Brouwer, GM of PRG-LV’s Scenic Technologies, and they found a perforated screen that was exactly what he wanted.
Projection and LED
Early on in the design process, Reardon wanted to wrap video around the head, animating the contours with a wide array of mapped looks and effects. “I wanted to go with projection [because] LED is brutal to look at; it’s harsh, it’s not soft, warm, or organic.”
Along with projections on the head’s surface and flanking leaf elements, the upstage portion of the Avicii touring set adds visual punch with a 60-by-24-foot wall made from PRG Nocturne 18mm LED video panels.
Vidaroo, led by Ian McDaniel, provided the custom content projected onto the head and leaves, with as many as eight to 10 freelancers working on the Avicii project — and Reardon welcomed the diversity in the resulting mix of visual concepts.
Special care was taken to fit the visuals — including “shatter” effects and the illusion of light glowing from “within” the sculpture — to the organic 3D shape of the giant head’s lips, cheeks and ears, using Cinema 4D, Adobe’s After Effects, and MADMapper.
Reardon notes that an illusory “vanishing point” is harder to achieve when audience members are watching in a 180-degree arc. “They are very, very clever people, those guys at Vidaroo,” he says. (For more, read Vickie Claiborne’s interview with Ian McDaniel in “Video Digerati,” PLSN, July 2012, page 44).
The head projections are mapped in 3D from three points. Double stacks of 20K projectors — one downstage left and one downstage right — are offset slightly from 90 degrees for optimum coverage of the cheeks. There are also six 20K projectors flown at front of house, all double-stacked, for the front of the face and “leaves.”
Show Control
While there isn’t a set list for the show, there is a reservoir of songs that Avicii will choose from, and Reardon needed to make sure that the DJ had the flexibility to play any track at any time — and also be able to drop into a track at any point and have the visuals synched.
Reardon also wanted to give Avicii control over the pacing of each track, and have the visuals still match the beat. To achieve this, instead of timecode, the show runs completely off of MIDI Beat Clock (MBC).
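To see why beat clock rather than timecode gives the DJ that freedom, consider a rough sketch of how a follower can derive tempo from the clock itself: MIDI Beat Clock sends 24 ticks per quarter note, so averaging the tick intervals yields the current BPM. The class and averaging window below are illustrative, not Robinson’s actual code.

```python
# Minimal sketch: estimating BPM from MIDI Beat Clock ticks.
# MIDI Beat Clock sends 24 clock messages per quarter note, so the
# interval between ticks maps directly to tempo. Names are illustrative.
from collections import deque

CLOCKS_PER_QUARTER = 24

class BeatClockFollower:
    def __init__(self, window=48):
        # Keep the last `window` tick timestamps to smooth out jitter.
        self.ticks = deque(maxlen=window)

    def on_clock_tick(self, timestamp_s):
        """Call this whenever a MIDI clock (0xF8) message arrives."""
        self.ticks.append(timestamp_s)

    def bpm(self):
        """Current tempo estimate; None until enough ticks have arrived."""
        if len(self.ticks) < 2:
            return None
        elapsed = self.ticks[-1] - self.ticks[0]
        intervals = len(self.ticks) - 1
        seconds_per_quarter = (elapsed / intervals) * CLOCKS_PER_QUARTER
        return 60.0 / seconds_per_quarter

# If the DJ nudges the tempo, the next ticks arrive closer together
# (or farther apart) and the estimate tracks along, no timecode needed.
```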
Although some thought that idea was “bonkers,” he insists that, “if the interface is simple enough, our backup system is strong enough, and the complexity is deep enough, then you can actually let the DJ control the show. You have to break the mold, otherwise we would still be doing two sticks of truss and some strobes.”
But how do you give a DJ control of the show visuals and audio simultaneously? “To me, a successful design is one that simplifies its user interface,” Reardon says. “So the spec here was to design an interface that Avicii can use to select tracks and mix so that nothing is different in his life. I wanted to make sure that Avicii didn’t notice anything different.”
iPads in the Mix
Using TouchOSC, lighting programmer Seth Robinson created what Reardon refers to as “a very elegant solution to a very complicated problem.” Each mixing deck is connected to a laptop running a copy of Ableton Live, and also an iPad.
When Avicii chooses from a page of 35 tracks on an iPad, the track is highlighted and flashes; when the track is ready, it turns solid, and Avicii can then hit “Play.” Thanks to the bi-directional communication, the bottom of each iPad screen also shows him which tracks are currently selected across all of the iPads.
“I wrote the interface using TouchOSC, then I wrote a Max for Live plug-in for Ableton that talks to the iPad,” explains Robinson. Although there are Ableton controllers for the iPad, Robinson notes that, “because we have four copies of Ableton running at once on four different machines…[and] they only work with one copy at a time, we had to have some way to access all four machines at once, and that’s why we have the four pages on the iPads.
“Each machine has its own name — DJ1, 2, 3, and 4, depending on the page on the iPad,” Robinson continues. “You’re sending out an OSC command down to the machines saying, for example, ‘DJ2, go to this song,’ and the DJ2 machine responds by refreshing the display and making the flashing happen and all that stuff. All of the processing is offloaded to the actual (Ableton) computers, and Avicii is triggering the lighting cues by running the cuelist in Ableton.”
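Robinson’s interface lives in TouchOSC and a Max for Live plug-in, but the basic round trip he describes (an iPad telling a named machine to go to a song, and getting display feedback in return) can be sketched with the python-osc library. The OSC addresses, port numbers and machine IPs below are placeholders, not the show’s real namespace.

```python
# Sketch of the OSC round trip Robinson describes, using python-osc.
# The address patterns, ports and machine names here are hypothetical.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

ABLETON_MACHINES = {          # one laptop per mixing deck
    "DJ1": ("10.0.0.11", 9000),
    "DJ2": ("10.0.0.12", 9000),
    "DJ3": ("10.0.0.13", 9000),
    "DJ4": ("10.0.0.14", 9000),
}

def select_song(machine, song_index):
    """iPad side: tell one named machine to cue up a song."""
    ip, port = ABLETON_MACHINES[machine]
    SimpleUDPClient(ip, port).send_message("/song/select", song_index)

def on_song_ready(address, song_index):
    """Machine side reply handler: refresh the display, stop the flashing."""
    print(f"{address}: song {song_index} loaded, mark it solid on the iPad")

if __name__ == "__main__":
    select_song("DJ2", 17)                        # "DJ2, go to this song"
    dispatcher = Dispatcher()
    dispatcher.map("/song/ready", on_song_ready)  # bi-directional feedback
    BlockingOSCUDPServer(("0.0.0.0", 9001), dispatcher).serve_forever()
```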
The master Ableton file has a track for audio, a track for what Robinson calls the “Lighting RoadMap” cuelist for the melody, and several tracks for all of the video cues. As a result, the entire show is controlled from those four machines in the DJ booth.
Trouble-Proofing
Robinson also provided a visual representation of the control network, including the five laptops in the “head,” the towers controlling the media servers, and the two computers at FOH. Each control “node” is represented by a green dot that flashes red if something is amiss.
“All techs watching the network during the show will know the status of each node, and if a node is having a problem, they could actually know there is a problem before Avicii knows,” Reardon says. “This means they can fix it and there won’t be a pause in the show,” he adds. “This is something we never had with SMPTE.”
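The article doesn’t detail how the monitoring module is built, but a green/red node display of this kind comes down to polling every machine on the control network and flagging the ones that stop answering. A minimal sketch, with placeholder hosts, port and poll interval:

```python
# Sketch of a node-status monitor along the lines Robinson describes:
# poll every machine in the control network and flag the ones that
# stop responding. Hosts, port and interval are placeholder values.
import socket
import time

NODES = {
    "DJ1": "10.0.0.11", "DJ2": "10.0.0.12",
    "DJ3": "10.0.0.13", "DJ4": "10.0.0.14",
    "FOH-Ableton": "10.0.0.20", "MediaServer": "10.0.0.30",
}
HEARTBEAT_PORT = 9010   # hypothetical TCP port each node listens on

def node_alive(ip, timeout=0.5):
    """True if the node accepts a TCP connection on the heartbeat port."""
    try:
        with socket.create_connection((ip, HEARTBEAT_PORT), timeout=timeout):
            return True
    except OSError:
        return False

while True:
    for name, ip in NODES.items():
        status = "GREEN" if node_alive(ip) else "RED"   # red: fix it before the DJ notices
        print(f"{name:14s} {status}")
    time.sleep(2)
```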
Once a track is ready and Avicii hits play, a module Robinson created, which he calls the “Select-a-tron,” handles the switching between tracks. So that lights and video know which of the four MIDI streams to listen to, the Select-a-tron monitors all four; once a track rises above 80 percent, it takes over on a Highest Takes Precedence (HTP) basis, letting the DJ fade in and out of tracks without cues jumping between cuelists (and creating a big mess).
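As a rough illustration of that logic, where only a stream that has crossed the 80 percent mark can take over and the highest of those wins, the sketch below shows the decision in isolation. The threshold handling and data shapes are assumptions, not Robinson’s implementation.

```python
# Sketch of the Select-a-tron's HTP decision, as described in the article:
# a deck's MIDI stream only takes over once its level passes 80 percent,
# and of those, the highest level wins. Data shapes are assumptions.
THRESHOLD = 0.80

def select_active_stream(deck_levels, current):
    """
    deck_levels: dict like {"DJ1": 0.05, "DJ2": 0.95, "DJ3": 0.0, "DJ4": 0.0}
    current: the deck whose MIDI stream is currently driving lights and video.
    Returns the deck that should drive the show after this update.
    """
    above = {d: lvl for d, lvl in deck_levels.items() if lvl >= THRESHOLD}
    if not above:
        return current                 # nobody is fully in yet: hold steady
    return max(above, key=above.get)   # highest takes precedence

# During a crossfade the outgoing deck drops below 80 percent while the
# incoming one rises above it, so control hands over once instead of bouncing.
print(select_active_stream({"DJ1": 0.3, "DJ2": 0.92, "DJ3": 0, "DJ4": 0}, "DJ1"))
```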
Robinson’s custom programming also made it possible to remotely access and manually override the DJ booth machines from FOH. And Vidaroo’s McDaniel provided a way to match the video to the beats per minute.
Delving Deeper
The four individual streams of MIDI notes from the DJ’s Ableton computers get converted to MIDI Show Control (MSC) at FOH and delivered to the lighting console, a Martin M1, with much “bolted on-ness and MIDI stuff,” Reardon says.
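MIDI Show Control itself is a standard SysEx format, so the kind of note-to-MSC conversion happening at FOH can be sketched as below. Treating the incoming note number as the cue number is a hypothetical mapping chosen for illustration, not the show’s actual scheme.

```python
# Sketch: wrapping a cue number in a MIDI Show Control "GO" SysEx message,
# the kind of conversion done at FOH before the data reaches the M1.
# Mapping note number to cue number is a hypothetical choice for this example.
MSC_LIGHTING = 0x01   # command format: lighting (general)
MSC_GO = 0x01         # command: GO

def msc_go(cue_number, device_id=0x7F):
    """Build an MSC GO message for the given cue (device 0x7F = all-call)."""
    cue_ascii = str(cue_number).encode("ascii")
    return bytes([0xF0, 0x7F, device_id, 0x02, MSC_LIGHTING, MSC_GO]) + cue_ascii + bytes([0xF7])

def note_to_msc(note_number):
    """Hypothetical mapping: MIDI note N from the roadmap fires cue N."""
    return msc_go(note_number)

print(note_to_msc(42).hex(" "))   # f0 7f 7f 02 01 01 34 32 f7
```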
Robinson’s FOH setup includes the Ableton Live computer, a Novation Launchpad, an M-Audio keyboard, the Martin M1 console, a laptop running the Network monitoring module, and a laptop running the Select-a-tron module.
Working with Ableton Live and MAX/MSP, Robinson was able to create a setup where, for example, the velocity information of a MIDI trigger could be harnessed so that, if a key is pressed with more force, lights would shine brighter.
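That velocity-to-intensity idea is essentially a one-line scaling; a minimal sketch, assuming a linear curve from the 0-127 MIDI velocity range onto a 0-255 DMX-style intensity:

```python
# Sketch: harder key press means brighter light, by scaling MIDI velocity
# (0-127) to a DMX intensity (0-255). The linear curve is an assumption.
def velocity_to_dmx(velocity):
    velocity = max(0, min(127, velocity))
    return round(velocity * 255 / 127)

print(velocity_to_dmx(32))    # soft press: 64
print(velocity_to_dmx(127))   # hard press: 255
```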
As mentioned, instead of timecode, the show runs completely off of MIDI Beat Clock (MBC). “The advantage of using MIDI Beat Clock is that Avicii can grab his BPM, and he can actually speed up or slow down a song, and we track right along,” Robinson notes. “If we were trying to lock to timecode, that wouldn’t work, because if you changed the BPM, then all of this would be completely out of synch.”
One downside is that MBC doesn’t carry precise positional information the way a timecode stamp does, so Robinson programs all of his lighting cues into the lighting roadmap and then offloads them to the Ableton computers in the DJ booth.
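Robinson’s real cues live in the Ableton roadmap track, but the sketch below illustrates the bookkeeping a clock-only link implies: position has to be counted in ticks from the moment a track starts, and cues are expressed in beats rather than timecode. The cue list here is invented for illustration.

```python
# Sketch: tracking musical position by counting MIDI Beat Clock ticks
# from song start, and firing roadmap cues at beat positions rather than
# timecode values. The cue list is invented.
CLOCKS_PER_QUARTER = 24

class RoadmapFollower:
    def __init__(self, cues):
        # cues: list of (beat_number, cue_name), kept sorted by beat
        self.cues = sorted(cues)
        self.ticks = 0
        self.next_cue = 0

    def on_start(self):          # MIDI Start (0xFA): position resets to zero
        self.ticks = 0
        self.next_cue = 0

    def on_clock_tick(self):     # MIDI Clock (0xF8): advance one 1/24 beat
        self.ticks += 1
        beat = self.ticks / CLOCKS_PER_QUARTER
        while self.next_cue < len(self.cues) and self.cues[self.next_cue][0] <= beat:
            print("GO:", self.cues[self.next_cue][1])
            self.next_cue += 1

follower = RoadmapFollower([(16, "verse look"), (48, "drop blinder hit")])
follower.on_start()
for _ in range(48 * CLOCKS_PER_QUARTER):
    follower.on_clock_tick()
```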
Ultimately, MIDI over Ethernet (ipMIDI) is what makes the entire show happen. “If we don’t have Ethernet, we don’t have a show,” Robinson says. “It’s integral to this show, because we have to get MIDI to FOH and to the media servers.”
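The article doesn’t describe ipMIDI’s wire format, but the general idea, raw MIDI bytes carried over the Ethernet network so that FOH and the media servers all hear the same stream, can be illustrated with a plain UDP multicast socket. The multicast group and port below are placeholders, not ipMIDI’s actual values.

```python
# Illustration of the "MIDI over Ethernet" idea: multicast raw MIDI bytes
# so every listener on the network (FOH, media servers) receives them.
# Group address and port are placeholders, not ipMIDI's actual values.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5004

def send_midi(message_bytes):
    """Multicast a raw MIDI message (e.g. 0xF8 for a clock tick)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, struct.pack("b", 1))
    sock.sendto(message_bytes, (GROUP, PORT))

def listen():
    """Join the multicast group and print incoming MIDI bytes."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, addr = sock.recvfrom(1024)
        print(f"MIDI from {addr[0]}: {data.hex(' ')}")
```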
Despite all the synchronized show elements, lighting control is separate from audio and video (for troubleshooting purposes, and to minimize traffic concerns); there’s a separate Art-Net network for lighting fixtures.
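For reference, putting intensity levels onto that separate Art-Net network amounts to one small UDP packet per universe. A minimal ArtDMX builder following the published packet layout, with a placeholder node IP and universe:

```python
# Sketch: building and sending one ArtDMX packet on the lighting network.
# Packet layout follows the published Art-Net spec; the node IP and
# universe number here are placeholders.
import socket

ARTNET_PORT = 6454

def artdmx_packet(universe, dmx_data, sequence=0):
    """Wrap up to 512 channel levels (ints 0-255) in an ArtDMX packet."""
    data = bytes(dmx_data[:512])
    if len(data) % 2:                          # Art-Net wants an even length
        data += b"\x00"
    packet = bytearray(b"Art-Net\x00")         # fixed ID string
    packet += (0x5000).to_bytes(2, "little")   # OpCode: ArtDMX
    packet += (14).to_bytes(2, "big")          # protocol version
    packet += bytes([sequence, 0])             # sequence, physical port
    packet += universe.to_bytes(2, "little")   # SubUni + Net
    packet += len(data).to_bytes(2, "big")     # channel count
    packet += data
    return bytes(packet)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
levels = [0] * 512
levels[0] = 255                                # channel 1 at full
sock.sendto(artdmx_packet(universe=0, dmx_data=levels), ("10.0.1.50", ARTNET_PORT))
```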
Reardon kept the lighting design simple, focusing on mood and timing. Along with the VL3000s, the rig includes Martin Atomic strobes with color changers and Clay Paky Sharpys, many of which are positioned on the stage.
As for the audio rig, a key concern was to work with the crew to hang the arrays so they wouldn’t block sightlines, yet still deliver high-quality sound. “The outside dimension of the leaves is 62 feet, and if we do a regular line array, which flies at 56 feet, that would be very annoying,” Reardon notes.
Automation, To Go
Aside from the visuals, the show also uses an impressive amount of automation, including an infrastructure with 100 feet of automation truss custom-built out of aluminum and reinforced with I-beams.
The gondola is lifted out of the head and flown over the audience via four motion-control winches. There are also two cable winches mounted on top of the automation truss to move the gondola upstage and downstage on rollers engineered for smooth and quiet movements.
All of the winches are controlled via PRG’s Stage Command system (the same system used on the Broadway production of Spider-Man: Turn Off the Dark.)
Because the show travels with a small stage crew, the elements had to be as light as possible and easily moved. PRG Scenic provided lightweight yet strong storage carts so that everything gets lifted off the front of the stage and packed into its cart, then straight onto the truck.
The entire contents of the gondola, including DJ equipment, remain intact during transport as well. The leaves are hung and struck directly from their storage carts onto the truss by attaching a section at a time, raising the truss, then clamping on the next piece until it’s all in the air.
The entire production fits within eight trucks for transport: three for the head, one for automation, and four more for video, lighting, audio and rigging.