Upfronts are standard industry events where media companies and studios showcase their latest products and trumpet their latest programming to advertisers and the media. (For MTV’s 2013 Upfront, see PLSN, June 2013, page 42.) But when you work on the YouTube Upfront, things are a little more hectic and involved.
Since it was founded in early 2005, YouTube has grown into an Internet media giant and a household name. Acquired by Google in 2006 for $1.65 billion, the subsidiary is now in partnership with CBS, the BBC, Vevo, Hulu, and other organizations that provide content to supplement the website’s video-sharing offerings. The site is currently ranked third in terms of total visitor traffic, behind only Facebook and top-ranked Google itself.
Last year, YouTube had its first media upfront, staged at the Beacon Theatre in New York City, with both Good Sense & Co. and WorldStage playing key roles in terms of event production management and technical AV support. This year, in addition to a splashy presentation, YouTube’s upfront, held in May, was paired with a major event, with approximately 350 people working to create an unforgettable experience for 1,500 VIP guests, who were treated to a lavish dinner and after-party aboard two docked yachts. The entertainment included a performance by a well-known rap/DJ pairing, Macklemore and Ryan Lewis.
“As a young company, it was our largest, most challenging and most complicated event,” said Jared Siegel, co-founder and production director of Good Sense & Co. “There were a lot of different things going on,” along with a few shifts and changes that led to overnight programming sessions and long hours for stage management.
“It was definitely a complex system, from the venue to the time frame to the fact that we didn’t have content until the house was opening,” agreed Josh Perlman, event manager for WorldStage, Inc., the multi-media technology company providing and capturing the audio and video for the upfront. “They were still trying to make changes as we were opening the house, and we actually had to cut them off. It was an ever-changing, ever-evolving situation. You work with your team of programmers and figure out when your real cutoff is to get stuff into the system and actually check it so you aren’t running it live for the first time when people are in the house. You back up the timeframe and say, ‘This is the last time I can accept content,’ then you stick to it. Because if you don’t, you open yourself up to all kinds of problems.”
Siegel noted that a lot of the content was driven internally from Google’s team, with an executive producer dealing with video houses Obscura Digital and mOcean on putting together all of the graphic content for the executive presentations and the musical performances. “We worked closely with them to make sure that the formatting and the playback was all the way it should be,” said Siegel. “Obscura Digital and mOcean are both West Coast-based teams, and the executive producer was on the West Coast as well. The Google Events producing team is here [in New York]. Obviously, our team is here and all our vendors are here, so it was a little tricky. We shared lots of Google docs.”
The deadlines for video content varied, depending upon what it was. “Some of the stuff off PlaybackPro we were able to accept up to 30 minutes to doors,” recalled Perlman. “For graphics coming off of Watchout, I think the final drop-dead cutoff was 3 or 3:30 in the afternoon on the day of the show, which gave us enough time to go through it again if we had to. Doors were at 6 p.m., so that would give us about two and a half hours to make any last-minute changes. But we had a dedicated team of people who just focused on receiving content from people and getting it preloaded into our system, so that way our programmer could drop it and put it in the right place but wasn’t worrying about accepting the delivery and figuring out where it should go. We had a person dedicated to transcoding and doing all the cutting and editing that was needed to put it into our system.”
Setting The Stage
Putting together the YouTube upfront took close to a year. Good Sense secured the deal in July and spent the first few months venue-scouting and exploring different budgeting scenarios. Perlman recalled seeing initial designs in late fall. Full-time work on the project started in April.
“ShowMotion handled the automation and scenery and did an amazing job,” said Siegel. “The automation was pretty complicated, just in the fact that there was a lot of weight” to move around, “and it had to be installed really fast. The rigging was rather complicated as well, just because we had to spread the load out over the roof of that building. The primary video wall, which we called our hero wall, was 20 by 70 feet and split down the middle, and it had to track 35 feet in each direction. Then we had two more 12-foot-wide by 20-foot-tall walls [on each side] that tracked upstage and downstage, allowing bands and presenters to enter.”
“We had 1,512 active tiles of Barco C5 LED wall,” said Perlman. “The main wall was about 15 tiles tall and 52 tiles wide, and we had two 9-by-15s on stage right and stage left of it. Then we had a ton of strips of single-, double- and triple-wide LED tiles that were doing a creative run off to the end of the stage.”
Another important piece of automation was a 20-foot-long by 8-foot-wide stage elevator catwalk that rose out of the main floor after the video screens in the center split so that Macklemore and Ryan Lewis could run up and down the stage for the final performance. Good Sense brought in 250 fans that ran out on stage with them.
“The show’s theme was the YouTube community, the people who make the content and who watch the content,” stressed Siegel. “That was our way of bringing in the community to the show. We had this tent out front to hold all these fans and keep them away from the artist trailers and move them in and out and feed them. It was just one of the many additional elements that were going on.”
“There were definitely a few nerve-racking moments along the way, and we were doing some things that a lot of people hadn’t done before,” stated Perlman, noting that “1,500 tiles of LED tracking and moving around might’ve been done before on this particular product, but I don’t think it had been done in a building with such a tight ceiling. We were literally talking about fractions of an inch for the entire thing to fit in. It was a complex thing. We worked very closely with ShowMotion to make sure everything actually fit in the room. There were a couple of nerve-racking moments as we were going up with the truss and putting in the stage under it and hoping it would actually fit all the way in the way we thought it would. Luckily, it did.”
“ShowMotion did a big part of the show, and the people over there were amazing,” said Perlman. “Ashley Bishop was the project manager over there. Jon Cardone managed and was responsible for the on-site build. Those guys were absolutely amazing and a really good team. I went up to the shop a couple of times to work with them with the tiles, figuring out the best way to put it all together and make it work. They were very easy to work with and collaborative team members.”
Shooting The Space
WorldStage utilized three Spyder X20 frames to drive their video system, along with an 80x80 DVI router feeding the Spyder. They had a 60x64 DVI router managing the Watchout system to handle nine primary outputs and nine backup outputs along with compositing and other functions. Perlman noted that there were a lot of moving parts and a lot of big systems all working together. They had five Sony HD cameras shooting the entire show — two robotic cameras, two front of house cams and a handheld — “that had to get switched, recorded and dealt with as well in addition to all the big, dynamic showpieces,” he said. “Everyone was kind of joking on the show that normally you call it video village, but this was more like video city backstage. We took up a lot of space. There are a lot of pieces and a lot of people working in one area to make the entire show happen.”
“It looked like NASA’s control center,” declared Siegel. “It worked out really well. Lighting seemed to be the easiest of all. Chris Dallos and his programmer Joe Allegro ran with it and kept up with all the changes. They did a great job. Lighting was the one thing I didn’t have to worry about, and there was a fair bit of lighting in terms of the quantity of instruments.”
“In terms of general complexity and timeframe, it was definitely one of the hardest shows we’ve done recently,” stated Perlman. “We do a lot of upfronts, but Google and YouTube want to be different and push boundaries. They want it to be different, better and cooler, and because of that, you’re always living a little bit more on the edge of what has been done before. Some companies are on that cutting edge of wanting to push themselves and push the look, and you have to work with the right team in order to execute something that hasn’t necessarily been done on an everyday basis.”
As Perlman pointed out, almost anyone can go into a venue and set up a projector and a small screen and record it with a camera, “but getting a team together that can put together larger LED walls, big HD projection, HD cameras and have the entire thing literally move on automation together — whether you’re talking about rigging in a venue that wasn’t meant to be rigged in this way, or working with the ShowMotion guys, who had to design a custom track piece to attach bumpers for the LED that still gave us the ability to level out the LED wall when we didn’t have space for turnbuckles or anything else — everyone had to go back to the drawing board and reimagine how they would do it. They had to be willing to work outside of their comfort zones. I saw Jared and his team after the show, and they were definitely breathing a big sigh of relief as well as everybody else.”
New Lessons Learned
A production team can always come away learning something new from an event of this size, and the YouTube Upfront proved to be no exception.
“The big thing that we may have overlooked the most was just the operational elements, because we were not in a theater,” explained Siegel. “We did this event in the Beacon last year — you had seats, front of house stage, stage and bars and all that stuff — whereas this year we had to build everything. It was a really large venue and a lot of space to fill. Again, the logistical side of putting in all of these additional operational elements and functional things was not new to us, but something that we maybe underestimated the time on a little bit. We did end up doing around-the-clock shifts, which we had planned for, because we knew that it was growing. The other thing is we didn’t have the artists confirmed until the week of, so that’s always challenging, trying to plan and pack trucks and load gear and be ready for anything that comes up.”
He added that the registration setup was very large. YouTube wanted to scan everybody in and have custom printed badges for everyone, plus there was a pre-cocktail reception out by the water behind the venue. The big surprise was the two giant docked yachts that served as the scene for the after-party. “We didn’t want anybody to see the yachts, so they had to come in, check in, sail away, let all the guests enter, then dock in during the event,” stated Siegel. “They were basically the after-party with a dinner for 1,500 people. The doors opened up back on the waterfront to go out the way they came, and they were divided between two boats where we had capacity for 1,800 people. We had little performances on the boats and food and drinks. The boats didn’t leave or go anywhere. It was like bringing two big floating restaurants right over to our venue.”
Good Sense handled the boat after party with a separate production team. They took all of the existing furniture off the boats, stored it and then brought in furniture requested by YouTube. “For a couple of days, we just loaded furniture onto boats and set them up in lounges,” said Siegel. “They didn’t like the tops of tables, so we had graphics printed and put graphic stickers on the tables. There were a lot of graphics that were installed inside the boats.”
A Big Success
After all the blood, sweat and tears were poured into the show, it was a success, even if it generated some frazzled nerves behind the scenes. But for some, that’s the simple price of progress. Not that anyone in the audience probably noticed. To them, it was one big, seamless production.
“It was a very cool show that looked amazing,” beamed Perlman. “It was definitely one of the coolest looking shows. The C5 wall is one of the best looking walls out there, especially when you go this large, because they actually calibrate and work the right way together. It actually appears to be this big, bright, seamless wall, and the guys at mOcean and Obscura Digital did a great job developing content that looked amazing on that wall. The entire pixel space we were working in was just south of 16,000 pixels wide for the entire thing, so you’re talking about creating some very large file elements, and they did that very well. They really understood how to work with the technology. With the show, they [YouTube] wanted to have this immersive feel to it. There was this semicircular video that wrapped around everything, so no matter where you looked, there was something to look at.” And, hopefully, to remember.
CREW
Production Director: Jared Siegel, Good Sense & Co.
Production Manager: Michael Madravazakis, Good Sense & Co.
Event Manager: Josh Perlman, WorldStage
Lighting Designer: Chris Dallos, Dallos Design
Scenic Designer: Anton Goss, Consortium Studios
Lighting Supplier: WorldStage
Video Suppliers: WorldStage, mOcean, Obscura Digital
Scenery: ShowMotion Inc.
Generators: GreeNow
Soft Goods: Dazian
Graphics: Color Reflections, Mirror NYC
Confetti Cannons: J&M
GEAR
2 MA Lighting grandMA2 Full consoles w/ NPUs
58 Elation EPAR QA LED Pars
32 Martin MAC Aura LED Washes
20 Coemar ParLite LED fixtures
18 Vari*Lite VL2500 Spots
12 Vari*Lite VL2500 Washes
12 Vari*Lite VL3500 Washes
12 Clay Paky Sharpys
10 Martin MAC Viper Spots
18 Chroma-Q Color Force LED 72”
16 Color Kinetics ColorBlast TRX 12”
2 High End Systems Showguns
237 ETC Source Four PARs
65 ETC Source Four Ellipsoidals
20 ARRI Studio 1K Fresnels
6 Martin Atomic Strobes w/ Color Scrollers
2 Lycian 1.2kW Starklite 1271 HMI followspots
2 DMX Switch 4s-Universe A/B Smart (Data Lynx)
2 16 Port Ethernet Switches (10/100Mb)
4 7 Way Opto Splitters
2 UPS 500 Kits
3 PD Cam Spider Box 1×5
2 ETC 96 x 2.4k Sensor Plus dimmer racks
3 400A PD 208v 48X20A
1 400A PD 110v 48X20A
2 200A PD 110v 24X20A
2 City Theatrical PDS-750TRX units
2 DF-50 hazers
2 Rosco Delta 3000 foggers