LONDON – As the finale to the 400th anniversary of Shakespeare’s death, the Royal Shakespeare Company (RSC) presented a dazzling new production of The Tempest in the Bard’s birthplace, Stratford-upon-Avon. The innovative staging featured live motion capture, which allowed Ariel to shape-shift into characters such as a sea nymph and a harpy, all in real time.
More details from d3 Technologies (www.d3technologies.com):
To accomplish this, the RSC collaborated with Intel and The Imaginarium Studios, using Vicon’s Tracker, a powerful motion capture and object tracking system. d3 Technologies media servers played a key role in video-mapping the motion-tracked, moving projection surfaces on stage. Three d3 4x4pro units plus an understudy, each with SDI VFC cards and 16 outputs, were employed alongside the RSC’s own d3 2x2plus.
“The Tempest” takes place on a strange and magical island, and the new production’s set design featured the inner ribs of a shipwreck rising up two levels. At the back was an eight-meter-high curved rear-projection screen. In the centre, 14 curved gauze screens, known as the Vortex, flew in and out together and independently, creating the surfaces for the narrative projection. Within the Vortex was a cylindrical gauze screen, called the Cloud, which tracked on a spiral. All the tracking information for these surfaces was fed into d3.
“One of the main uses of video was for Ariel, a magical video character driven by live motion capture technology and relayed to the stage through d3,” says video designer Finn Ross. “The Imaginarium Studios designed the various virtual characters and the Unreal Engine workflow that rendered Ariel in real time, which we took as a video capture stream into d3 and mapped onto various areas of the set.”
“The other major task of video was to deliver The Masque, a fantasy show put on by Prospero to celebrate his daughter’s wedding,” Ross continues. “This section exploded with color and David Hockney-like landscapes; the stage turned into a giant peacock enveloped by a huge rainbow aurora. Every surface and all 27 projectors were used for this.”
The RSC’s production manager, Pete Griffin, notes that the company “had never undertaken a production with this scale of video requirements before. More than two years ago we started looking at what video servers we would need for the project. The general opinion seemed to be that d3 was the way to go. d3 offered proven, stable integration with Stage Technologies automation and grandMA, both of which we used in the production.”
George Jarvis, video production engineer at the RSC, agrees that d3 was “the obvious choice. It was absolutely essential to have a 3D server engine due to the nature of the show and the use of motion-tracked, moving projection surfaces.”
“The Tempest” was programmed with a mix of timeline and Sock Puppet-based control. “This flexibility allowed the team to work in the most effective method, depending on the type of sequence and the complexity of certain elements,” says Jarvis. “The timeline was then controlled with a DMX transport, which enabled the video operator to cue the show from a lighting console. This was essential as it allowed us to integrate with the MA control infrastructure in the theatre.”
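For readers curious about the mechanics, driving a media-server timeline from a lighting desk over DMX generally comes down to mapping the values of a few channels to transport commands. The sketch below is a hypothetical illustration of that idea using Art-Net, a common DMX-over-Ethernet protocol; the channel number and command values are invented for the example and are not d3’s actual DMX transport personality.

import socket
import struct

ARTNET_PORT = 6454      # standard Art-Net UDP port
OPCODE_ARTDMX = 0x5000  # ArtDMX packet opcode

# Hypothetical mapping for illustration only -- not d3's DMX personality.
PLAY, STOP, NEXT_CUE = 1, 2, 3   # example 8-bit values on a "transport" channel
TRANSPORT_CHANNEL = 0            # first channel of the chosen universe

def parse_artdmx(packet: bytes):
    """Return (universe, dmx_data) for an ArtDMX packet, else None."""
    if len(packet) < 18 or packet[:8] != b"Art-Net\x00":
        return None
    opcode = struct.unpack_from("<H", packet, 8)[0]
    if opcode != OPCODE_ARTDMX:
        return None
    universe = struct.unpack_from("<H", packet, 14)[0]   # SubUni + Net
    length = struct.unpack_from(">H", packet, 16)[0]
    return universe, packet[18:18 + length]

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", ARTNET_PORT))
    while True:
        packet, _ = sock.recvfrom(1024)
        parsed = parse_artdmx(packet)
        if not parsed:
            continue
        universe, dmx = parsed
        if len(dmx) <= TRANSPORT_CHANNEL:
            continue
        value = dmx[TRANSPORT_CHANNEL]
        if value == PLAY:
            print(f"universe {universe}: play timeline")
        elif value == STOP:
            print(f"universe {universe}: stop timeline")
        elif value == NEXT_CUE:
            print(f"universe {universe}: jump to next cue")

if __name__ == "__main__":
    main()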
Jarvis believes it “would have been very difficult to work out the projector spec and placement without using d3 in previs. I was able to accurately design the projection system with complete confidence that we could deliver the required coverage. As so many of the surfaces in the show were curved, moving and overlapping, it was incredibly useful to be able to visualize the projection beam angles and other properties using d3’s footprints feature.”
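Even before a full 3D previs, the basic projector arithmetic is simple: a lens’s throw ratio is throw distance divided by image width, so distance and lens choice determine coverage and pixel density. The short Python sketch below works through that flat-screen calculation with illustrative numbers; it is not d3’s footprints feature, which extends the same idea to curved, moving surfaces in 3D.

def projector_footprint(throw_distance_m: float,
                        throw_ratio: float,
                        h_resolution: int = 1920,
                        aspect: float = 16 / 9):
    """Rough flat-surface footprint for a given lens and throw distance.

    throw_ratio = throw distance / image width, so:
        image_width = throw_distance / throw_ratio
    Pixel density (px/m) then indicates whether coverage is fine enough.
    """
    width = throw_distance_m / throw_ratio
    height = width / aspect
    px_per_m = h_resolution / width
    return width, height, px_per_m

# Illustrative example: a 1.8:1 lens at 12 m gives roughly a 6.7 m wide image.
w, h, density = projector_footprint(12.0, 1.8)
print(f"image ~{w:.1f} m x {h:.1f} m, ~{density:.0f} px/m")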
He calls d3’s QuickCal “a great feature that allowed us to quickly and accurately line up multiple projectors with ease. The ten projectors covering the Vortex surfaces were lined up to a specifically designed calibration object. This allowed us to use QuickCal to line up multiple curved surfaces, which individually had very few suitable calibration points, in one fast process.”
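QuickCal itself is proprietary to d3, but the underlying problem is the classic perspective-n-point calibration: given a handful of known 3D points on the set (or, here, a dedicated calibration object) and the pixel positions where each projector hits them, solve for the projector’s pose. The hypothetical sketch below shows that idea with OpenCV’s solvePnP, using made-up correspondences and lens values purely for illustration.

import numpy as np
import cv2

# 3D reference points on a (hypothetical) calibration object, in metres,
# and where each one lands in the projector's output raster, in pixels.
# All values below are invented purely to show the shape of the data.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.5, 0.5, 0.3],
    [0.2, 0.8, 0.6],
], dtype=np.float64)

image_points = np.array([
    [410.0, 820.0],
    [1510.0, 830.0],
    [1490.0, 260.0],
    [420.0, 250.0],
    [960.0, 540.0],
    [610.0, 330.0],
], dtype=np.float64)

# Treat the projector as a pinhole "camera": focal length and principal
# point come from its lens and raster (illustrative values here).
w, h = 1920, 1080
focal_px = 2200.0
camera_matrix = np.array([[focal_px, 0, w / 2],
                          [0, focal_px, h / 2],
                          [0, 0, 1]], dtype=np.float64)
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    rotation, _ = cv2.Rodrigues(rvec)
    # Projector position expressed in the calibration object's space.
    print("projector position:", (-rotation.T @ tvec).ravel())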
Ross liked how “d3 works with combinations of perspective maps and direct maps, so it was very fast to make content for The Masque section of the show and get a huge impact on the stage.”
d3’s ability to provide “up to 16 HD outputs from one server in such a compact size” was a definite advantage for the production, as was “the option of configuring the outputs as required with the SDI VFC cards,” says Jarvis.
According to Griffin, “d3 was key in helping us bring the production to fruition,” especially when d3 and Vicon teamed up to create a plug-in for the PosiStage.Net protocol. The plug-in enabled the video and lighting systems to share the same tracking data from the Vicon camera system.
“We couldn’t have done the show without the support of d3, both in pre-production and during the tech rehearsal,” Jarvis agrees. “Tom Whittock developed the app for PosiStage.Net, and we received on-site support from Christian Dickens during tech.”
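PosiStage.Net (PSN) is an open protocol for streaming positional tracking data over UDP multicast, which is what allowed the Vicon-derived positions to feed both the lighting and video systems. To give a rough sense of what a PSN consumer looks like, the Python sketch below joins the protocol’s default multicast group and pulls tracker positions out of incoming data packets. The constants and chunk layout follow the publicly available spec as best the author recalls and should be verified against the official documentation at posistage.net; this is an independent illustration, not the plug-in d3 and Vicon built for the show.

import socket
import struct

# Defaults from the open PosiStageNet spec (treat as assumptions to verify).
PSN_MULTICAST_GROUP = "236.10.10.10"
PSN_PORT = 56565
PSN_DATA_PACKET = 0x6755
PSN_DATA_TRACKER_LIST = 0x0001
PSN_DATA_TRACKER_POS = 0x0000

def read_chunks(buf, offset, end):
    """Yield (chunk_id, has_subchunks, payload) for each chunk in buf[offset:end]."""
    while offset + 4 <= end:
        header = struct.unpack_from("<I", buf, offset)[0]
        chunk_id = header & 0xFFFF
        data_len = (header >> 16) & 0x7FFF
        has_sub = bool(header >> 31)
        start = offset + 4
        yield chunk_id, has_sub, buf[start:start + data_len]
        offset = start + data_len

def tracker_positions(packet):
    """Extract {tracker_id: (x, y, z)} from a PSN data packet, if present."""
    positions = {}
    for root_id, _, root_payload in read_chunks(packet, 0, len(packet)):
        if root_id != PSN_DATA_PACKET:
            continue
        for sub_id, _, sub_payload in read_chunks(root_payload, 0, len(root_payload)):
            if sub_id != PSN_DATA_TRACKER_LIST:
                continue
            for tracker_id, _, fields in read_chunks(sub_payload, 0, len(sub_payload)):
                for field_id, _, data in read_chunks(fields, 0, len(fields)):
                    if field_id == PSN_DATA_TRACKER_POS and len(data) >= 12:
                        positions[tracker_id] = struct.unpack_from("<3f", data)
    return positions

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PSN_PORT))
    membership = socket.inet_aton(PSN_MULTICAST_GROUP) + socket.inet_aton("0.0.0.0")
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    while True:
        packet, _ = sock.recvfrom(1500)
        for tid, (x, y, z) in tracker_positions(packet).items():
            print(f"tracker {tid}: x={x:.3f} y={y:.3f} z={z:.3f}")

if __name__ == "__main__":
    main()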
The d3 system “performed robustly and reliably for the whole run in Stratford – 82 performances, which, although expected from my point of view, is still pretty impressive given how hard we were pushing the tech,” Griffin reports.
“The Tempest” will continue this summer at London’s Barbican Theatre.
At the RSC, Gregory Doran directed “The Tempest” and Stephen Brimson Lewis was the production designer. Ingi Bekk handled the grandMA programming and multiple systems integration; Sam Jeffs handled d3 systems engineering. Tawny Schlieski was Director of Research at Intel, and Ben Lumsden was Head of Studio at The Imaginarium Studios.