Many media servers feature a control channel for Media Play Speed. How does this feature affect your content? Have you ever used it? Well, if you have, you will very quickly know whether or not that media server uses frame interpolation, also called frame blending or video smoothing. When a piece of content is created, it is rendered at a specific speed in frames per second, or FPS. Typically, that value will be 15, 25, 29.97 or 30 FPS, depending on the format of the media, the codec and even the hardware being used for playback. If a piece of video content is rendered at 30 FPS, what happens when you use the Play Speed control channel and slow the movie down? That actually depends on the software and media server. When a piece of 30 FPS content is played back at its rendered speed, all will appear normal, and each frame will flow cohesively into the next. But when that same piece of content is played back at 15 FPS (overriding the content’s rendered frame rate via the Play Speed control channel), it is playing at half of its rendered speed, and it can appear “jerky” or “choppy” because the footage has been time-stretched. That is, unless the software can “fill in the missing frames,” which is exactly what frame interpolation does.
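To put rough numbers on that, here is a tiny sketch of naive half-speed playback without interpolation. The 30 FPS clip and the factor-of-two slowdown are assumptions for illustration, and plain integers stand in for actual video frames.

```python
# Naive half-speed playback by frame repetition, assuming a 30 FPS clip.
# Integers stand in for actual video frames.
source_frames = list(range(30))  # one second of content at 30 FPS

# At half speed, each source frame is simply held for two output frames,
# so the same 30 unique images are stretched across two seconds.
half_speed = [frame for frame in source_frames for _ in range(2)]

print(len(half_speed))   # 60 output frames, but still only 30 unique images
print(half_speed[:6])    # [0, 0, 1, 1, 2, 2] -- the repeats are what reads as "choppy"
```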
How Does Frame Interpolation Work?
Frame interpolation is the process of creating intermediate video frames based on the data in two consecutive frames of encoded video. Technically, new pixel values are computed by mixing source pixels from the current frame with source pixels from previous or future frames. Basic frame blending uses this mix to compute the intermediate pixels and to produce anti-aliased results in the render. In effect, frame interpolation increases the frame rate of encoded video at the time of decoding. The content itself is still rendered with a codec (compression/decompression scheme) as usual; the decoders in the media server’s software compare the information in consecutive frames of the movie and interpolate the differences between them, thereby filling in what is missing. The algorithms applied by the decoders estimate and compensate for the motion between frames, which creates smoother motion at slower playback speeds. These algorithms do not require any special encoding options, which means they add no overhead to the content and won’t make your files any larger.
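To make the idea concrete, here is a minimal sketch of basic frame blending in Python with NumPy. The function name, the simple per-pixel weighted average and the 8-bit RGB frame format are all assumptions for illustration; the motion-compensated interpolators inside real media servers are considerably more sophisticated.

```python
import numpy as np

def blend_frames(prev_frame: np.ndarray, next_frame: np.ndarray, t: float) -> np.ndarray:
    """Create an intermediate frame between two consecutive frames.

    t is the position of the new frame: 0.0 returns the previous frame,
    1.0 returns the next frame, and 0.5 is an even mix of the two.
    Frames are assumed to be 8-bit RGB arrays of shape (height, width, 3).
    """
    prev_f = prev_frame.astype(np.float32)
    next_f = next_frame.astype(np.float32)
    mixed = (1.0 - t) * prev_f + t * next_f      # per-pixel weighted average
    return np.clip(mixed, 0, 255).astype(np.uint8)
```

A decoder doing frame blending would compute something like this for every gap it needs to fill, which is also why the process adds nothing to the size of the encoded file: the extra frames exist only at playback time.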
In the lighting world, we are very accustomed to being able to increase or decrease the speeds of our effects with a control channel without compromising the smoothness of the effect. Rotating a gobo is just one example; pan & tilt are others. Remember when you would program an 8-bit pan/tilt fixture on a DMX console and try using a really slow fade time? The result made lighting designers cringe. In response to that feedback from designers, the manufacturers of intelligent lighting fixtures soon doubled the number of pan & tilt channels and increased the resolution of a pan/tilt crossfade to 16-bit, and we suddenly went from a mere 256 steps of resolution in a crossfade to 65,536. Once that change occurred, pan and tilt smoothness during a slow crossfade quickly became the signature of a quality automated lighting fixture, and all manufacturers followed suit by offering full and reduced resolution modes for their fixtures.
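For readers who like to see the math, here is a quick sketch of why that change mattered. The values are hypothetical, but combining a coarse and a fine channel this way is how 16-bit DMX parameters are normally built.

```python
def combine_16bit(coarse: int, fine: int) -> int:
    """Combine an 8-bit coarse channel and an 8-bit fine channel into one value."""
    return (coarse << 8) | fine   # yields 0-65535 instead of 0-255

# An 8-bit fixture can only land on 256 pan positions, so a slow crossfade
# visibly steps from one position to the next. With a fine channel, there
# are 256 additional positions inside every single coarse step.
print(combine_16bit(127, 0))     # 32512
print(combine_16bit(127, 255))   # 32767 -- 256 positions within one 8-bit step
print(combine_16bit(128, 0))     # 32768 -- the next coarse step
```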
Now, here we are in a lighting world that is quickly converging with the video world, and we lighting designers and programmers expect the same results from our digital lighting fixtures that we already get from our automated lighting fixtures. Thus, manufacturers of media servers are being pressed to make improvements to match those expectations. A media server that offers frame blending can generate higher quality slow-motion video because it “inserts” newly interpolated frames into the gaps between frames, which we perceive as smoother motion. This is where companies like Green Hippo are leading the way with the latest version of the Hippotizer.
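As a rough sketch of what that “inserting” looks like, the snippet below stretches a clip to a slower speed and blends a new frame into every gap. The list-based clip, the default factor of two and the simple cross-dissolve are all simplifying assumptions; this is not any particular manufacturer’s algorithm.

```python
import numpy as np

def slow_motion(frames: list[np.ndarray], factor: int = 2) -> list[np.ndarray]:
    """Stretch a clip to 1/factor speed, blending new frames into each gap."""
    output: list[np.ndarray] = []
    for a, b in zip(frames[:-1], frames[1:]):
        output.append(a)
        for step in range(1, factor):
            t = step / factor
            # A weighted average of the two neighbouring frames fills the gap.
            mixed = (1.0 - t) * a.astype(np.float32) + t * b.astype(np.float32)
            output.append(np.clip(mixed, 0, 255).astype(np.uint8))
    output.append(frames[-1])
    return output
```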
I had the opportunity to sit with the developers of the Hippotizer while at LDI in October to check out their latest version of software, v3. This version of their media server has an extremely well designed frame blending feature that makes content appear remarkably smooth at really low frame rates, which makes it an incredibly powerful digital lighting tool. It is also one of the first digital lighting media servers to use a frame interpolation technique to produce these impressive results. In fact, it does interpolation so well that it is next to impossible to tell that the content was not created at the lower frame rate. I believe that all media servers will need to perform at this level if they want to be competitive in the pro market, because products like the Hippotizer, with its ease of use and powerful playback capabilities, will keep raising the bar. Advancements in technology and hardware, along with software that takes advantage of those improvements, continue to increase the performance capabilities of our media servers, and they will only continue to shape the future of our digital lighting world.
Vickie Claiborne (www.vickieclaiborne.com) is an independent programmer and training consultant, and can be reached at vclaiborne@plsn.com.