Obsidian Unveils the DYnamic Lighting Operating System on the ONYX Console
Obsidian has been working hard on their ONYX line of lighting consoles. After a year of seeing prototypes of their DYLOS software in action at trade shows, it was a pleasure to sit down with Matthias Hinrichs of Elation for a quick demo of what this new software is capable of doing. I’m happy to report it is simple to learn and easy to execute, just like the console.
What is DYLOS?
The easiest way to explain this software might be to call it a built-in media server, but in actuality, it wasn’t designed to play back media content to an LED screen or projector. It’s more of a Pixel Composer, whose purpose is to map video files to the color systems of lighting fixtures as a way to quickly program dynamic lighting effects. This may be a new take on an old concept, but DYLOS takes it to a new and much higher level.
Nowadays, with the addition of LED pancake lights, battens, panels and profiles that do not rely on color flags, fixtures can change color as fast as any video wall. That’s where this software really excels. The timing has never been better for consoles to be able to tell any light fixture it can be a pixel, or, in the case of multi-celled fixtures, many pixels. A designer can now take a whole wall of lights, point them straight downstage and painlessly create a low-res video wall (or supplement an actual one). The ONYX console maps any specific section of fixtures on that rear wall to an area of pixels in a video file.
An example may make how DYLOS works easier to understand. Suppose you have an LED wall upstage, and you surround the perimeter with 100 Elation 360-i moving heads. You decide you’d like an American flag waving on the screen. But to make it all look cooler, what if you could zoom that image out so that the lights surrounding the flag are waving in red, white and blue as well? You will have combined the video and lights to form a single visual spectacle.
While one day the ONYX may be able to feed video walls itself, at the moment Obsidian is calling this function a Pixel Composer. The reason is that the DYLOS system doesn’t need the user to import any media files (though they can); users are encouraged to easily make their own utilizing the unique software.
Applying these movie files to lights allows users to build unbelievable chases in seconds, ones that cannot be made by any effects engine and would take hours and hundreds of steps to build with conventional programming.
Building Your Own Library
Nothing has to be installed on the console, as DYLOS is included in the latest software. The best way to think of DYLOS is as its own lighting fixture (or five, for that matter), the same way you would think of a media server. The media library ships with 1,100 royalty-free video files from the factory. The files are sorted by basic color schemes, such as red or green folders, as well as monochromatic. The user can add any file of their own, and the ONYX formats it for use. The ONYX will accept almost anything: .mov, .avi and .mpeg video files, as well as .jpg, .png and most other still images.
To lay out your pixel map, the user has to build a zone. This zone represents a particular group of fixtures you would like to include in the pixel mapping process. The aerial rig could be a zone, and the side torms could be another. The floor lights, or the whole rig, for that matter, can be a zone. You are currently allotted five zones per show. Lights can be in more than one zone; that doesn’t matter. A video file must have a zone attached for playback to function.
To map each light fixture to the section of a video file you wish it to mimic, DYLOS utilizes 2D Layout views (a monitor view resembling a light plot). The pixel mapping is not precise, as one might expect when using an Image Pro to nudge an image a few pixels in one particular direction. Instead, the user estimates the relative fixture placement on the 2D map and surrounds it with a box to designate which lights will be in that zone. The user can then put an image up in a zone over the layout view and, while viewing the actual rig with the image shining, slide a fixture around in the layout view until the mapping looks aligned. Remember that this is low-res video, so precision placement in a zone is not demanded; it is intentional that the user cannot assign a light to one particular pixel.
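For readers who think in code, the nearest-pixel lookup described above can be sketched in a few lines of Python. This is purely illustrative; the function and data names are hypothetical, not Obsidian’s actual implementation.

```python
# Hypothetical pixel-composer lookup; names are illustrative only.
def sample_fixture_color(frame, fixture_x, fixture_y):
    """Map a fixture's normalized 2D-layout position (0.0-1.0) to the
    nearest pixel of a low-res frame (a grid of RGB tuples)."""
    rows, cols = len(frame), len(frame[0])
    # Nearest-pixel lookup: precise placement is unnecessary because
    # the content is low-res.
    col = min(int(fixture_x * cols), cols - 1)
    row = min(int(fixture_y * rows), rows - 1)
    return frame[row][col]

# A 2x2 "frame": red, green on top; blue, white below.
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
print(sample_fixture_color(frame, 0.9, 0.1))  # -> (0, 255, 0)
```

Every fixture inside a zone’s box simply reads whichever pixel sits under its estimated position, which is why nudging a fixture in the layout view is all the alignment the system needs.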
Playback and Manipulation of Video
Once mapped, one controls the media sent to the light fixtures the same as on any media server. Intensity can be controlled like a fixture’s, as can the contrast, hue and saturation of the image. The fixture will always play back the image the way the user has manipulated it in the DYLOS cue itself. Once a fixture is placed in pixel map mode, the programmer no longer has control of the individual color flags or color diodes via the usual attribute wheels. If a user wished for an image to have more red than normal, they would adjust the attribute on the zone as opposed to the fixture’s attribute wheels.
Zooming an image works as you would imagine. If one zooms out far enough, your whole light rig may be just a few pixels wide, with many lights operating as the same pixel. Likewise, if you zoom the image down smaller, you may inadvertently shutter off lights that no longer sit over any pixels. Panning left and right can take the image off center. Users can also adjust the image’s aspect ratio by stretching just the X or Y coordinate of the image.
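Zoom and pan can be pictured as a simple coordinate transform applied before the pixel lookup. Again, this is a hypothetical sketch rather than the console’s real math: fixtures whose transformed position falls outside the image receive no pixel and shutter off, matching the behavior described above.

```python
def sample_with_zoom(frame, fx, fy, zoom=1.0, pan_x=0.0, pan_y=0.0):
    """Hypothetical zoom/pan transform applied before the pixel lookup.
    zoom > 1 magnifies the image (many fixtures share one pixel);
    zoom < 1 shrinks it, so edge fixtures fall off the image entirely."""
    # Transform the fixture's normalized position into image space,
    # zooming about the image center, then offsetting by the pan.
    u = (fx - 0.5) / zoom + 0.5 + pan_x
    v = (fy - 0.5) / zoom + 0.5 + pan_y
    if not (0.0 <= u < 1.0 and 0.0 <= v < 1.0):
        return None  # no pixel under this fixture: shutter closed
    rows, cols = len(frame), len(frame[0])
    return frame[int(v * rows)][int(u * cols)]

frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
# Shrunk image: a fixture at the edge of the rig no longer sees a pixel.
print(sample_with_zoom(frame, 0.0, 0.5, zoom=0.4))  # -> None
```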
Users can rotate the image in any direction or spin it continuously. The files can loop, bounce back and forth, and repeat forwards or backwards. But one thing to remember: the programmer cannot use any of the fixture’s color encoders while playing video content through that fixture. DYLOS alone talks to the color system of a moving light, commandeering the color flags while it’s playing.
Monochromatic images such as black and white vertical lines can be scrolled across the zone to create an intensity chase in moving lights, just by using the color system. When DYLOS encounters fixtures that have no color attributes, it translates the color information into intensity values. Much as a black and white TV would, a yellow would appear as a brighter light grey, whereas a blue would translate to a darker image on that screen. Since the intensity varies with the saturated color, you could still pixel-map a wall of PAR cans, with the color of the playing video transferred into varying intensities.
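The black-and-white-TV behavior described above corresponds to a standard luma conversion. Here is a hedged sketch using the common Rec. 601 weights; the actual weighting DYLOS applies is an assumption on my part.

```python
def color_to_intensity(r, g, b):
    """Convert an RGB pixel (0-255 channels) to a 0.0-1.0 dimmer level
    using the Rec. 601 luma weights -- the same weighting a black-and-
    white TV effectively applies. DYLOS's exact weights are a guess here."""
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    return luma / 255.0

# Saturated yellow reads noticeably brighter than saturated blue:
print(round(color_to_intensity(255, 255, 0), 3))  # -> 0.886
print(round(color_to_intensity(0, 0, 255), 3))    # -> 0.114
```

This is why a colorless rig of PAR cans can still mimic a playing clip: the clip’s hues simply become a chase of brightness levels.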
Users have the ability to add an effect to the video as well. By this, I am talking about the canned effects that come with the ONYX. For instance, one may wish to “tile” the image so multiple cells of the same file appear on your view. The image can be tiled into groups horizontally, vertically or both. The user can make the effect a line or a square, for that matter, and then size it individually until it looks good to the eye. The best results, it seems, come after users have a chance to play around with the attributes while looking at the actual light rig until they are satisfied with what they have created. The effect can be adjusted via playback speed or direction, or one could put an effect such as a tilt sine wave on the image.
Each DYLOS Preset can hold the source (the image) as well as two effects slots. By effects slots, I am referring to a color or tiling effect being added via a separate encoder on the console. Each Preset can also include a mask to black out parts of the video you may not wish to be mimicked by certain lights. There are an additional four parameter encoders for every effect to adjust the chosen attribute. There are so many different ways to control the effects. The user can, for example, take a particular image and break it into different color zones, and DYLOS will shift the color around while the clip is playing live. Say you have a stock flame image playing. You can tell DYLOS to take any part of the image that is black and turn it green. Where the fire is red, make it blue. Where it’s yellow, make it magenta. We have easily taken the clip from one color of fire to a new imaginary one.
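The flame recoloring described above amounts to substituting one color for another wherever a pixel falls near a designated source color. Here is a minimal, hypothetical sketch using a simple RGB distance threshold; Obsidian’s actual matching algorithm is not documented here.

```python
def remap_color(pixel, zones):
    """Replace a pixel's color with a substitute when it falls within a
    simple RGB distance of one of the designated source colors."""
    for src, dst in zones:
        if sum((a - b) ** 2 for a, b in zip(pixel, src)) < 90 ** 2:
            return dst
    return pixel

# The flame example: black -> green, red -> blue, yellow -> magenta.
zones = [((0, 0, 0),     (0, 255, 0)),
         ((255, 0, 0),   (0, 0, 255)),
         ((255, 255, 0), (255, 0, 255))]
print(remap_color((250, 10, 5), zones))  # near-red pixel -> (0, 0, 255)
```

Applied per pixel to every frame as the clip plays, this kind of substitution turns one color of fire into an entirely imaginary one without touching the source file.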
Once a video is modified and created, it can be saved as a preset/palette like any other attribute on the console. This can then be copied and manipulated in another cue multiple times. But be warned: the ONYX does not offer partial merges of shows. Users can go crazy and make as many custom videos as they wish, but their work will stay with that particular show file. Of course, one could always start a new show file from an old one, then delete all the cuelists and repatch a new show with all these video files still intact.
The DYLOS is capable of running any text across a group of fixtures. Say one has a row of Elation CuePix i16 fixtures (a 16-cell square static block of LEDs) strung closely together across a truss. One simply types in the text to display and applies it statically, or scrolls it across all the fixtures and loops it. The system uses its own generic font, but users can import their own text files. Users can also assemble their own library of text samples.
The DYLOS comes with premade generators. While this is not a particle generator per se (yet), it enables the user to create video files from scratch without building off a preexisting media file. Obsidian encourages users to make their own video files from scratch, and this function offers a good starting point.
In the end, we have a Pixel Composing engine that is programmed like a media server. The DYnamic Lighting Operating System is intuitive, quick to program and opens up a whole new approach to adding intricate lighting effects into cues quickly. Make note, DYLOS is just going to get better from here.
For more information, visit www.obsidiancontrol.com/dylos.