What is a “Media Server?”
Put them together, and the two simple words, “Media” and “Server,” can mean different things to different people, often resulting in a bit of confusion,…
Training
This question comes up all the time, “How do I get started with media servers?” Well, the short answers are training and practice. Practice is…
Video Over IP: Is NDI the Solution?
At Prolight+Sound 2018, I found myself engaged in several discussions about the need for a way to take the output from a media server and…
Mixing Visuals Live
The term VJ (a.k.a. Video Jockey) primarily refers to any type of show where the visuals being played are created in real-time through combining image…
When you are given a project that requires a variety of images and clips to be displayed on LED walls with unusual orientations, how do you prepare for programming? I recently encountered this scenario, and while working out the details I found a solution to this very specific situation that I think is worth sharing.
As pixel mapping becomes increasingly commonplace in the lighting world, and LEDs are incorporated into 3D configurations to create scenic objects like chandeliers and cubes, programming them becomes more challenging. Instead of looking at pixels in a flat 2D “plan” view, we now need to think about pixels from all sides and angles.
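As a rough sketch of that shift, here is a minimal Python example. The cube of pixels, the projection choices, and the gradient content are all invented for illustration; this is not how any particular console or media server actually maps pixels.

```python
import numpy as np

# A hypothetical cube of LED pixels, each defined by a 3D position
# (an 8 x 8 x 8 grid invented purely for illustration).
side = np.linspace(0.0, 1.0, 8)
xs, ys, zs = np.meshgrid(side, side, side, indexing="ij")
pixels_3d = np.stack([xs.ravel(), ys.ravel(), zs.ravel()], axis=1)  # (512, 3)

def project_to_uv(points, view):
    """Flatten 3D pixel positions into 2D texture coordinates for one viewing angle."""
    if view == "plan":    # looking straight down: keep x and y
        return points[:, [0, 1]]
    if view == "front":   # looking from the front: keep x and z
        return points[:, [0, 2]]
    raise ValueError(f"unknown view: {view}")

# A flat 2D map only ever uses the "plan" projection; a 3D map lets us
# sample the same content from any side of the object.
uv = project_to_uv(pixels_3d, view="front")

# Sample a 2D content frame (here a simple 256 x 256 gradient) at each pixel's UV.
content = np.tile(np.linspace(0, 255, 256, dtype=np.uint8), (256, 1))
cols = np.clip(np.round(uv[:, 0] * 255).astype(int), 0, 255)
rows = np.clip(np.round(uv[:, 1] * 255).astype(int), 0, 255)
intensities = content[rows, cols]   # one intensity value per LED pixel
print(intensities.shape)            # (512,)
```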
From the early days of media server technology for live events, it was common for a server to have only one or two video outputs. With the growing popularity of LED walls, multi-projector blends, and 4K content, however, the number of video outputs required from a server has grown; in the case of the d3, as many as 16 outputs in a single server are now needed to accommodate the demands of today’s productions.
A Look at Vidvox.net’s Open Source Hap Codecs, and How They Work
I’ve written about codecs in the past, and generally speaking they all do the same job: compress and decompress video frames. With new ones continually being created to address specific performance-related needs, it isn’t reasonable to say definitively that “this one is best” without taking into account external factors like hardware, application needs, etc.
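To make that “same job” concrete, here is a minimal round-trip sketch in Python using OpenCV. The MJPG codec, file name, resolution, and frame count are arbitrary choices for illustration (and notably not Hap), not a recommendation of any particular codec.

```python
import cv2
import numpy as np

# Encode: write a few synthetic frames with one chosen codec (fourcc).
fourcc = cv2.VideoWriter_fourcc(*"MJPG")   # swapping the fourcc changes the codec
writer = cv2.VideoWriter("roundtrip.avi", fourcc, 30.0, (640, 480))
for i in range(30):
    frame = np.full((480, 640, 3), i * 8, dtype=np.uint8)  # simple grey ramp
    writer.write(frame)
writer.release()

# Decode: read the frames back; the codec's decoder reconstructs each one.
reader = cv2.VideoCapture("roundtrip.avi")
decoded = 0
while True:
    ok, frame = reader.read()
    if not ok:
        break
    decoded += 1
reader.release()
print(f"decoded {decoded} frames")
```

Swapping the fourcc is essentially all it takes to change codecs here, which is exactly why the interesting differences lie in performance trade-offs rather than in how the code looks.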
The concept of Worlds on a grandMA2 is very useful when programming media servers into a show that also includes moving lights, because Worlds can be used in various ways to filter attributes in and out during programming and playback. I will admit, however, that using Worlds in this way was not intuitive at first, given that we have other means of isolating fixtures and attributes, namely Filters and Masks. But once I started exploring further, I realized Worlds could be used in several ways I hadn’t thought about before.
Notch is fresh. Notch is exciting. I would go so far as to say that Notch is the future of media server playback/creation. Here are a few reasons why.
On a recent project, I had the opportunity to use the Bitmap Fixture Engine in a grandMA2 console and, as a result, I discovered an easy new way to pixel-map. Here are some of the reasons why I’ll be using it again in the future.
The annual LDI tradeshow, one of the largest lighting and production equipment conventions in North America, has become, for many of us in the lighting, video and production industries, a place to see old friends, make new friends, see new products and develop new business.