Notch is fresh. Notch is exciting. I would go so far as to say that Notch is the future of media server playback/creation. Here are a few reasons why.
What It Is
Notch is a node-based, real-time compositing software package that can be used either as a stand-alone media server or in conjunction with other media servers like d3 vR14, Hippotizer v4.2, Avolites Ai v9.1 and 7th Sense Delta Media Server. It can be used to create and play back generative effects in real time, letting users create and edit HD content in a node-based environment.
How It Works
Node-based media servers allow the user to create compositing chains, linking together elements to create a scene. Regardless of the type of compositing chain being created, Notch scenes begin with a Root Node that links the chain to the output. The next step is to drop in a node of some type, and that's where the options begin.
I followed along with the Basic Video Effects tutorial on the Notch support website (go to plsn.me/NotchBasic). I was quickly able to create a simple scene with a video source node, seen in Fig. 1. Then, I was able to easily create a much more interesting scene by adding Image Processing nodes for Key Color Mask, Blur and Color Correction. The final combined result, which strips out the color and sharpens the edge of the original movie clip, can be seen in Fig. 2.
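To make the idea of a compositing chain concrete, here is a minimal sketch of that tutorial scene, written in Python purely as an illustration. Notch graphs are built in its GUI rather than in code, and the class, node names and parameter values below are stand-ins, not Notch's actual API.

# Illustrative only: models the idea of a compositing chain, where each
# node takes the previous node's output as its input.
class Node:
    def __init__(self, name, **params):
        self.name = name
        self.params = params      # e.g. blur amount, key color
        self.inputs = []          # upstream nodes feeding this one

    def connect(self, upstream):
        self.inputs.append(upstream)
        return self               # allow chaining

# The chain from the Basic Video Effects tutorial:
# Video Source -> Key Color Mask -> Blur -> Color Correction -> Root (output)
source = Node("Video Source", clip="movie_clip.mov")        # placeholder clip
key    = Node("Key Color Mask", key_color=(0, 255, 0)).connect(source)
blur   = Node("Blur", amount=2.0).connect(key)
cc     = Node("Color Correction", saturation=0.0).connect(blur)
root   = Node("Root").connect(cc)   # the Root Node links the chain to the output

However crude, the sketch captures the workflow: start from the Root, drop in a source, then stack Image Processing nodes until the look is right.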
Node Builder
Notch's Node Builder contains a familiar set of compositing tools such as Color Grading, Edge Detect, Screen Warp and Motion Blur. The GUI (graphical user interface) of the Node Builder also has a familiar look and feel, especially for users of media editing software like Adobe Photoshop and/or After Effects. This is very helpful when you're a Notch newbie like me.
Interactivity
Because Notch content is rendered in real time on the GPU, it can be used to add effects to a live video capture. It can also be configured to respond to live stage data, so it can be triggered through audio, MIDI or automation.
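As a rough illustration of the MIDI side of that triggering, here is a hedged sketch using Python's mido library. The mapping of a note's velocity to an effect intensity is hypothetical; in practice the routing to a Notch parameter would go through the media server or show-control layer sitting in between.

# Hedged sketch: listen for MIDI and map it to a normalized value that a
# show-control layer could pass on to a Notch parameter. Requires a MIDI
# backend (e.g. python-rtmidi) and an attached controller.
import mido

with mido.open_input() as inport:          # default system MIDI input
    for msg in inport:
        if msg.type == "note_on":
            level = msg.velocity / 127.0   # normalize 0-1
            print(f"trigger effect, intensity {level:.2f}")  # hand off downstream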
Notch + Third Party Technology
As if Notch weren't powerful enough on its own merits, the software can be combined with third-party event technology like Cast's BlackTrax and media servers like d3 for applications involving projection mapping on moving objects. In a demo video featuring an application by Sweden-based Mediatec Solutions (mediatecgroup.com), the software is used to create and play the content being projection-mapped onto an object while it is moving. This real-time compositing is powerful, and its rendering speed and flexibility increase the creative potential of the project (go to plsn.me/NotchDemo).
As can also be seen in this video, Notch’s video compositing tools can be used to create impressive 3D perspective, lighting effects and illusions on moving 3D objects. How is this achieved? The shadowing of an object can be manipulated to create changes in perspective as the camera and/or object moves. In addition to controlling shadowing, Notch can also be used to create the virtual lighting of the object while it is moving, which results in a more uniform balance between lighting and video in a scene where projection is used. And since Notch can be configured for control via Art-Net or OSC, the lighting programmer can have live control over lighting and any other parameter of a node in the Notch scene.
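On the OSC side, here is a minimal sketch, using Python's python-osc library, of what sending a parameter value toward a Notch host might look like. The host address, port and OSC address pattern are placeholders; the real address depends entirely on how the property is exposed in the Notch project and on the media server's OSC setup.

# Minimal sketch of sending OSC toward a Notch playback machine.
from pythonosc.udp_client import SimpleUDPClient

NOTCH_HOST = "192.168.1.50"   # placeholder: machine running Notch playback
NOTCH_PORT = 9000             # placeholder: OSC listening port

client = SimpleUDPClient(NOTCH_HOST, NOTCH_PORT)

# Fade the virtual light's intensity from a console cue, a show-control
# script, or anything else that can speak OSC.
client.send_message("/notch/light/intensity", 0.75)  # placeholder address

The same idea applies to Art-Net: the programmer patches the exposed Notch parameters like any other fixture attributes and rides them live from the console.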
Generative Content
Through generative Particle and Clone effects like Snow and Rain, tracking effects like Glitter and Sparks, and a plethora of visual effects including Edge Detect, Distortion, Ripples and Color Correction, to name a few, users can create a wide variety of content in real time. I followed along with the Cloners-Text-Explosion tutorial and was easily able to create the exploding text seen in Fig. 3.
VR & Stereo Video
While there did not appear to be any VR tutorials available online when I wrote this article, the information on the creation process states that the user can create monoscopic and stereoscopic 360 videos in the Node Builder and then render them out within the same application. It will be interesting to explore this in more detail, as I've found that it can be quite costly and time-consuming to have custom 3D content created. Notch also appears to fit easily into the workflow for creating VR content, with current support for Oculus VR headsets and others on the way.
Getting Started
For users with a playback license who need some assistance getting a project started, Notch support makes free source projects available, covering effects for live video, tracking and generative content. Even without a playback license, I was able to download the Node Builder application and begin exploring the software. And exploring is what I will be doing for a while! With an endless number of combinations of nodes and effectors, the only issue seems to be deciding what I want to create…
For a free trial and tutorial info, go to www.notch.one. To put this new tool to work, the website prices Notch Builder at 2,500 Euros, with a 1,750-Euro annual renewal fee. A 99-Euro “Personal Learning” edition is also available.