Tracking and Automation Transforming Live Events Production

Real-time tracking in live event production lets show designers follow objects and people on stage, interact with media content, localize immersive audio, and automate the many integrated technologies in a show, including lighting, projection, media content, immersive audio, special effects, and moving scenery. All of these technologies can communicate with and respond to the same tracking data.

Tracking keeps these technologies in sync with each other and with the performers. Performers gain freedom on stage: they no longer have to hit marks at specific times, because tracking lets triggers and cues be sent to every integrated technology directly from a performer's movement or position in space. This streamlines the production process, tightens the synchronicity between talent and technology, and, better yet, enables dynamic automation.

What is Automation?

Automation involves integrating machines, bots, scripts, algorithms, or any data-processing entity to generate various outcomes systematically, based on predefined rules and workflows. Its impact is pervasive in our daily lives, as machines take over repetitive tasks once carried out by humans or tasks that require the precision and efficiency inherent in machines.

Our digital landscape has changed as automated tasks are carried out without human intervention. The future is being painted logically, in binary, leaving little room for error. Automation performs a task repeatedly and without human emotion; there is zero distraction, and the result is predictable every time. This increases productivity across the board and earns more money in less time.

As humans, we all know that mistakes are part of the learning process. But what happens when there is not enough time for trial and error? Tracking technologies and automation can be a major help in such situations because they deliver precise results every single time.

As demands on employees rise, it is tough for humans to keep up with the vast and complex web of tasks at hand, let alone complete them quickly and without errors.

Automation & Entertainment

Business is business, and time is money. It is no wonder automation has made its way into the spotlight. Automation is revolutionizing entertainment, opening up a whole new world of possibilities in the industry. With its help, we can create experiences that elevate what audiences expect from their favorite shows – faster production times and more efficient use of resources create opportunities to explore designs and ambitions never seen before!

It’s exciting to explore the various methods of creating captivating experiences through automation in entertainment. This includes precision movements of projections and fly systems, synchronized pyrotechnics, scene-enhancing scrims, automated follow spots that adapt to human behaviors, and real-time projection mapping on moving objects and scenery.

Real-time positional data and automation are crucial for interactive experiences. To build the logic for an automated event, such as a moving light following a performer, we must know each performer's position on stage, their unique ID, their position relative to each lighting position, and the orientation and properties of the lighting fixture itself.
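To make that relationship concrete, here is a minimal Python sketch that converts a performer's tracked position into pan and tilt angles for a fixture aimed at them. The coordinate system, function name, and example numbers are our own illustrative assumptions, not BlackTrax's API or any vendor protocol, and a real rig would also have to account for the fixture's mounting orientation.

```python
import math

def pan_tilt_to_target(fixture_pos, performer_pos):
    """Compute pan/tilt angles (degrees) that aim a fixture at a performer.

    Both positions are (x, y, z) in the same stage coordinate system,
    with z as height. Illustrative only: ignores mounting orientation.
    """
    dx = performer_pos[0] - fixture_pos[0]
    dy = performer_pos[1] - fixture_pos[1]
    dz = performer_pos[2] - fixture_pos[2]
    pan = math.degrees(math.atan2(dy, dx))                    # rotation around the vertical axis
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation above the horizontal
    return pan, tilt

# Example: a fixture rigged 8 m up, aiming at a performer downstage centre.
print(pan_tilt_to_target((0.0, -4.0, 8.0), (1.5, 3.0, 1.7)))
```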

What may seem like a simple element to automate suddenly requires an intelligent, precise, and responsive tracking solution. Even then, the human eye and brain can still easily distinguish a mechanically planned movement from the smooth, reactive movement of a human operator. So how can we close that gap with automation technology? This is where prediction algorithms, machine learning, location data, and event triggers can help.

The ability to predict the movement of a performer or object is extremely important. Not only does it compensate for the mechanical disadvantages of slower or aging motors, it also allows smoothing and machine-learned, human-like behavior to be applied to any abnormal spikes before the output becomes visible to the human eye. Of course, this amount of low-latency processing and system intelligence requires state-of-the-art hardware and software to execute well.
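To give a feel for what such smoothing and prediction can look like, here is a simplified Python sketch that exponentially smooths noisy position samples on one axis and extrapolates slightly ahead to offset motor lag and processing latency. Real systems use far more sophisticated filters and learned motion models; the class name and parameters are purely illustrative.

```python
class SmoothedPredictor:
    """Smooth noisy 1-D position samples and extrapolate a short way ahead.

    alpha controls smoothing (0..1, higher = more responsive);
    lookahead_s is how far into the future to predict, e.g. to give
    a slower motor a head start. Run one instance per axis.
    """

    def __init__(self, alpha=0.3, lookahead_s=0.1):
        self.alpha = alpha
        self.lookahead_s = lookahead_s
        self.pos = None      # last smoothed position
        self.vel = 0.0       # estimated velocity

    def update(self, sample, dt):
        if self.pos is None:
            self.pos = sample
            return sample
        prev = self.pos
        # Exponential smoothing damps tracker jitter and abnormal spikes.
        self.pos = self.alpha * sample + (1 - self.alpha) * self.pos
        self.vel = (self.pos - prev) / dt
        # Linear extrapolation offsets motor lag and processing latency.
        return self.pos + self.vel * self.lookahead_s

# Example: feed one axis of tracker data at ~100 Hz (dt = 0.01 s).
pred = SmoothedPredictor(alpha=0.3, lookahead_s=0.1)
for x in (0.00, 0.05, 0.12, 0.18, 0.60, 0.30):   # note the spike in the samples
    print(round(pred.update(x, dt=0.01), 3))
```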

How to automate spontaneity?

The answer is a feature called Zones. Tracking allows for a Zone, or a predefined area in space, to be programmed to activate triggers and behaviors when interacted with.

BlackTrax’s Static Zone feature helps designers automate lighting changes based on a performer’s location in space.

Imagine being able to simply walk into a designated area, whether it maps to a real physical object or prop (e.g., a lectern or podium on stage) or to a virtually defined area (e.g., downstage left). Simply walking into or out of such a designated area, called a Zone, automatically triggers an event: a light turns on, a video effect starts, the show cuts smoothly to a different camera shot, and so on. The possibilities are endless.
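As a conceptual sketch of how an entry/exit trigger on a Zone might be modelled, here is a toy Python example using an axis-aligned box and callbacks. The class, names, and callback style are invented for illustration; they are not the BlackTrax Zone API.

```python
class StaticZone:
    """Axis-aligned box on stage that fires callbacks on entry and exit.

    A toy illustration of the idea behind Zone triggers; coordinates are
    (x, y, z) in a shared stage coordinate system, with z as height.
    """

    def __init__(self, name, min_xyz, max_xyz, on_enter=None, on_exit=None):
        self.name = name
        self.min_xyz, self.max_xyz = min_xyz, max_xyz
        self.on_enter, self.on_exit = on_enter, on_exit
        self.inside = set()  # IDs of trackables currently inside the zone

    def contains(self, pos):
        return all(lo <= p <= hi
                   for lo, p, hi in zip(self.min_xyz, pos, self.max_xyz))

    def update(self, trackable_id, pos):
        was_inside = trackable_id in self.inside
        now_inside = self.contains(pos)
        if now_inside and not was_inside:
            self.inside.add(trackable_id)
            if self.on_enter:
                self.on_enter(trackable_id, self.name)
        elif was_inside and not now_inside:
            self.inside.discard(trackable_id)
            if self.on_exit:
                self.on_exit(trackable_id, self.name)

# Example: light up the lectern area when any tracked performer steps into it.
lectern = StaticZone("lectern", (-1, 4, 0), (1, 6, 3),
                     on_enter=lambda tid, z: print(f"{tid} entered {z}: lights up"),
                     on_exit=lambda tid, z: print(f"{tid} left {z}: lights fade"))
lectern.update("performer_1", (0.0, 5.0, 1.7))   # enters the zone
lectern.update("performer_1", (3.0, 5.0, 1.7))   # leaves the zone
```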

Long gone are the days of worrying and hoping that a performer hits the right stage mark at the right time. The detected collision of any tracked performer or object with any Zone can serve as an automatic trigger; this is how tracking technology and data enhance and elevate automation.

Not only can you define a Zone’s physical or virtual location, you can also define its size, height, and elevation. This creates magical moments where, for example, an event triggers only when a performer lifts a hand above their head. Zones encourage creative expression by allowing tracked objects to interact with multiple layers, levels, and dimensions all at once.
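Continuing the same toy sketch from above, a Zone whose floor sits above head height only fires when a tracked wrist beacon is raised; the heights, names, and cue below are made up for illustration.

```python
# Reusing the StaticZone sketch above: a zone whose floor starts at 2.1 m
# (above head height), so it only fires when a wrist-worn beacon is raised.
raised_hand = StaticZone("raised_hand", (-5, 0, 2.1), (5, 8, 4.0),
                         on_enter=lambda tid, z: print(f"{tid} raised a hand: fire cue"))
raised_hand.update("wrist_beacon_1", (0.5, 3.0, 1.4))   # hand at waist height: nothing
raised_hand.update("wrist_beacon_1", (0.5, 3.0, 2.3))   # hand above head: event fires
```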

To elevate the concept of Zones even further, instead of fixing a Zone in a specific location, you can attach a Dynamic Zone to any tracked person or object, which we call a “trackable.” Wherever the trackable moves, the Zone surrounding it moves along with it. This lets trackables, whether people or objects, interact with each other, giving designers a new level of interactivity and automation between trackables and Zones, or between multiple Dynamic Zones interacting with one another.

BlackTrax’s Dynamic Zones colliding and triggering a change in camera shot.

Imagine two cameras holding close-up shots of two tracked individuals, with a Zone wrapped around each person. When the two individuals come into close proximity, the collision can automatically trigger one camera to zoom out to a wide shot covering both of them.
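As a simplified model of that interaction, the sketch below treats each Dynamic Zone as a sphere centred on its trackable and fires a camera cue when the two spheres intersect. The function, radii, and cue are illustrative assumptions, not a vendor API.

```python
import math

def dynamic_zones_overlap(pos_a, radius_a, pos_b, radius_b):
    """True when two spherical Dynamic Zones, each centred on a trackable, intersect.

    Positions are (x, y, z) in the same stage coordinate system; radii in metres.
    """
    return math.dist(pos_a, pos_b) <= radius_a + radius_b

# Example: cut to a wide two-shot when the two tracked hosts come within reach.
host_a, host_b = (2.0, 3.0, 1.7), (3.1, 3.4, 1.7)
if dynamic_zones_overlap(host_a, 0.8, host_b, 0.8):
    print("Trigger: zoom out to a wide shot covering both hosts")
```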

By creating immersive virtual spaces, we can unlock a new form of collaboration that marries artistry and engineering. We are laying the groundwork for the synchronization between our creative goals and technical capabilities.

Tracking and Automation continue to revolutionize workflows, saving time and money while keeping workers safe. As automation technology evolves, it will only become more integrated into our lives and workflows. With the help of artificial intelligence, we will be able to increase efficiency while reducing our dependence on manual input.

Key-light automation: Zone design for a 165′ wide and 55′ deep stage, where each Zone area is tracked by a different set of light fixtures.

 

Tracking and automation are here to stay – and they are only getting smarter.
