miligrab.blogg.se

Blend edges of surfaces millumin













Their brand of performance is part of a larger trend toward reactive sensors bringing projection mapping out of its infancy. Japanese artist Omote used facial-tracking tech to map reactive light "make-up" onto his subjects. Montreal's Maotik used a balloon to map stunning shapes onto the Society for Arts and Technology's massive observation dome. Back in 2012, Chris Milk used a similar concept to create The Treachery of Sanctuary, which turns onlookers into a shadowy winged beast, then lets them fly away by flapping their arms. Rather than feeling like artists experimenting with new tech for the sake of experimentation, these projects feel like they belong to auteurs savvily using the tools that exist to express themselves.

The Creators Project spoke to Adrien M and Claire B via email about their new work:

Adrien M & Claire B: Most of the video was captured in the beautiful Théâtre de l'Archipel in Perpignan, for the premiere on the 7th of October, but some views were also shot last week at the Hexagone Theater, the scène nationale Art-Science of Meylan (in the Grenoble area).

At the beginning, there is a dream: to make bodies float around space, and to join the images in the air. Then we started to look for technical solutions: mechanical suspensions for the bodies, and fluid movement for the graphic objects. We have been working on the project for two years, with around four months of rehearsal on stage in the co-producing theaters. We have also built up a strong body of references and stories about the imaginary of air, to make a show whose language is based on the combination of live video images, bodies in motion, and live music.

Can you describe the technical setup behind the project in your own words?

The set is inhabited by a three-face structure: two vertical panels of white gauze and a white dance floor are asymmetrically combined to create an immersive projection system. This « living light » is produced by video projectors and generated in real time by a set of algorithms. It is a mix of human interventions operated from the control room and data from on-stage sensors that outlines a precise writing of motions and generative behaviors. The motions are generated according to physical models, and therefore remind everyone of their own real-life experience and imaginary of motion.

Here is a quick preview of what you can achieve with such timelines. What is the magic trick? Download and open the following project: start the first part and, at some point, go to the next column. Now launch the second part and, when the « ghost » is on screen, launch the next column. The only difference between the two timelines is that the "Synchronized" option is unchecked. When the timelines are not synchronized, we can have smooth transitions. Well, let's switch to another case to explain all this.

1 / The Core Problem
Classic editing software, such as Premiere Pro or Final Cut, uses regular timelines with a « red cursor » representing the playhead. In Millumin, you will find the same concepts. But what happens in a real live situation? For our example, let's introduce a UFO, and Ben, representing an actor on stage: this is important, because Ben and the UFO don't have the same timing. Now we want the UFO to capture Ben just when he arrives, but we don't know exactly when that will be, because he's supposed to be an actor! For example, if Ben is a bit in a hurry and arrives at the « capture point » ahead of time: when you start the next segment of your timeline, a horrible jump-cut appears. If Ben arrives too early, there is a horrible jump-cut. No, we don't want a jump-cut here: what we want is a smooth transition! And we don't want to use a fade-in/fade-out transition, or to train for hours to get the perfect timing.

2 / Solution: use the Asynchronous Timeline
That's why we built the asynchronous timeline, a kind of time-relative timeline, allowing you to « postpone » a segment until the current one is finished.
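The asynchronous-timeline idea (remember the operator's "go to next column" request, but only take the cut once the current segment has finished) can be sketched in a few lines. To be clear, this is a toy model and not Millumin's actual API: the `Segment` and `Timeline` names, and the loop-until-jump behavior, are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    duration: float  # seconds

class Timeline:
    """Toy model contrasting a synchronized (immediate) jump with an
    asynchronous (postponed) one. Hypothetical names, not Millumin's API."""

    def __init__(self, segments, synchronized=True):
        self.segments = segments
        self.synchronized = synchronized
        self.current = 0           # index of the playing segment
        self.elapsed = 0.0         # seconds elapsed inside that segment
        self.pending_jump = False  # a queued "go to next column" request

    def launch_next(self):
        """Operator presses 'next column' (e.g. when the actor arrives)."""
        if self.synchronized:
            self._advance()         # regular timeline: jump now -> jump-cut
        else:
            self.pending_jump = True  # asynchronous: postpone the jump

    def tick(self, dt):
        """Advance playback by dt seconds."""
        self.elapsed += dt
        if self.elapsed >= self.segments[self.current].duration:
            if self.pending_jump:
                self._advance()     # take the cut exactly on a segment boundary
            else:
                self.elapsed = 0.0  # no request yet: loop the segment seamlessly

    def _advance(self):
        self.current = min(self.current + 1, len(self.segments) - 1)
        self.elapsed = 0.0
        self.pending_jump = False
```

With `synchronized=False`, calling `launch_next()` while the "fly in" segment plays does nothing visible; the switch to "capture" happens only when "fly in" reaches its end, so the transition always lands on a clean boundary.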


Instead of the martial-arts-like moves that stunned us in Hakanai earlier this year, collaborators Rémi Boissy, Farid-Ayelem Rahmouni, and Maëlle Reymond manipulate tornadoes, columns of smoke, and lively geometric shapes in real time. Unlike normal projection mapping, which relies heavily on pre-planning a show to fit every contour and crevice of a surface, Adrien M and Claire B let shapes and patterns emerge in response to the people on stage. Plus, the dancers move to the rhythm of live music, adding one more layer of irreplicable humanity to the show.













