Cinematography: Real-Time Animations
May 23, 2021

Our in-depth series on cinematography in the animation industry today takes us into the (very current) future of real-time animation techniques.
In the animated production industry (and not only there), one of the most interesting topics, as current as it is deeply projected towards the future, is real-time animation. Software originally designed for the video game industry is increasingly used in film and television production, particularly in animation and in projects that make heavy use of special effects.

Real-time animation allows authors to work directly on scenes and on all the moving elements, cameras, lights, etc., without having to wait for the long rendering times of a classic 3DCG production pipeline [you can read the article on this topic here].

Thanks also to advances in computer hardware (CPUs, GPUs, etc.), this software now reaches very high levels of realism and is used by major serial productions in which special effects are an essential component (we will look at an example of this in the final part of the article).

Let's start with the programs themselves: the two main rivals are Unreal Engine and Unity, joined by the open-source 3D software Blender.

The first two were born in the video game world, where authors and makers focus only on the elements in the scene (modeling) and on the dynamics of the game, leaving the visual representation of the whole set to the engine itself, which draws every frame on the fly.
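
To make the idea concrete, here is a minimal, purely illustrative sketch in Python of the frame loop at the heart of any real-time engine. The update_scene and draw functions are hypothetical placeholders, not the API of Unreal Engine, Unity or Blender:

    import time

    TARGET_FPS = 30
    FRAME_BUDGET = 1.0 / TARGET_FPS  # about 33 ms per frame

    def update_scene(t):
        # hypothetical placeholder: advance characters, cameras and lights to time t
        return {"camera_pan_degrees": t * 10.0}

    def draw(scene_state):
        # hypothetical placeholder: the engine rasterizes the frame immediately,
        # instead of queueing it for minutes-long offline rendering
        pass

    t = 0.0
    for _ in range(TARGET_FPS * 3):  # roughly three seconds of playback
        start = time.perf_counter()
        draw(update_scene(t))
        # sleep off whatever remains of the frame budget to hold a steady rate
        time.sleep(max(0.0, FRAME_BUDGET - (time.perf_counter() - start)))
        t += FRAME_BUDGET

The essential point is the frame budget: whatever the scene contains, the engine must produce each image within a fixed fraction of a second, which is exactly why authors see their changes immediately.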

Precisely in this direction, and given the ease of use of these programs, their character animation and camera tools have been enhanced so that they can work just like a film studio, with some very important extras such as advanced motion capture and more.

Here are presentations of the real-time animation capabilities of Unreal Engine, Unity and Blender:

Other software focuses on particular aspects of production and then interfaces with the engines mentioned above; iClone, for example, handles facial animation with disarming ease:

But the most important implementations are certainly the special effects calculated in real time, with the camera tracked in (virtual) space and the result shown on large high-resolution screens used directly on the stage, for example in place of the green screen onto which visual effects are added in post-production.
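
As a rough illustration of the principle, and assuming nothing about any specific engine's API, the sketch below (Python with NumPy, all names hypothetical) shows how the tracked pose of the physical camera can be turned into the view matrix used to re-render the virtual set every frame, so that the parallax on the LED wall follows the real camera's movement:

    import numpy as np

    def view_matrix_from_tracked_pose(position, rotation):
        """World-to-camera matrix from the tracked pose of the physical
        camera (hypothetical tracker data: position vector, 3x3 rotation)."""
        view = np.eye(4)
        view[:3, :3] = rotation.T              # invert the rotation...
        view[:3, 3] = -rotation.T @ position   # ...and the translation
        return view

    # hypothetical tracked pose: camera at eye height, 2 m back from the wall
    cam_pos = np.array([0.0, 1.7, -2.0])
    cam_rot = np.eye(3)  # looking straight ahead

    view = view_matrix_from_tracked_pose(cam_pos, cam_rot)

    # a virtual prop 4 m inside the digital set (homogeneous coordinates)
    prop = np.array([0.5, 1.0, 4.0, 1.0])

    # each frame the engine re-renders the set with a fresh view matrix,
    # so the prop's position on the wall shifts exactly as real parallax would
    print(view @ prop)

Move cam_pos a few centimetres and the projected position of the prop changes accordingly; that per-frame re-rendering is what makes the effect "in-camera" rather than a post-production composite.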

This technology was used for Disney's series The Mandalorian, saving an enormous amount of time in production design and post-production.

Here's how this technology works:

And here is how the brilliant studio Industrial Light & Magic implemented this technology for the live-action series by Jon Favreau and Dave Filoni, all on a single large set. In practice, for the first time, directors, writers and actors did not have to "imagine" the effects that would be added in post-production; they could see them in real time on the circular screens behind them, and move from one set to another in a matter of hours rather than weeks:

The subject is as interesting as it is complex, and far from settled, so in the coming weeks we will publish a further article focused on these techniques applied to an animated production.


[ Fra ]