Special: The future of Hollywood also passes through game engines

We have all seen the wonders that await us with Unreal Engine 5: instant changes to global illumination, mass import of assets with very high polygon counts, loading times that are practically non-existent (SSD permitting). We were completely dazzled by the tech demo Epic Games showed, but it didn't thrill only us gamers. For some time now, Hollywood too has been watching the evolution of game engines with interest. Many directors already use realtime technologies to produce their blockbusters, and what we saw in Lumen in the Land of Nanite opens the door to an even deeper connection between cinema and video games.



This is what Epic Games CTO Kim Libreri explained in an interview with GameSpot, together with the better-known CEO Tim Sweeney. For Libreri, the ability to import assets of a quality on par with film will allow the creation of a continuum between the two worlds, a new form of entertainment. But before we delve into this bold claim, let's look at how game engines are already being used to produce movies in Hollywood.

Uncle Steven already knew a lot

One of the early pioneers in the use of realtime technology for film production was, needless to say, Steven Spielberg. During the production of the visionary A.I. Artificial Intelligence, 20 years ago, the American director had to shoot a scene inside Rouge City, a city so large it would have been impossible to rebuild on a physical set. Industrial Light & Magic, the company handling the special effects at the time, built a camera-tracking system so the shot could be viewed in real time even inside the city reconstructed in 3D. Spielberg was thus able to guide the camera correctly through Rouge City, despite standing in front of an anonymous blue screen.



Offline rendering
Unlike the realtime rendering of video games, offline rendering creates a separate file for each frame of a shot, taking however long it needs (often several hours) to produce an image that is as realistic as possible. The frames produced this way (usually on expensive render farms) are later assembled in sequence to recreate the entire shot.
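As a purely illustrative sketch (the scene object and its render() call are hypothetical stand-ins, not any studio's actual software), the offline approach boils down to a loop like this:

    # Illustrative sketch of an offline rendering pipeline; scene and its
    # render() method are hypothetical, not a real studio tool.
    import subprocess

    def render_shot(scene, first_frame, last_frame):
        # Each frame is computed independently and may take hours on its own.
        for frame in range(first_frame, last_frame + 1):
            image = scene.render(frame)
            image.save(f"shot_{frame:04d}.exr")   # one file per frame
        # The numbered files are then assembled in sequence into a video,
        # here with ffmpeg at 24 frames per second.
        subprocess.run(["ffmpeg", "-framerate", "24",
                        "-i", "shot_%04d.exr", "shot_preview.mp4"])

The point is simply that the per-frame cost is unbounded: realism comes first, and the playback speed is recovered only afterwards, when the frames are strung together.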

What Spielberg used was one of the first forms of previs (previsualization), now a key part of a film's development process. It allows the director to view a CGI-heavy scene the instant it is shot, without having to wait hours for an offline render. Obviously, at the current state of the art, the quality of the latter remains higher than that of a realtime image: the final frames are still largely produced in huge render farms (clusters of high-performance computers designed specifically for visual effects). But being able to get a general idea of the scene right away, with the footage and the VFX combined, gives the director many advantages.

First of all, time. As already mentioned, the director no longer has to wait hours of rendering to get an idea of what the final scene will look like. They can correct the camera, the lighting and any other aspect immediately, leaving it to the technicians to replicate those choices in the offline render. Freed from the yoke of waiting, the director can focus more on the story and on directing the actors, the true centerpiece of a theatrical experience. Finally, by not having to tap repeatedly into the computational power of render farms, overall production costs drop considerably.



Unreal Engine not only on PS5

Over the years, however, Hollywood's use of game engines did not stop at previsualization. Thanks to a thriving video game market, game engines have undergone an unprecedented evolution, both in quality and in technology. Motion capture is a good example: thanks to realtime technology, the director can now watch the performance of the actors wearing the suits directly in their final setting. All, of course, in real time.


But the new technologies certainly do not stop there. Although it never exploded as much as hoped, Virtual Reality has over time found practical uses in some very specific fields, and one of these is precisely cinema. Thanks to VR headsets, Jon Favreau could literally step inside the settings of The Jungle Book and The Lion King and modify, in real time, any aspect he wanted: the camera, the lighting, even the 3D environment itself. Virtual Reality allowed the director and his collaborators to be present on the set again, something that was no longer possible for productions made entirely in computer graphics.

Thanks to VR headsets, Jon Favreau was able to literally step inside the settings of The Jungle Book and The Lion King


Another ingenious application of video game technologies in film was the complete replacement of the green screen with more discreet 4K laser projectors. To shoot the hyperspace-jump scene in Solo: A Star Wars Story, ILM surrounded the reconstruction of the Falcon's cockpit with huge projection screens onto which the iconic animation was projected. This brought two advantages. The first concerns the actors' performances: their reactions were far more genuine and pronounced, making the scene much more exciting for the viewer. The second, once again, concerns cost. Five 4K projectors may seem expensive, but render farms are even more so. No green screen, no render farm.


To learn more:
Cinema and Videogames: a comparative history.

Finally we come to what I personally consider one of the greatest achievements of game engines. In Rogue One: A Star Wars Story, the K-2SO droid was rendered entirely with a realtime engine. After everything I have listed, this might sound like a minor feat, but it is not. The strength of offline rendering has always been having a virtually unlimited amount of time to process each single frame: the algorithms that compute the final image are far heavier than those used in our beloved video games, which means waiting hours for a single frame. Game engines, on the other hand, produce 30 to 60 frames every second. You can well understand that the quality cannot be the same. Yet the technological evolution of the latter led, in this case, to preferring them over an offline engine.
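A quick back-of-the-envelope comparison, with purely illustrative figures rather than numbers from the production, gives a sense of the gap:

    realtime budget at 30 fps:          1 s / 30 ≈ 33 ms per frame
    offline render at 2 hours/frame:    7,200 s per frame
    ratio:                              7,200 s / 0.033 s ≈ 220,000× more time per frame

Closing a gap of that order is what makes a realtime hero asset like K-2SO remarkable.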

In Rogue One: A Star Wars Story, the K-2SO droid was fully rendered with a realtime engine

The new frontier of entertainment

To learn more:
The revolution of Unreal Engine 5 on PS5: let's explain what it is

Game engines are, in fact, already part of the Hollywood production pipeline. They are now an integral part of the system, and movies with a strong CG component can no longer do without them. But, despite their extreme flexibility, the beating heart is still the offline render. Until today. Yes, because what we saw in the Unreal Engine 5 tech demo opens the door to a future as longed for as it once seemed unthinkable: the complete replacement of the offline render. The ability to import assets of film-level quality, combined with the much-vaunted raytracing, allows game engines to aspire to the top step of the rendering podium.

And if that happens, let's get ready for a new frontier of entertainment. An entertainment in which the assets used for a Star Wars movie will also appear in a video game set in the same saga. An entertainment that will see game designers venture into movies and directors into video games. Increasingly emotional, immersive and interactive. And if you are afraid that this hybridization could erode the supporting pillars of the respective media, don't be.

Because cinema was born from the union between photography and theater.
