Artificial intelligence in cinema: How much reality is there in fiction?

Flying cars, old people who are not really old, beings from another planet, superheroes, robots, effortless transformations: a universe of impossible things that sneaks into our daily lives through movies and series, immersing us in a parallel world while also raising two fundamental questions:

  • How are these effects achieved?
  • Are they possible in real life?

Intel invited us to investigate these two questions through some of the most iconic films and series of recent times, in light of the different and disruptive technologies that the company has developed and implemented.

AI training

Minority Report*

Year: 2002

(Production companies: 20th Century Fox* - DreamWorks Pictures* - Amblin Entertainment* - Blue Tulip Productions*)

The film shows behavior analogous to that of artificial intelligence (AI) algorithms: predicting the future and issuing a warning so that the unit can take the relevant actions in time.

An everyday example is churn prediction, which makes it possible to anticipate when a customer is about to leave a company, identify them and notify marketing so it can take action. There is, however, a very important difference from the characters in the film: AI algorithms require prior knowledge in order to provide this information, whereas in Minority Report* the three people who give the warning do so because they have a gift.
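
To make the idea concrete, here is a minimal churn-prediction sketch using scikit-learn on synthetic data. The feature names, the synthetic label rule and the 0.5 alert threshold are illustrative assumptions, not details from any real system.

```python
# Minimal churn-prediction sketch with scikit-learn on synthetic data.
# Feature names, label rule and the 0.5 alert threshold are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Synthetic customer features: months as customer, support tickets, monthly spend.
X = np.column_stack([
    rng.integers(1, 60, n),      # tenure_months
    rng.poisson(2, n),           # support_tickets
    rng.normal(50, 15, n),       # monthly_spend
])
# Synthetic label: short-tenure customers with many tickets churn more often.
y = ((X[:, 0] < 12) & (X[:, 1] > 2)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Flag customers whose predicted churn probability exceeds the alert threshold,
# so marketing can be notified to take action.
probs = model.predict_proba(X_test)[:, 1]
at_risk = np.where(probs > 0.5)[0]
print(f"{len(at_risk)} of {len(probs)} test customers flagged as churn risks")
```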

Intel contributes to the training of AI algorithms so that they can identify patterns, through hardware such as Intel Xeon processors and GPUs for HPC (Ponte Vecchio) and software tools for developers (the Math Kernel Library, among others), making that training agile and fast. Thanks to advances in these algorithms, all that is needed is good prior information for them to become increasingly precise at understanding behavior from the data they receive.

Interactive Marketing

Back to the Future Part II*

Year: 1989

(Production companies: Universal Pictures* - Amblin Entertainment*)

This 1989 classic reflects how the future was imagined and how people would interact with their environment and with each other. The scene in which a 3D shark leaps out of an advertisement at the protagonist is a reflection of what is now called interactive marketing.

Today we can see advertising content adapted on shopping websites, for example, although not all of them personalize their offers. In the case of advertising on public roads, the aim is also to incorporate the concept of customer experience, that is, to make the client or prospect perceive that the content is personalized for them. It is well established in the industry that this not only improves customer loyalty but can also lead to many more sales.

To achieve this customization, a combination of various technologies can be used, such as facial recognition.
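
As a minimal sketch of the facial-recognition piece, the snippet below uses OpenCV's bundled Haar cascade only to detect whether a face is in front of a camera; the webcam source and the "switch to targeted content" step are illustrative assumptions, not part of any Intel product described here.

```python
# Minimal face-detection sketch with OpenCV's bundled Haar cascade.
# The webcam source and the personalized-content stub are illustrative assumptions.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)          # camera feeding an interactive sign
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # A viewer is in front of the sign: trigger the personalized ad content.
        print(f"{len(faces)} face(s) detected; switching to targeted content")
    else:
        print("No viewer detected; keeping default content")
else:
    print("No camera frame available")
```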

The evolution of these types of algorithms, together with significant advances in Intel hardware, means they can run at the edge without difficulty. The application is hosted at the edge and also takes other kinds of actions based on the viewer's reactions. Furthermore, the information generated serves as input for multiple systems and drives a much greater demand for storage technologies in which availability and accessibility are determining factors.

To make this possible, Intel offers several alternatives, such as NUC mini PCs, Intel Optane for storage and the SGX technology present in Intel Xeon processors. The latter helps secure applications by enabling a protected area within the processor where certain parts of the code are executed, something that becomes very important in environments where workloads are distributed between the cloud and on-premises systems.

More than 30 years after the premiere of Back to the Future*, we see this type of algorithm partially implemented, but the challenge is often the context rather than the technology itself. Neural networks, which are what allow these features to be recognized easily, have been under development since the eighties, but the algorithms must be trained, and since the number of available images at the time was limited, implementation was utopian and perceived as fiction.

With the emergence of platforms such as Facebook and Google, the digitization of images and the available storage and processing capacity grew exponentially, making it possible today to take advantage of neural networks and the data generated second by second.

Reinforcement learning

The Matrix*

Year: 1999

(Production companies: Warner Bros.* - Village Roadshow Pictures* - Groucho II Film Partnership* - Silver Pictures*)

The film centers on a world in which AI dominates humans inside a parallel reality designed to keep their minds occupied while their bodies lie in a vegetative state, feeding their energy into the powerful network. AI is what gives life to robotics; it is what allows a robot to behave like a human.

Reinforcement learning is an algorithm capable of learning without being trained beforehand; it learns from its own experience through a system of rewards and punishments.
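
The reward-and-punishment loop can be illustrated with a minimal tabular Q-learning sketch on a toy environment; the five-cell world and all hyperparameters below are illustrative assumptions and not related to any Intel tool.

```python
# Minimal tabular Q-learning sketch: an agent learns from rewards and punishments
# on a 5-cell line world (goal at the right end). All values are illustrative.
import numpy as np

n_states, n_actions = 5, 2          # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.2
rng = np.random.default_rng(0)

for episode in range(500):
    state = 0
    while state != n_states - 1:
        # Explore occasionally, otherwise exploit the current value estimates.
        if rng.random() < epsilon:
            action = int(rng.integers(n_actions))
        else:
            action = int(np.argmax(Q[state]))
        next_state = max(0, min(n_states - 1, state + (1 if action == 1 else -1)))
        reward = 1.0 if next_state == n_states - 1 else -0.01   # small punishment per step
        # Update the estimate from experience alone (no prior training data).
        Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
        state = next_state

print("Learned policy (0=left, 1=right):", np.argmax(Q, axis=1))
```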

Intel contributes to the open-source community through RL Coach, whose platform makes this training much more efficient. Intel technology works in the background so that the algorithm can learn and become a reality.

Today AI does not have consciousness, but it can learn from its experience of previous events. Information is needed to build an AI system and, projecting into the future, this artificial intelligence can assist people in different tasks and/or in decision-making, as it already does in autonomous cars. The truth is that the ability to adapt to new circumstances remains, for the moment, exclusively human. To give machines our abilities, it seems, we have to give them our stories.

AI and transformation

The Irishman*

Year: 2019

(Production companies: TriBeCa Productions* - Sikelia Productions* - Winkler Films*)

The film, starring Robert De Niro, shows the actor at different ages, and these changes in his appearance were made with AI: a work of simulation and projection.

AI enables filmmakers to create incredibly detailed and realistic graphics while saving time on creative iterations, two things that together elevate the art of short and feature film creation and enhance the audience experience.

And how was the actor's appearance transformed? To age the actor's face, a technology very similar to Intel RealSense[1] was used. The recording was made with three cameras, two of which were depth cameras used to build a 3D image of the actor's face. De Niro's simulation also involved gathering, over a span of two years, images of him from previous films. These served to train the simulation algorithm and later to project the result.
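
To give a sense of what a depth camera provides, here is a minimal capture sketch assuming a connected Intel RealSense camera and the pyrealsense2 package; the stream resolution, format and the center-pixel readout are illustrative assumptions.

```python
# Minimal depth-capture sketch, assuming a connected Intel RealSense camera
# and the pyrealsense2 package. Stream settings are illustrative.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Distance (in meters) at the center pixel: the kind of per-pixel depth
    # a multi-camera rig could use to build a 3D model of an actor's face.
    print("Center depth:", depth.get_distance(320, 240), "m")
finally:
    pipeline.stop()
```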

CGI, which is widely used in the film world because the images are computer-generated, was the starting point; the advance here was that 3D capture could be done with cameras instead of obtaining a literal 3D scan of the actors.

CGI requires rendering time on processors such as the Intel Core i9 or Core i7 used in gaming or by content creators. Intel's new discrete GPU can also be used; in addition to rendering, it runs the AI algorithm that ages or rejuvenates the characters.

The Meg*

Year: 2018

(Production companies: Warner Bros. Pictures* - Gravity Pictures* - Di Bonaventura Pictures*)

This superproduction relied on Intel technology: the shark was generated entirely with AI, running on 10,000 cores across 2,500 Intel Xeon processors.

Recreating a 75-foot-long prehistoric shark in the water for the big screen is no easy task. In addition to bringing the Megalodon to life, Scanline and Ziva also had to ensure that its movements, through the open ocean or along the sea floor, were realistic. The moving Megalodon was created by processing a series of physical simulations and driving the simulated shark through the movements and poses required by the film's shots.

One of the great benefits of using Intel Xeon Scalable processors was the generation of excellent training data. When training a machine learning process, it is necessary to know how something behaves in order to anticipate or extrapolate how it is expected to behave, in this case the movement of the shark itself. Intel Xeon technology helped the filmmakers do this quickly, efficiently and as realistically as possible.
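
As a toy sketch of that general idea (simulation output used as training data so a model can extrapolate motion), the snippet below generates simple projectile trajectories and fits a regressor that predicts the next position from the current state. It only illustrates the concept; it is not Ziva's actual engine, and every number in it is an assumption.

```python
# Toy sketch of "training data from simulation": generate simple projectile
# trajectories, then fit a model that predicts the next position from the
# current state. Purely illustrative; not Ziva's physics engine.
import numpy as np
from sklearn.linear_model import LinearRegression

dt, g = 0.05, -9.8
rng = np.random.default_rng(0)

states, next_positions = [], []
for _ in range(200):                       # 200 simulated trajectories
    pos, vel = 0.0, rng.uniform(5, 15)     # random launch velocity
    for _ in range(20):
        new_pos = pos + vel * dt
        states.append([pos, vel])
        next_positions.append(new_pos)
        pos, vel = new_pos, vel + g * dt   # simple physics integration step

model = LinearRegression().fit(np.array(states), np.array(next_positions))
# The fitted model can now extrapolate motion for states it has not seen.
print("Predicted next position from pos=3.0, vel=8.0:",
      model.predict([[3.0, 8.0]])[0])
```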

Intel Xeon processors powered Ziva's character generation software and helped speed up the Ziva physics engine, an artificial intelligence algorithm that automates the movement of generated creatures, including the Megalodon from The Meg*. Additionally, Scanline used powerful Intel Xeon processors to render film shots, freeing up time to create more and better shots.

Natural language processing (NLP)

Black Mirror*

Year: 2017-2019

Episode: “Rachel, Jack and Ashley Too”

(Production company: Netflix*)

This episode of Black Mirror addresses natural language processing (NLP), the field behind GPT, BERT and other Transformer models, which allows an algorithm to emulate the way a human being writes. The plot reveals that the Ashley robot is able to respond as if it were the character once the block it carried is released.

Although this behavior resembles that of a bot, today it is possible to emulate a person's style, but not their reasoning. A bot generally extracts from a phrase what is called the intent, that is, what the phrase is about, and that is where these algorithms come in handy. Once the intent is identified, the next steps are programmed as if they were a decision tree.
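
The intent-then-decision-tree pattern described above can be sketched in a few lines; the keyword rules and canned replies below are illustrative assumptions, standing in for whatever real NLP model would extract the intent.

```python
# Minimal sketch of the bot pattern described above: extract an "intent" from
# a phrase, then follow a hand-written decision tree of responses.
# Keyword rules and replies are illustrative assumptions.
INTENT_KEYWORDS = {
    "billing": ["invoice", "charge", "payment"],
    "support": ["broken", "error", "not working"],
    "greeting": ["hello", "hi", "hey"],
}

DECISION_TREE = {
    "billing": "Routing you to the billing team.",
    "support": "Let's try restarting the device first.",
    "greeting": "Hello! How can I help you today?",
    "unknown": "Sorry, I didn't understand. Could you rephrase?",
}

def detect_intent(phrase: str) -> str:
    words = phrase.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in words for keyword in keywords):
            return intent
    return "unknown"

def reply(phrase: str) -> str:
    return DECISION_TREE[detect_intent(phrase)]

print(reply("Hi there"))                          # greeting branch
print(reply("My invoice shows a double charge"))  # billing branch
```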

Until 2018 it was very difficult to build an accurate natural language system (chatbots or real-time translators). With the rise of Transformers (BERT) it began to be possible, since these algorithms understand language in a revolutionary way. It is in this field that the next few years will bring great advances, such as sentiment analysis in customer service, spelling and grammar correction and real-time translation, among other use cases that, thanks to this progress, can now be implemented very easily.
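
For the sentiment-analysis use case mentioned above, a minimal sketch might use the Hugging Face `transformers` library (an assumed tool, not named in the article); its default pipeline downloads a pretrained Transformer model on first run.

```python
# Minimal sentiment-analysis sketch with a pretrained Transformer via the
# Hugging Face `transformers` pipeline (an assumed tool for illustration).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
results = classifier([
    "The support agent solved my problem in minutes, great service!",
    "I have been waiting two weeks for a reply.",
])
for result in results:
    print(result["label"], round(result["score"], 3))
```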

Intel helps with the implementation of these algorithms through tools such as OpenVINO, which can optimize a model so that it requires the least amount of processing at execution time. In addition, recent hardware advances change how these instructions are executed: DL Boost and AVX-512 are instruction set extensions that let neural network operations be carried out in fewer steps, a key factor when deploying these models.
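
As a minimal sketch of running an optimized model with OpenVINO's Python runtime (API as in the 2022+ releases; the model path and dummy input shape are illustrative assumptions):

```python
# Minimal OpenVINO inference sketch (2022+ Python runtime API; the model path
# and dummy input shape are illustrative assumptions).
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")            # an IR file produced by the optimizer
compiled = core.compile_model(model, "CPU")     # runs on Intel CPUs

dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled([dummy_input])                # results keyed by output nodes
print(list(result.values())[0].shape)
```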
