What happens when an IBM supercomputer with artificial intelligence (AI) analyzes a hundred feature film trailers and then independently edits a commercial trailer from newly shot, completely different footage for a Hollywood movie?
The idea is genius just from a marketing standpoint:
Hollywood studio 20th Century Fox, part of Disney since spring 2019, teamed up with director Luke Scott, son of cult director Ridley Scott (Alien, Blade Runner, Gladiator, Thelma & Louise), to produce a science-fiction film in which a young woman turns out to be an artificially created human with synthetic DNA. The theatrical trailer for this feature film was, fittingly, created with artificial intelligence.
What you need to know
- Artificial intelligence already plays an important role in many professional video and film editing programs, for example in tracking, color grading, and many other image- and sound-processing functions.
- Self-learning programs such as Watson are becoming increasingly proficient at recognizing image content and narrative structures, which is fundamental to editing footage independently.
- The autonomous editing of short video clips with AI is already being used successfully today in the TV broadcasting of sporting events.
Using artificial intelligence (AI) to automatically edit videos?
Ask an editor whether videos and films might soon be edited autonomously with the help of artificial intelligence, and he’ll flip you the bird.
This reaction is only human. Federico Fellini, the Italian star director of the post-war period and one of the most important auteur filmmakers of all time, when asked why he didn’t work to train the next generation of filmmakers in his home country, replied, “Am I crazy? They’d take all our jobs!”
But ask the same editor whether he cuts his videos with Adobe Premiere, and chances are you’ll get a positive answer. Which amounts to an endorsement of artificial intelligence.
Even today, software packages such as Animoto or the globally used Adobe Creative Suite (Premiere, After Effects, etc.) work with artificial intelligence. It not only helps with error analysis, but also analyzes user behavior online to automatically improve future program versions, and it already powers dozens of functions in the background.
More than just film
So there is already a certain degree of automation in post-production. But not only there.
Film cameras have long been using AI to optimize their autofocus and exposure control. Similarly, algorithms are already training themselves to analyze and even write stories.
The ambition to edit films with a neural, self-learning computer program is far more than a marketing stunt. There are tangible economic reasons for wanting to automate editing: every film shoot produces a huge amount of material that has to be sifted and cut. For a documentary, the amount of material to be processed is even larger than for a feature film.
In news and in the shrinking TV business, by contrast, there is neither the time nor the money for careful selection. In short: self-learning software for automated film editing stands to earn billions.
IBM, the major American information technology company (350,000 employees, $80 billion in annual revenue), has been at the forefront of advancing artificial intelligence since 2011 with a program that became known as Watson. And Watson can also cut video.
Watson is characterized by an extreme ability to learn, and that ability is not limited to moving images.
- At CeBIT 2017, an autonomous bus was presented to the public. At the controls: Watson.
- As of the end of June 2018, physicians in more than 230 hospitals worldwide relied, in their search for therapies for cancer patients, on assistance from: Watson.
How does artificial intelligence work?
Machine learning, at its core, is about recognizing patterns: learning from experience. The necessary training of the software can be done through human intervention, or the system itself derives rules on the basis of which it constantly evolves. In so-called deep learning, the weighting of what the system has learned shifts with each new piece of information it receives, and with it the underlying insights, much as a baby gets to know the world anew.
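The “shifting weights” idea above can be made concrete with a minimal sketch: a single artificial neuron whose weights are nudged by every example it sees. This is the basic mechanism that deep learning stacks into many layers; all names and data here are illustrative and have nothing to do with Watson’s actual internals.

```python
# A single artificial neuron "learning from experience": each training
# example nudges the weights, gradually shifting what the system "knows".
# (Illustrative toy example, not IBM Watson's implementation.)

def train_neuron(samples, labels, epochs=20, lr=0.1):
    """Train one neuron on a two-feature classification task."""
    w = [0.0, 0.0]   # weights: the system's current "insights"
    b = 0.0          # bias
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1.0 if w[0] * x1 + w[1] * x2 + b > 0 else 0.0
            err = y - pred              # experience: how wrong were we?
            w[0] += lr * err * x1       # each example shifts the weighting
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Toy task: is a point above the diagonal (1) or below it (0)?
samples = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.1), (0.8, 0.3)]
labels = [1, 1, 0, 0]
w, b = train_neuron(samples, labels)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

After a few passes over the data, the weights have settled into a rule the programmer never wrote down explicitly: the neuron inferred it from examples alone.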
Watson is now accessible to people like you and me via the IBM Cloud. With Watson, you can analyze or visualize data or create intelligent chat bots or digital assistants. In some cases even free of charge.
For the theatrical trailer of “Morgan,” Watson first analyzed 100 trailers of comparable films. In a first pass, the artificial intelligence then reduced the finished 90-minute film to six minutes.
This process took 24 hours. The program subjected the film to a visual and auditory analysis, independently worked out a scheme of the film’s structure, and in the end compared the results of its own work with what it had learned from films of the same genre.
The trailer for “Morgan,” edited with the help of AI:
Watson didn’t go all the way, though. In the end there was no finished trailer (yet), but ten key sequences identified by the computer, from which the cinema trailer was then assembled by hand.
Nevertheless, it is only a matter of time before the role of artificial intelligence is no longer limited to identifying, on the basis of Big Data, the shots that drive target audiences into theaters.
But even top tennis players like Roger Federer are already dealing with artificial intelligence. IBM is also behind this.
Wimbledon, the oldest and arguably most prestigious tournament in the world of tennis, relied on Watson and Artificial Intelligence for its TV broadcast in 2018.
In concrete terms, the aim was to show viewers the decisive moments of the matches, or the emotions of the players, as replays within a matter of seconds.
Watson learned to recognize the decisive moments on the basis of existing recordings and to automatically cut them into clips. This explicitly includes the emotions of the players.
The AI does this by measuring match data and correctly recognizing factors such as gestures, sounds, the course of play and even spectator behavior (seen in pans across the audience) in live TV coverage.
AI: For all platforms
In addition to the aforementioned replay clips, the AI also created audiovisual material for the organizer’s websites, apps and social media channels.
That sounds trivial at first. It’s not at all. With an average of three matches per court per day, recorded by a multitude of cameras, hundreds of hours of audiovisual footage are generated over the two weeks of the Grand Slam tournament. No human could sift and cut all of it within a reasonable period of time.
Whether feature film, documentary, Hollywood or Munich, TV broadcast or social media – artificial intelligence is at our doorstep when it comes to editing videos and films. We just haven’t noticed it yet.
This article was automatically translated into English using AI. If you would like to help us improve the quality, we would be happy to hear from you.