Movie studios have traditionally screened new films to target audiences to gauge reactions before a film's release. In a previous article, I looked at how the Microsoft Azure platform can be used to identify emotional reactions to trailers and how Disney was using AI to evaluate audience reaction.
In this article, I set out to identify facial emotion within official film trailers.
As an alternative to the Microsoft Azure platform, real-time face detection and emotion classification were performed using the ‘fer2013’ dataset with a Keras CNN model and the OpenCV library. The model identified one of seven emotions (angry, disgust, fear, happy, sad, surprise, neutral) in a given trailer, each with an associated probability.
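As a rough illustration of the labelling step, the sketch below assumes the CNN outputs a 7-way softmax over the fer2013 classes and applies the 70% recording threshold described in the notes at the end of this article. The function and variable names are my own, not from the original code:

```python
# fer2013 class order is an assumption for this sketch.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def label_frame(probabilities, threshold=0.70):
    """Return the emotion label for one detected face, or None if the
    top probability falls below the recording threshold (70% here,
    matching the cut-off used to support data cleaning)."""
    top = max(range(len(probabilities)), key=probabilities.__getitem__)
    if probabilities[top] < threshold:
        return None  # too uncertain; discard this detection
    return EMOTIONS[top]
```

For example, a softmax output dominated by the fourth class would be labelled ‘happy’, while a flat distribution would be dropped rather than recorded.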
I introduced a simple timer within Python that recorded each emotional response for analysis, and then used graphs to illustrate trends within the data.
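A minimal sketch of that timer idea, assuming each accepted detection is stored with the elapsed time since the trailer started (class and method names are hypothetical):

```python
import time

class EmotionTimeline:
    """Records (seconds_elapsed, emotion) pairs so detections can be
    plotted on a linear timeline and aggregated later."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._start = clock()  # trailer playback start
        self.events = []       # list of (seconds_elapsed, emotion) tuples

    def record(self, emotion):
        elapsed = round(self._clock() - self._start, 2)
        self.events.append((elapsed, emotion))
```

Injecting the clock makes the recorder easy to test and keeps it independent of how frames are actually read from the video.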
I thought it best to stay away from The Batman, as the cowl would hide Bruce Wayne’s expressions, and from animated characters, whose faces may not register as human for the purposes of emotion detection. So, keeping with the superhero theme, I went for Shazam! Trailer 2.
The synopsis for Shazam! reads as:
Billy Batson is a streetwise 14-year-old who can magically transform into the adult superhero Shazam simply by shouting out one word. His newfound powers soon get put to the test when he squares off against the evil Dr. Thaddeus Sivana.
Let’s take a look at the trailer:
The images below show a sample of the emotions that were detected during the trailer. Although some of the emotions picked up were discarded (after data cleaning), it is clear that this automated process does a reasonable job of matching facial expressions to what we recognise as anger, disgust, fear, happiness, sadness, surprise, and neutrality.
The emotions detected within the trailer, following a linear timeline, clearly stereotype the protagonist (Shazam) and sidekick (Freddy Freeman) as happy characters, after initial uncertainty and fear about the newfound powers. Likewise, the main antagonist (Dr. Thaddeus Sivana) shows clear anger throughout his appearances. As the audience watching this trailer, which is over two minutes in length, you are shown – along with other emotive cues – the basic emotions of each character.
The percentages of the emotions detected within the trailer are:
The outcome shows that over 56% of the emotion detected within the trailer represents a happy emotion. There were also scenes showing anger, fear and sadness, together representing over 15% of the trailer. Although I have not tested this, I expect this to contrast with previous DCEU films – such as Batman v Superman.
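The aggregation behind figures like these is straightforward: count each recorded label and convert to a share of all detections. A small sketch, with illustrative numbers rather than the trailer's actual data:

```python
from collections import Counter

def emotion_percentages(labels):
    """Convert a list of recorded emotion labels into percentages
    of all detections, rounded to one decimal place."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {emotion: round(100 * n / total, 1)
            for emotion, n in counts.items()}
```

Feeding in the cleaned timeline labels would reproduce the kind of breakdown quoted above, ready for a bar or pie chart.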
Rotten Tomatoes has recently given the Shazam! film the following critic rating:
“An effortlessly entertaining blend of humour and heart, Shazam! is a superhero movie that never forgets the genre’s real power: joyous wish fulfilment.”
It is clear that Warner Bros / DC Films wanted to present a ‘happy’ trailer to audiences. Whether this theme continues within the feature film is something for another article.
In future articles, I will continue to identify, compare and contrast elements within film trailers. This will include the number of scenes within a given trailer and how they differ from genre to genre. I’d also be interested in completing a wider study that identifies the last trailer audiences remember and what it contained.
- Model created based on research paper: https://github.com/oarriaga/face_classification/blob/master/report.pdf
- Emotion detection accuracy, based on the above paper, is 65%. An emotion was only recorded when its probability exceeded 70%; this was to support the ‘cleaning’ of the data.
- Use of glasses (or any covering element such as hair, a scarf, a hat etc.) may affect emotion detection. For example, the label ‘angry’ is activated when the model believes a person is frowning, which could be confused with a person wearing dark-framed glasses.
- Emotion is more than just a face – what about object detection within the trailer? Of course, emotion can be internalised. However, sound, colour palette, and the objects shown also have an impact on how audiences feel and respond.
Discussion – The Future of Automating The Trailer Process:
In 2016, 20th Century Fox called in IBM Watson, the supercomputer, to create a trailer for its AI horror/thriller movie, Morgan. AI inception.
IBM Watson had previously performed many tasks, including winning quiz-show contests against human contestants, creating bespoke recipes and describing the contents of photos. In this case, 20th Century Fox used IBM Watson to produce a film trailer for their new sci-fi film.
IBM researchers fed Watson more than 100 different film trailers that were cut into separate moments and scenes. After completing an analysis of the visuals, audio and composition, it processed 90 minutes of Morgan to find the right moments to use within the trailer.
Once the computer had processed the data, it returned 10 scenes to use for the trailer. Human editing was still required to stitch the scenes together, but the process was reduced from 10–30 days to just 24 hours.
What are your thoughts? Do you think that this process can be automated?
This post was written by noxford