Emotion Detection Being Used To Inform Marketing

May 3, 2018

Disney have previously demonstrated the ability to monitor audience reactions to a film. Could emotion detection be used to inform marketing?

Reaction videos arguably started when the BBC aired “I Love the 1970s”, a series that looked back on fond memories from years past. Celebrities were sat in front of a green screen and asked to react to the footage. Almost 20 years later, the format has been rebranded as the internet’s very own reaction video.

We’ve also had a number of fan-made, fake reaction videos – let’s look at one of those –

In 2017, Disney and Caltech released a study that reliably tracked audience facial expressions over time. The study used a technique called a “factorized variational autoencoder”. The team collected a large data set from audiences watching Disney films, with an infrared high-definition camera capturing audience motion and faces. From this, the 16+ million data points were imported into a neural network. The trained system could then monitor faces in real time and use that knowledge to predict an expression at a given moment.

Very interesting.
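For the curious, the core idea behind an autoencoder of this kind is to squeeze each face observation into a small latent code and reconstruct it, so the latent space ends up capturing expression over time. Below is a minimal sketch of a plain variational autoencoder in PyTorch – not the factorized variant the Disney/Caltech paper describes, and all dimensions and data are made up for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FaceVAE(nn.Module):
    """Plain VAE over flattened facial-landmark vectors (illustrative only)."""
    def __init__(self, n_features=136, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU())
        self.to_mu = nn.Linear(64, latent_dim)
        self.to_logvar = nn.Linear(64, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_features)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterisation trick: sample a latent code differentiably.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence from a standard normal prior.
    recon_err = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_err + kl

# Toy training loop on random stand-in data (32 "faces" of 68 landmark pairs).
model = FaceVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 136)
for _ in range(100):
    recon, mu, logvar = model(x)
    loss = vae_loss(recon, x, mu, logvar)
    opt.zero_grad()
    loss.backward()
    opt.step()
```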

It got me thinking – could I replicate something similar to this using Microsoft Azure and its Video Indexer API? Of course. The Azure website let me upload a video of my facial expressions and then perform an analysis on it. I wanted to use just one subject (me) to put it to the test, partly so that I could compare the results against the emotion I was actually feeling at the time. This type of analysis could be limited by a subject remaining facially emotionless (poker face) while feeling an emotion within – the results would be a little meaningless in that case – so comparing how I felt internally with the external emotional reaction the Video Indexer detected seemed a sensible check.
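I used the website for this test, but the same analysis is exposed through the Video Indexer REST API. Here is a rough Python sketch of the upload-then-poll flow; the account ID, region and key are placeholders, and the exact endpoint paths and parameters are assumptions based on the v2 API as documented at the time, so check the current docs before relying on them.

```python
import time
import requests

API = "https://api.videoindexer.ai"
LOCATION = "trial"                  # placeholder region
ACCOUNT_ID = "YOUR_ACCOUNT_ID"      # placeholder
SUBSCRIPTION_KEY = "YOUR_API_KEY"   # placeholder

# 1. Exchange the subscription key for a short-lived access token.
token = requests.get(
    f"{API}/auth/{LOCATION}/Accounts/{ACCOUNT_ID}/AccessToken",
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
).json()

# 2. Upload the reaction video for indexing.
with open("my_reaction.mp4", "rb") as f:
    video = requests.post(
        f"{API}/{LOCATION}/Accounts/{ACCOUNT_ID}/Videos",
        params={"accessToken": token, "name": "ant-man-2-reaction"},
        files={"file": f},
    ).json()
video_id = video["id"]

# 3. Poll until processing finishes, then fetch the insights JSON.
while True:
    index = requests.get(
        f"{API}/{LOCATION}/Accounts/{ACCOUNT_ID}/Videos/{video_id}/Index",
        params={"accessToken": token},
    ).json()
    if index.get("state") == "Processed":
        break
    time.sleep(30)

print(index["videos"][0]["insights"].get("emotions", []))
```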

To begin, I set up some basic rules:

  1. It would be important to select a trailer that I had not previously seen. I definitely, therefore, wouldn’t be picking Justice League or any other DC property (what a shame Aquaman isn’t available yet).
  2. Select a trailer for a film that I was looking forward to viewing; this would help ensure that some type of external emotion would show – and not just my ‘resting bitch face’.
  3. Must be an official trailer and not fan-made.

So, where to start… I needed a trailer. With ‘Avengers: Infinity War’ burning up box office records, Marvel is not stopping, and is continuing its marketing campaign for Ant-Man 2. So I went for Ant-Man 2 (trailer released 2nd May 2018); this felt relevant and would continue the superhero genre research that I have tended to lean towards.

Having really enjoyed the first Ant-Man film, I was looking forward to viewing this trailer. I hadn’t read anything about this film, other than that it would include the character “The Wasp”.

So, there was only one thing to do … watch the trailer and record my reaction.

The results from the Video Indexer were as follows:

As you can see, my reaction to the trailer was largely positive. The negative emotion that I displayed during the middle of the trailer may have been my resting b***h face, or a look of confusion and/or interest.
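As a rough illustration of how such output can be read: the insights JSON reports each detected emotion with timestamped instances, which can be totalled into an overall positive/negative picture. Treat the exact JSON shape below as an assumption rather than a guaranteed schema.

```python
from datetime import datetime

def seconds(ts):
    """Parse an 'H:MM:SS.ffff' style timestamp into whole seconds."""
    t = datetime.strptime(ts.split(".")[0], "%H:%M:%S")
    return t.hour * 3600 + t.minute * 60 + t.second

# Assumed shape of the "emotions" section of the insights JSON.
emotions = [
    {"type": "Joy", "instances": [{"start": "0:00:05.2", "end": "0:00:41.0"}]},
    {"type": "Sad", "instances": [{"start": "0:00:55.0", "end": "0:01:02.5"}]},
]

# Total screen time per emotion across all instances.
totals = {}
for emotion in emotions:
    dur = sum(seconds(i["end"]) - seconds(i["start"]) for i in emotion["instances"])
    totals[emotion["type"]] = totals.get(emotion["type"], 0) + dur

print(totals)  # e.g. {'Joy': 36, 'Sad': 7} -> largely positive
```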

What would be very interesting would be to do this on a larger scale: recording facial reactions across a much larger audience. Although the data file would be very large, it would be interesting to compare data over a feature-length film rather than a teaser or feature trailer – similar to the Disney/Caltech study.

So, is there any scope to take this technology into distribution and marketing?

Well, yes –

In a previous post, I spoke about how Netflix uses a detailed data network to change the images used within its application, ensuring advertising is targeted to specific audience preferences. What if facial emotion could be analysed live, through Microsoft Kinect for example, with the UI adapting to audience emotion? After all, I may prefer a certain genre of film, series or documentary, but my preference may change depending on the day, time or circumstances. Further, what if my Netflix account was being used but it wasn’t me watching? Or if my family was watching?
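As a thought experiment only: if a sensor such as Kinect could stream a live emotion label, the artwork shown on the home screen could be swapped to suit the viewer’s current mood. Everything below is hypothetical – the emotion labels and the artwork mapping are invented for illustration.

```python
# Hypothetical mapping from a live-detected emotion to artwork variants.
ARTWORK_BY_MOOD = {
    "joy": "bright_comedy_poster.jpg",
    "sadness": "uplifting_drama_poster.jpg",
    "neutral": "default_poster.jpg",
    "anger": "calming_documentary_poster.jpg",
}

def pick_artwork(detected_emotion: str, viewers: int = 1) -> str:
    """Choose a poster for the current viewer(s); fall back to the default
    when several faces are detected (e.g. the whole family is watching)."""
    if viewers > 1:
        return ARTWORK_BY_MOOD["neutral"]
    return ARTWORK_BY_MOOD.get(detected_emotion, ARTWORK_BY_MOOD["neutral"])

print(pick_artwork("sadness"))          # uplifting_drama_poster.jpg
print(pick_artwork("joy", viewers=3))   # default_poster.jpg
```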

One drawback of using this tool with trailers is that the purpose of a trailer is to tease the viewer, moving through a number of scenes in quick succession with the aim of encouraging the viewer to watch the film.
The emotion shown may therefore be shock or excitement, or the viewer may not show external emotion at all – when was the last time you felt angry or sad while watching a trailer?

With data privacy being such a hot topic at the moment, I’m unsure how this tool would be perceived by audiences. The technique may be best used as part of a pilot or test screening with audiences.
The data could be used to gain an insight into the external emotion shown by various types of audience: gender, age and so on. Triangulated with other data sets, including audience profiles, this could give film makers significant insight into how audiences react while, like the Disney study, allowing predictions for future films.
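To show what that triangulation might look like in practice, here is a small pandas sketch joining per-viewer emotion totals to an audience-profile table; all column names and figures are made up for illustration.

```python
import pandas as pd

# Made-up per-viewer emotion totals (seconds of each emotion on camera).
emotions = pd.DataFrame({
    "viewer_id": [1, 2, 3, 4],
    "joy_s":     [40, 12, 55, 8],
    "sad_s":     [3, 20, 1, 15],
})

# Made-up audience profiles from a test-screening sign-up form.
profiles = pd.DataFrame({
    "viewer_id": [1, 2, 3, 4],
    "age_band":  ["18-24", "35-44", "18-24", "35-44"],
    "gender":    ["F", "M", "M", "F"],
})

# Triangulate: average emotion duration per demographic group.
merged = emotions.merge(profiles, on="viewer_id")
print(merged.groupby(["age_band", "gender"])[["joy_s", "sad_s"]].mean())
```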

In conclusion, although emotion does not always show on the face, this technology could support an increased awareness of audience emotion within a larger audience base, or personalise the advertising of releases on VOD, for example, to target audience preferences.

 

This post was written by noxford