What is it about?

Understanding model predictions is important, but until now explainability research has focused largely on text and image data. Our work explores explaining the predictions of video models using a simple emotion classification model. We test our emotion explanation framework on a set of video advertisements and demonstrate through quantitative analysis that our method, based on LIME, can be applied to any video classification model.
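To give a feel for how a LIME-style explanation transfers to video, here is a minimal illustrative sketch (not the paper's implementation): the video is split into temporal segments, random segments are masked out, the classifier is queried on each perturbed video, and a weighted linear surrogate is fit whose coefficients rank segment importance. The function name, the equal-length temporal segmentation, and the kernel width are assumptions made for brevity.

```python
import numpy as np

def lime_video_explain(video, classify, n_segments=8, n_samples=200, seed=0):
    """Illustrative LIME-style sketch for video (not the authors' code).

    video:    array of shape (frames, H, W, C)
    classify: callable mapping a video array to a class probability
    Returns an importance score per temporal segment.
    """
    rng = np.random.default_rng(seed)
    frames = video.shape[0]
    # Equal temporal chunks -- a simplifying assumption; real systems may
    # use spatio-temporal superpixels instead.
    bounds = np.linspace(0, frames, n_segments + 1).astype(int)

    # Sample binary masks over segments; keep the unperturbed video too.
    masks = rng.integers(0, 2, size=(n_samples, n_segments))
    masks[0] = 1
    preds = np.empty(n_samples)
    for i, m in enumerate(masks):
        perturbed = video.copy()
        for s in range(n_segments):
            if m[s] == 0:
                perturbed[bounds[s]:bounds[s + 1]] = 0  # black out segment
        preds[i] = classify(perturbed)

    # Weight samples by closeness to the original video (exponential kernel).
    dist = 1 - masks.mean(axis=1)
    w = np.exp(-(dist ** 2) / 0.25)

    # Weighted ridge regression: coefficients act as segment importances.
    X = np.hstack([masks.astype(float), np.ones((n_samples, 1))])
    W = np.diag(w)
    coef = np.linalg.solve(X.T @ W @ X + 1e-3 * np.eye(n_segments + 1),
                           X.T @ W @ preds)
    return coef[:n_segments]
```

With a toy classifier that only looks at the first two frames, the surrogate correctly assigns the highest importance to the first temporal segment.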


Why is it important?

Understanding model predictions is becoming crucial as new legislation is introduced to increase user trust in models. We chose a video advertisement dataset for our framework because it is an area of growing interest, and we believe our system will help creative designers make more appealing and effective advertisements.

Perspectives

As this was my first paper, I enjoyed the process of developing a novel framework that I truly believe can be used in many applications. I am excited to see where this video explainability technique is used and what impact it will have.

Joachim Vanneste
University of Glasgow

Read the Original

This page is a summary of: Detecting and Explaining Emotions in Video Advertisements, July 2024, ACM (Association for Computing Machinery),
DOI: 10.1145/3626772.3657664.
