What is it about?
Videos are dynamic and multimodal compared with other types of content, which makes automatic filtering difficult and human content moderators essential. However, video content moderators face deeper emotional labor because videos carry rich visual information, sometimes including harmful content such as violent or terrifying scenes. In this work, we explore the effect of six intervention techniques on alleviating negative emotions during video content moderation tasks. Through one online crowdsourcing experiment and two controlled user studies, we found that (i) interleaving task videos with positive videos or (ii) cartoonizing them significantly reduced moderators' negative emotions. Participants reported that these approaches help reduce negative emotions at the moment of moderation, whereas existing coping strategies focus on post-task activities (e.g., relaxation, talking with others, or pursuing a hobby). We discuss how our findings apply to broader tasks, including ways to improve the intervention techniques themselves.
Featured image: Photo by Glenn Carstens-Peters on Unsplash
Why is it important?
We propose four new intervention techniques in addition to the previously studied blurring and grayscaling. We provide evidence on whether the six techniques can be applied effectively to video content, how well they reduce negative emotions, and the specific effects of each. Both quantitative and qualitative analyses show that interleaving task videos with positive videos, or applying cartoonization to them, significantly reduces negative emotions during video moderation tasks. Finally, we summarize the key findings from our experiments and offer practical guidance for designing future intervention techniques.
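The paper itself does not include implementations, but visual interventions like blurring and grayscaling amount to per-frame image filters. The sketch below is a minimal, hypothetical illustration in NumPy (function names and the toy video array are our own, not from the paper), showing how such a filter could be applied to every frame before a moderator sees it:

```python
import numpy as np

def grayscale(frame):
    """Replace color with luminance (the grayscaling intervention).

    Uses ITU-R BT.601 luma weights; frame is an (H, W, 3) RGB array in [0, 1].
    """
    luma = frame @ np.array([0.299, 0.587, 0.114])
    return np.repeat(luma[..., None], 3, axis=-1)

def box_blur(frame, k=5):
    """Crude k-by-k box blur as a stand-in for the blurring intervention."""
    pad = k // 2
    padded = np.pad(frame, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(frame, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return out / (k * k)

# Hypothetical video: 10 RGB frames of 64x64 pixels.
video = np.random.rand(10, 64, 64, 3)

# Soften every frame before showing it to the moderator.
softened = np.stack([box_blur(grayscale(f)) for f in video])
```

A production system would of course use an optimized library filter (and a stylization filter for cartoonization), but the structure is the same: a pure per-frame transform inserted between the raw video and the moderator's screen.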
Read the Original
This page is a summary of: Exploring Intervention Techniques to Alleviate Negative Emotions during Video Content Moderation Tasks as a Worker-centered Task Design, July 2024, ACM (Association for Computing Machinery). DOI: 10.1145/3643834.3660708.