What is it about?
Digital experiments are routinely used to test the value of a treatment relative to a status quo control setting. The treatment effect can differ across major sub-groups defined by user characteristics. We propose a framework for detecting and analyzing these heterogeneities in treatment effects in digital experiments. Analysis of 27 real-world experiments spanning 1.76 billion sessions, together with simulated data, demonstrates the effectiveness of our framework relative to existing techniques.
Why is it important?
As digital experiments have become increasingly pervasive in organizations and across a wide variety of research areas, their growth has created new challenges for experimentation platforms. One challenge is that experiments often focus on the average treatment effect (ATE) without explicitly considering how the effect varies across major sub-groups, that is, the heterogeneous treatment effect (HTE). This is especially problematic because ATEs have shrunk in many organizations as the more obvious improvements have already been realized.
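To make the ATE versus HTE distinction concrete, the following is a minimal sketch, not the framework from the paper: it uses synthetic data, two hypothetical sub-groups ("new" and "returning" users), and a plain difference-in-means estimator to show how an overall ATE can mask opposite effects in different sub-groups.

```python
# Illustrative sketch only, not the paper's method. Assumptions: synthetic data,
# two user sub-groups, binary treatment, difference-in-means estimation.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 100_000

# Simulate an experiment where the treatment helps new users but slightly
# hurts returning users, so the overall average effect looks small.
group = rng.choice(["new", "returning"], size=n)
treated = rng.integers(0, 2, size=n)
baseline = np.where(group == "new", 1.0, 3.0)
effect = np.where(group == "new", 0.30, -0.10)
outcome = baseline + treated * effect + rng.normal(0, 1, size=n)

df = pd.DataFrame({"group": group, "treated": treated, "outcome": outcome})

def diff_in_means(d: pd.DataFrame) -> float:
    """Simple difference-in-means treatment effect estimate."""
    return (d.loc[d.treated == 1, "outcome"].mean()
            - d.loc[d.treated == 0, "outcome"].mean())

print("ATE (all users):", round(diff_in_means(df), 3))
for g, d in df.groupby("group"):
    print(f"Effect for {g} users:", round(diff_in_means(d), 3))
```

In this toy setup the pooled estimate sits near the weighted average of the two sub-group effects, which is exactly the situation where an ATE-only analysis would miss the heterogeneity the paper's framework is designed to detect.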
Read the Original
This page is a summary of: Examining User Heterogeneity in Digital Experiments, ACM Transactions on Information Systems, January 2023, ACM (Association for Computing Machinery). DOI: 10.1145/3578931.