What is it about?
There has long been an interest in understanding how we decide when and where to move our eyes, and psychophysical experiments have uncovered many underlying mechanisms. Under controlled laboratory conditions, objects in the scene play an important role in guiding our attention. Due to the visual complexity of the world around us, however, it is hard to assess experimentally how objects influence eye movements when observing dynamic real-world scenes. Computational models have proved to be a powerful tool for investigating visual attention, but existing models are either only applicable to images or restricted to predicting where humans look on average. Here, we present a computational framework for simulating where and when humans decide to move their eyes when observing dynamic real-world scenes.
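To make the idea of "simulating where and when" concrete, here is a minimal toy sketch, not the authors' actual model: at each step a fixation location is sampled from a priority map over the video frame ("where") and a fixation duration is drawn from a simple distribution ("when"). All names, map shapes, and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_scanpath(priority_maps, n_fixations=5):
    """Toy scanpath simulation.

    priority_maps -- array of shape (n_frames, height, width) with
                     non-negative values (e.g. a saliency or object map)
    Returns a list of (time_in_seconds, x, y) fixation events, so the
    output captures both *where* and *when* gaze moves.
    """
    t = 0.0
    scanpath = []
    n_frames, h, w = priority_maps.shape
    for _ in range(n_fixations):
        frame = min(int(t * 30), n_frames - 1)       # assume 30 fps video
        p = priority_maps[frame].ravel()
        p = p / p.sum()
        idx = rng.choice(p.size, p=p)                # sample "where"
        y, x = divmod(idx, w)
        duration = rng.gamma(shape=2.0, scale=0.15)  # sample "when" (seconds)
        scanpath.append((t, x, y))
        t += duration
    return scanpath

# Toy priority maps: a single bright "object" region in 100x100 frames.
maps = np.full((10, 100, 100), 0.01)
maps[:, 40:60, 40:60] = 1.0
print(simulate_scanpath(maps))
```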
Why is it important?
Using our framework, we can assess the influence of objects on the model predictions. We find that including object-based attention in the model increases the resemblance of simulated eye movements to human gaze behavior, showing that objects do indeed play an important role in guiding our gaze when exploring the world around us. We hope that the availability of this framework encourages more research on attention in dynamic real-world scenes.
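One simple way to probe how strongly objects attract gaze, sketched below under illustrative assumptions (the data layout and function name are hypothetical, not the framework's API), is to compute the fraction of fixations that land on annotated objects for human and simulated gaze and compare the two.

```python
import numpy as np

def fraction_on_objects(fixations, object_masks):
    """Fraction of fixations that land inside any annotated object mask.

    fixations    -- array of shape (n, 3): frame index, x, y (pixels)
    object_masks -- boolean array of shape (n_frames, height, width),
                    True where a pixel belongs to an annotated object
    """
    hits = 0
    for frame, x, y in fixations:
        if object_masks[int(frame), int(round(y)), int(round(x))]:
            hits += 1
    return hits / len(fixations)

# Toy data: one 100x100 frame with a single "object" square.
masks = np.zeros((1, 100, 100), dtype=bool)
masks[0, 40:60, 40:60] = True

human_fix = np.array([[0, 50, 50], [0, 45, 55], [0, 10, 10]])
model_fix = np.array([[0, 52, 48], [0, 80, 20], [0, 44, 58]])

print("human on-object fraction:", fraction_on_objects(human_fix, masks))
print("model on-object fraction:", fraction_on_objects(model_fix, masks))
```

The closer the model's on-object fraction is to the human one, the more its simulated gaze resembles human object-guided behavior.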
Read the Original
This page is a summary of: Objects guide human gaze behavior in dynamic real-world scenes, PLOS Computational Biology, October 2023, DOI: 10.1371/journal.pcbi.1011512.