What is it about?
Future VR systems could sense users' mental states to improve interactions. Would combining brain signals (EEG) with pupil data from eye tracking help decode whether a user's attention is directed externally, toward the virtual environment, or internally, toward their own thoughts? Our results show that this multimodal approach does improve prediction accuracy, and that frontal theta power is the most effective EEG feature.
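To give a rough sense of what feature-level multimodal fusion looks like in practice, here is a minimal sketch in Python with scikit-learn on synthetic data. The feature names, the classifier, and the evaluation setup are illustrative assumptions, not the pipeline used in the paper.

```python
# A minimal sketch (not the authors' pipeline): comparing unimodal and
# fused EEG + eye-tracking features for binary attention classification.
# All data here is synthetic; feature names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical per-trial features: frontal theta power (EEG) and pupil diameter.
frontal_theta = rng.normal(size=(n_trials, 1))   # EEG feature
pupil_diameter = rng.normal(size=(n_trials, 1))  # eye-tracking feature
labels = rng.integers(0, 2, size=n_trials)       # 0 = external, 1 = internal attention

for name, X in {
    "EEG only": frontal_theta,
    "Pupil only": pupil_diameter,
    "Multimodal": np.hstack([frontal_theta, pupil_diameter]),
}.items():
    # 5-fold cross-validated accuracy for each feature set
    acc = cross_val_score(RandomForestClassifier(random_state=0), X, labels, cv=5).mean()
    print(f"{name}: {acc:.2f}")
```

The fusion step is simply concatenating the per-trial feature vectors from both modalities before training one classifier; on real data, that is where a multimodal gain over either signal alone would show up.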
Read the Original
This page is a summary of: Multimodal Detection of External and Internal Attention in Virtual Reality using EEG and Eye Tracking Features, September 2024, ACM (Association for Computing Machinery), DOI: 10.1145/3670653.3670657.