What is it about?

The authors designed SocialCueSwitch, a toolkit that makes social interactions in virtual reality (VR) accessible to people with different sensory abilities. In VR, social cues like gestures, eye contact, and proximity are typically visual or auditory, excluding people who are blind or have low vision (BLV) and those who are deaf or hard of hearing (DHH). SocialCueSwitch allows these cues to be represented in multiple ways, including haptic feedback (touch), so users can choose the methods that best suit their needs. For example, spatial audio can indicate who is speaking for BLV users, while visual indicators or captions can help DHH users. Designed for easy integration into existing VR systems, the toolkit aims to make VR more inclusive and customizable, enhancing social interactions for everyone.


Why is it important?

SocialCueSwitch addresses the accessibility of virtual reality (VR) environments by focusing on social cues. While many existing accessibility tools in VR address literal cues, such as identifying colors or reading text, SocialCueSwitch differs by translating social interactions—like gestures, eye contact, and proximity—into multiple sensory modalities, including haptic feedback. This approach allows users, especially those who are blind or have low vision (BLV) and those who are deaf or hard of hearing (DHH), to receive these crucial social cues in ways that suit their sensory preferences. By enabling such nuanced and customizable interactions, SocialCueSwitch not only makes VR more inclusive but also enhances the social experience for a diverse range of users by providing clearer feedback in multiple modalities. The system may also benefit anyone who would like clearer indications of social cues, such as people who are neurodiverse.

Perspectives

Writing this article was a great pleasure, as was working with such a talented team of researchers. Virtual reality (VR) is a fascinating technology, and contributing to its accessibility is rewarding. Our team shared a vision of making VR experiences more accessible to everyone, including those who are blind or have low vision (BLV) and those who are deaf or hard of hearing (DHH). The idea that more people can enjoy and benefit from the unique affordances of VR is a major motivating factor of this work. This project has reinforced my belief in the importance of inclusivity in technology, and I am excited about the possibilities that SocialCueSwitch opens up for making VR a more inclusive space.

Jonathan Isaac Segal
Cornell University

Read the Original

This page is a summary of: SocialCueSwitch: Towards Customizable Accessibility by Representing Social Cues in Multiple Senses, May 2024, ACM (Association for Computing Machinery),
DOI: 10.1145/3613905.3651109.
