A computational technique to estimate 3D point of gaze in virtual space
What is it about?
Emerging applications of active 3D technology have raised expectations for more user-friendly stereoscopic 3D content. To optimize stereoscopic 3D content while taking human factors into account, understanding how users gaze in 3D directions in virtual space is an important research topic. In this paper, we propose a novel approach to estimate the 3D point of gaze on a stereoscopic display using an optimized geometric method. A dual-camera system and the Direct Linear Transformation (DLT) algorithm are used to compute user-dependent parameters and the 3D coordinates of the pupil. Instead of asking the user to perform a time-consuming 3D calibration session with more than twenty calibration points, we propose a simplified 3D calibration method that uses only three calibration points. Our experimental results show that the proposed method achieves better accuracy than a conventional geometric method, with average errors of 0.83, 0.87, and 1.06 cm in the X, Y, and Z dimensions, respectively. Compared with subjective depth judgment, the proposed gaze tracking system is more robust in measuring the depth of virtual 3D objects of various sizes.
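The paper itself does not publish code, but the DLT step it mentions, recovering a 3D point (here, the pupil center) from its 2D projections in two calibrated cameras, can be sketched as follows. The function name, the synthetic camera matrices, and the test point are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Recover a 3D point from two 2D observations via DLT.

    P1, P2 : 3x4 camera projection matrices (from camera calibration).
    x1, x2 : (u, v) image coordinates of the same point in each view.
    Each observation contributes two linear constraints on the
    homogeneous 3D point X; stacking them gives A @ X = 0.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector
    # associated with the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize to (X, Y, Z)
```

With two calibrated views of the pupil, this yields the pupil's 3D coordinates, which the geometric method then combines with the user-dependent calibration parameters to estimate the 3D point of gaze.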
Why is it important?
Our approach is useful for researchers in the field of human-computer interaction, particularly those working on virtual reality and gaze-based interaction. VR is becoming more realistic now that 3D content can be presented with foveated rendering. For example, Nvidia recently developed a method that renders at high resolution only the specific location the user is looking at, while the rest of the scene is rendered at lower resolution. This foveated rendering technology (high-resolution rendering of the specific location the user gazes at) relies fundamentally on gaze tracking. However, accurate estimation of the 3D point of gaze is needed to associate what the user looks at with the true location of their gaze. Our method can be used as an alternative approach to validate 3D gaze tracking systems in VR.
The following have contributed to this page: Dr. Eng. Sunu Wibirama