Foveated Haptic Gaze


As digital worlds become ubiquitous via video games, simulations, virtual and augmented reality, people with disabilities who cannot access those worlds are becoming increasingly disenfranchised. More often than not, the design of these environments focuses on vision, making them inaccessible in whole or in part to people with visual impairments. Accessible games and visual aids have been developed, but their scarcity and unintuitive interfaces make them impractical for daily use. To address this gap, we present Foveated Haptic Gaze, a method for conveying visual information via haptics that is intuitive and designed for interacting with real-time 3-dimensional environments. To validate our approach, we developed a prototype of the system along with a simplified first-person shooter game. Lastly, we present encouraging user study results of both sighted and blind participants using our system to play the game with no visual feedback.


💡 Research Summary

This paper addresses the critical issue of accessibility in digital virtual worlds (video games, simulations, VR/AR), which are predominantly designed around visual interaction, thereby excluding people with visual impairments. To bridge this gap, the authors propose “Foveated Haptic Gaze (FHG),” a novel sensory substitution method that intuitively conveys visual information through haptics, specifically designed for real-time interaction with 3D environments.

The core innovation of FHG lies in its biomimetic design, inspired by the human visual system’s foveated vision, which combines high-acuity central focus (the fovea) with broad, low-detail peripheral awareness. The FHG system replicates this dual-channel approach through two integrated haptic interfaces:

1) A haptic glove equipped with fingertip vibration motors. By pointing their hand, a user directs their “gaze.” When this gaze intersects with an object in the virtual environment, the object’s identity (e.g., monster, explosive barrel) is communicated via a distinct vibration pattern on the glove, analogous to high-detail foveal vision.

2) A haptic back display (a chair embedded with vibration actuators). This display maps the entire field of view of the user’s avatar, providing continuous vibrotactile feedback about the presence and coarse relative location of all objects within that field. This serves as “haptic peripheral vision,” granting broad situational awareness.

Crucially, the system also displays the current position of the user’s gaze (hand direction) on the back display. This allows users to spatially relate their focus to their periphery, enabling them to “look at” an object by aligning the gaze vibration with the object’s location vibration on their back—mimicking the natural act of shifting gaze from periphery to focus.
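The two haptic channels described above can be sketched as a simple angular mapping. The actuator grid size, field of view, tolerance, and all function names below are illustrative assumptions, not details taken from the paper:

```python
# Hypothetical sketch of FHG's two channels: the back display maps each
# in-view object's direction to one vibration cell (plus a gaze cursor),
# and the foveal channel reports the identity of whatever the hand-gaze hits.
from dataclasses import dataclass

BACK_ROWS, BACK_COLS = 4, 8   # assumed actuator grid in the chair
FOV_H, FOV_V = 90.0, 60.0     # assumed avatar field of view, in degrees

@dataclass
class SceneObject:
    kind: str      # e.g. "monster" or "barrel"
    yaw: float     # horizontal angle from view center, degrees
    pitch: float   # vertical angle from view center, degrees

def to_cell(yaw, pitch):
    """Map an in-FOV direction to a (row, col) actuator cell, or None."""
    if abs(yaw) > FOV_H / 2 or abs(pitch) > FOV_V / 2:
        return None
    col = int((yaw + FOV_H / 2) / FOV_H * BACK_COLS)
    row = int((pitch + FOV_V / 2) / FOV_V * BACK_ROWS)
    return min(row, BACK_ROWS - 1), min(col, BACK_COLS - 1)

def back_display_frame(objects, gaze_yaw, gaze_pitch):
    """One frame of 'haptic peripheral vision': objects plus the gaze cursor."""
    frame = {}
    for obj in objects:
        cell = to_cell(obj.yaw, obj.pitch)
        if cell is not None:
            frame[cell] = "object"
    gaze_cell = to_cell(gaze_yaw, gaze_pitch)
    if gaze_cell is not None:
        frame[gaze_cell] = "gaze"   # gaze cursor overrides, so it stands out
    return frame

def foveal_pattern(objects, gaze_yaw, gaze_pitch, tolerance=5.0):
    """Fingertip channel: identity of the object the gaze currently 'hits'."""
    for obj in objects:
        if abs(obj.yaw - gaze_yaw) < tolerance and abs(obj.pitch - gaze_pitch) < tolerance:
            return obj.kind   # would select that object's vibration pattern
    return None
```

Aligning the “gaze” cell with an “object” cell on the back display is what lets a user bring an object into haptic focus, at which point the glove channel names it.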

To validate the system’s efficacy for complex, real-time interaction, the researchers developed a simplified first-person shooter (FPS) game using the ViZDoom platform. The game environment consisted of 10 connected rooms populated with monsters (targets) and explosive barrels (to be avoided). All visual feedback was disabled; players had to rely entirely on the FHG system to navigate, identify objects, and aim/shoot using a trigger on the haptic glove.
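A haptic-only play loop of this kind might look as follows. This is a minimal sketch with hypothetical names; the actual ViZDoom integration, movement controls, and vibration hardware are omitted:

```python
# Sketch of one tick of haptic-only play (all names are assumptions):
# the scene drives both haptic channels, and a glove-trigger press
# resolves a shot against whatever the gaze is currently on.
def tick(scene, gaze_yaw, trigger_pressed, tol=5.0):
    """scene: list of (kind, yaw) pairs. Returns (haptic events, thing shot)."""
    events, shot = [], None
    for kind, yaw in scene:
        events.append(("back", yaw))        # peripheral channel: location
        if abs(yaw - gaze_yaw) <= tol:
            events.append(("glove", kind))  # foveal channel: identity
    if trigger_pressed:
        for kind, yaw in scene:
            if abs(yaw - gaze_yaw) <= tol:
                shot = kind                 # the shot lands on the gazed object
                break
    return events, shot
```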

A user study was conducted with five participants with visual impairments and ten sighted participants. The study aimed to evaluate the learnability and effectiveness of the FHG interface and to observe the impact of prior visual experience on using a non-visual spatial interface. Participants were first trained on the core concepts: using hand movement to control gaze, interpreting the haptic peripheral vision from the back display, and identifying objects via the glove’s vibration codes. They then played the game, with their score (monsters shot minus barrels shot) serving as the performance metric.
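The performance metric can be stated directly (the event-log format below is an assumption):

```python
# The study's scoring rule: monsters shot minus barrels shot.
def score(hits):
    """hits: list of "monster"/"barrel" strings, one per object shot."""
    return hits.count("monster") - hits.count("barrel")
```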

The results indicated that both blind and sighted participants could successfully learn and use the FHG system to play the FPS game without any visual feedback. This demonstrates the system’s intuitive design and its potential to enable meaningful interaction with dynamic 3D environments. The paper concludes that Foveated Haptic Gaze presents a promising new paradigm for accessibility, offering an intuitive, exploratory, and spatially coherent method for people with visual impairments to access increasingly prevalent virtual and augmented reality worlds. The work lays the foundation for future development of more sophisticated assistive technologies that can generalize to real-world interactive visual scenarios.

