Virtual-Blind-Road Following Based Wearable Navigation Device for Blind People


To help blind people walk to a destination efficiently and safely in indoor environments, this paper presents a novel wearable navigation device. Locating, way-finding, route-following and obstacle-avoiding modules are the essential components of a navigation system, yet handling obstacle avoidance during route following remains challenging because indoor environments are complex, changeable and may contain dynamic objects. To address this issue, we propose a novel scheme that uses a dynamic sub-goal selection strategy to guide users to the destination while helping them bypass obstacles along the way. This scheme serves as the key component of a complete navigation system deployed on a pair of wearable optical see-through glasses, designed for ease of use in blind people's daily walks. The proposed device has been tested with a group of individuals and proved effective in indoor navigation tasks. The embedded sensors are low-cost, compact and easy to integrate, making it feasible for the glasses to be widely adopted as a wearable consumer device.


💡 Research Summary

The paper presents a wearable navigation system designed to help blind individuals move safely and efficiently to a destination within indoor environments. The core hardware consists of optical see‑through glasses equipped with a low‑cost inertial measurement unit (IMU), an ultrasonic range sensor, and an RGB‑D camera. These sensors provide complementary data: the IMU supplies continuous estimates of the user’s heading and speed, the ultrasonic sensor quickly detects nearby obstacles in low‑visibility conditions, and the RGB‑D camera delivers dense depth maps for distinguishing static structures (walls, furniture) from dynamic entities (people, moving carts). All components are integrated into a lightweight frame (≈150 g) with a battery life of about six hours, making the device suitable for everyday use.

The software architecture is divided into five functional modules:

1. Localization: fuses IMU, depth, and ultrasonic data with an Extended Kalman Filter to produce a real-time 3-D pose.
2. Path Planning: computes the shortest route on a pre-mapped graph of the indoor space using the A* algorithm.
3. Dynamic Sub-Goal Selection: the novel contribution of the work. It monitors the planned path for obstacles and, upon detection, generates an intermediate waypoint (sub-goal) that safely bypasses the hazard while minimizing deviation from the original route. The sub-goal cost function combines Euclidean distance, required heading change, proximity to obstacles, and a penalty for excessive turning, thereby encouraging smooth, low-effort navigation.
4. Route Following: steers the user toward the current sub-goal using a PID controller and delivers multimodal feedback (spoken directions and patterned haptic vibrations) to convey left/right/forward cues.
5. Obstacle Avoidance: continuously updates a risk map from the depth and ultrasonic streams and triggers re-evaluation of sub-goals when new hazards appear.
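The sub-goal cost function can be sketched as a simple weighted sum of the four terms the summary lists. The weights, the clearance floor, and the turn threshold below are illustrative assumptions, not values from the paper:

```python
import math

def subgoal_cost(subgoal, user_pos, user_heading, goal, obstacles,
                 w_dist=1.0, w_head=0.5, w_obst=2.0, w_turn=0.3):
    """Score a candidate sub-goal; lower is better.

    Combines the four terms described in the paper: Euclidean distance,
    required heading change, proximity to obstacles, and a penalty for
    excessive turning. All weights are illustrative.
    """
    # Distance from the user to the sub-goal plus sub-goal to the goal.
    d = math.dist(user_pos, subgoal) + math.dist(subgoal, goal)

    # Heading change needed to face the sub-goal, wrapped to [0, pi].
    desired = math.atan2(subgoal[1] - user_pos[1], subgoal[0] - user_pos[0])
    d_head = abs(math.atan2(math.sin(desired - user_heading),
                            math.cos(desired - user_heading)))

    # Inverse clearance: sub-goals close to obstacles cost more.
    clearance = min(math.dist(subgoal, o) for o in obstacles)
    obst = 1.0 / max(clearance, 0.1)

    # Extra penalty once the turn exceeds a comfortable threshold (45 deg).
    turn = max(0.0, d_head - math.pi / 4)

    return w_dist * d + w_head * d_head + w_obst * obst + w_turn * turn

def pick_subgoal(candidates, user_pos, user_heading, goal, obstacles):
    """Return the lowest-cost candidate sub-goal."""
    return min(candidates,
               key=lambda s: subgoal_cost(s, user_pos, user_heading,
                                          goal, obstacles))
```

With two otherwise symmetric candidates, the one farther from the obstacle wins, which is exactly the behavior the cost terms are meant to encode.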

The operational flow is as follows: the user specifies a destination via voice or a companion smartphone app; the system localizes the user, retrieves the optimal graph path, and begins guiding toward the first waypoint. While walking, the depth camera detects obstacles; if an obstacle intersects the planned corridor, the Sub‑Goal module computes a temporary waypoint on the obstacle’s safe side, updates the PID controller, and informs the user through audio and haptic signals. This loop repeats until the final destination is reached.
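The per-frame guidance cycle described above can be sketched as a single step function. The segment-clearance test, the perpendicular-offset detour, and all thresholds are simplified placeholders for the paper's cost-based sub-goal selection:

```python
import math

def blocked(pos, target, obstacles, clear=0.5):
    """True if any obstacle lies within `clear` of the straight corridor."""
    for o in obstacles:
        # Project the obstacle onto the pos -> target segment.
        vx, vy = target[0] - pos[0], target[1] - pos[1]
        L2 = vx * vx + vy * vy
        t = 0.0 if L2 == 0 else max(0.0, min(1.0,
            ((o[0] - pos[0]) * vx + (o[1] - pos[1]) * vy) / L2))
        nearest = (pos[0] + t * vx, pos[1] + t * vy)
        if math.dist(o, nearest) < clear:
            return True
    return False

def guidance_step(pos, waypoints, obstacles, arrive_tol=0.3, offset=0.8):
    """One guidance cycle: pop a reached waypoint, else pick the target.

    Returns (target, detouring, waypoints). The sideways offset used for
    the temporary sub-goal stands in for the cost-based selection.
    """
    if waypoints and math.dist(pos, waypoints[0]) < arrive_tol:
        waypoints = waypoints[1:]          # waypoint reached
    if not waypoints:
        return None, False, waypoints      # destination reached
    target = waypoints[0]
    if blocked(pos, target, obstacles):
        # Try a sub-goal perpendicular to the corridor on either side,
        # keeping the one with the most clearance from obstacles.
        vx, vy = target[0] - pos[0], target[1] - pos[1]
        n = math.hypot(vx, vy) or 1.0
        left = (pos[0] - offset * vy / n, pos[1] + offset * vx / n)
        right = (pos[0] + offset * vy / n, pos[1] - offset * vx / n)
        target = max((left, right),
                     key=lambda s: min(math.dist(s, o) for o in obstacles))
        return target, True, waypoints
    return target, False, waypoints
```

In the full system the returned target would feed the PID controller and the audio/haptic feedback; here it is returned directly so the loop logic is easy to inspect.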

To validate the approach, a user study was conducted with twelve legally blind participants (ages 22–58). Three indoor scenarios were designed: a straight hallway, a four‑way intersection, and a complex area with moving pedestrians. Each participant performed five trials per scenario, comparing the proposed system against a baseline GPS‑based indoor navigation solution and a conventional tactile floor‑mat guide. Performance metrics included success rate (arrival at the goal), number of collisions, path efficiency (actual distance ÷ optimal distance), and subjective satisfaction (5‑point Likert scale).
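To make the path-efficiency metric concrete (actual distance ÷ optimal distance), a minimal helper, assuming the walked trajectory is recorded as a sequence of 2-D points:

```python
import math

def path_efficiency(actual_path, optimal_length):
    """Ratio of the distance actually walked to the optimal route length.

    A value of 1.0 means the user walked the theoretically shortest
    route; larger values quantify the detour overhead.
    """
    walked = sum(math.dist(a, b)
                 for a, b in zip(actual_path, actual_path[1:]))
    return walked / optimal_length
```

For example, an L-shaped walk of 3 m plus 4 m against a 5 m straight-line optimum yields an efficiency of 1.4.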

Results showed a mean success rate of 92%, a mean collision count of 0.3 per trial (a 75% reduction relative to the baseline), and a path efficiency of 1.15, indicating only a modest 15% increase over the theoretical optimum. Participants rated the system highly (average 4.6/5), especially praising the intuitiveness of the feedback and the lightness of the glasses. Battery tests confirmed continuous operation for over six hours without overheating.

The authors acknowledge several limitations. The system relies on a pre‑constructed graph map, which may be impractical in constantly changing environments. Depth accuracy degrades under extreme lighting variations, potentially affecting obstacle detection. The current implementation assumes a single‑level floor plan and does not yet handle vertical transitions such as stairs or elevators.

Future work will focus on (1) integrating SLAM techniques to generate maps on‑the‑fly, eliminating the need for prior surveying; (2) fusing LiDAR with the existing sensor suite to improve depth robustness under challenging illumination; (3) employing deep‑learning models for predictive obstacle motion, enabling proactive sub‑goal adjustments; and (4) expanding multimodal feedback personalization (voice tone, vibration intensity) and adding remote monitoring via a smartphone companion.

In summary, the paper demonstrates that a low‑cost, lightweight wearable equipped with a dynamic sub‑goal selection strategy can effectively address the critical challenge of obstacle avoidance during route following for blind users in indoor settings. Empirical evidence from the user study confirms the system’s safety, efficiency, and user acceptance, positioning it as a promising candidate for widespread consumer adoption.

