Some forces in nature are difficult to comprehend due to their non-intuitive and abstract nature. The forces driving gyroscopic precession are invisible, yet their effects are important in a variety of applications, from space navigation to motion tracking. Current advancements in haptic interfaces enable the development of revolutionary user interfaces that combine multiple modalities: tactile, visual and auditory. Tactile-augmented user interfaces have been deployed in a variety of areas, from surgical training to elementary education. This research provides an overview of haptic user interfaces in higher education and presents the development and assessment of a haptic user interface that supports the learner's understanding of gyroscopic precession forces. The proposed visual-haptic simulator is one module in a series of simulators targeting the representation of complex concepts through multimodal user interfaces. Various higher education domains, from classical physics to mechanical engineering, will benefit from the mainstream adoption of multimodal interfaces for hands-on training and content delivery. Experimental results are promising and underline the valuable impact that haptic user interfaces have on the understanding of abstract concepts through kinesthetic learning and hands-on practice.
Deep Dive into Kinesthetic Learning -- Haptic User Interfaces for Gyroscopic Precession Simulation
Torque-induced precession (i.e., gyroscopic precession) is a physical phenomenon in which the axis of a spinning object (e.g., a gyroscope) describes a cone in space when an external torque is applied to it. One can feel the precession forces by spinning a wheel and attempting to modify the orientation of the spin axis. Gyroscopes serve a very important function in both simple and highly advanced navigational devices, because precession and angular velocity are integral to modern navigation concepts. From air to sea, these concepts help pilots determine height, depth and various other pieces of information required for safe navigation. Gyroscopes come in a wide variety of forms, from mechanical to optical and from macro- to micro-scale, and they are employed in systems for guidance, attitude reference and stabilization, applications for tracking and pointing, as well as flight data analysis (Passaro et al., 2017). Understanding the relationship among gyroscopic precession, angular momentum and angular velocity is a fundamental part of college-level physics and engineering education worldwide. Gyroscopic precession, conservation of momentum and other associated abstract concepts are difficult for freshmen to understand, and faulty mental models can generate confusion. Many students have difficulty understanding abstract physics and/or mechanical engineering concepts taught using traditional teaching methods. When learners resort to memorization rather than reasoning, they find it difficult to apply and adapt what they learn to new situations. In the US, the Science, Technology, Engineering and Mathematics (STEM) initiative is targeted at helping learners gain knowledge and hone their reasoning skills. Individual experimentation, the observation of force vectors and the simulation of abstract concepts facilitate and improve learners' mental models and their capacity to understand complex systems.
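The relationship among torque, angular momentum and precession described above can be illustrated with a short numerical sketch. Under the usual slow-precession approximation, the precession rate is the applied torque divided by the spin angular momentum, Ω = τ/L = mgr/(Iω). The function name and the bicycle-wheel numbers below are purely illustrative, not taken from the simulator described in this paper:

```python
def precession_rate(mass_kg, g, lever_arm_m, moment_of_inertia, spin_rate_rad_s):
    """Slow-precession approximation: Omega = tau / L = m*g*r / (I*omega)."""
    torque = mass_kg * g * lever_arm_m               # gravity-induced torque about the pivot (N*m)
    angular_momentum = moment_of_inertia * spin_rate_rad_s  # L = I * omega (kg*m^2/s)
    return torque / angular_momentum                 # precession angular velocity (rad/s)

# Example: a 2 kg bicycle wheel of radius 0.3 m (hoop: I = m*r^2),
# spinning at 20 rad/s, pivoted 0.2 m from the support point.
I = 2.0 * 0.3**2
omega_p = precession_rate(2.0, 9.81, 0.2, I, 20.0)
```

Note that the faster the wheel spins (larger ω), the slower it precesses, which is exactly the non-intuitive behavior learners struggle with.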
Understanding complex systems and thinking holistically are essential skills for engineers (Nelson et al., 2010). Spatial visualization skills and the correct judgment of forces are fundamental to a variety of disciplines, and are particularly important in STEM fields (Uttal and Cohen, 2012).
Haptic (e.g., force-feedback or vibrotactile) interfaces have been increasingly used over the past decade to convey tactile information through Haptic-based User Interfaces (HUIs). From the early stages of education, humans learn to identify objects and concepts through the sense of touch, as kinesthetic learners; hence, it makes sense to augment the visual channel provided by a Graphical User Interface (GUI) with tactile components. Using multimodal interfaces to present abstract concepts has the potential to increase learners' engagement and their capacity for understanding. In an effort to improve the delivery of abstract concepts to learners, we propose a haptic-enhanced user interface for the simulation of the forces involved in gyroscopic precession. The cost-effective system was deployed and assessed in a laboratory setup with the help of a sizable group of volunteers.
The paper is organized as follows: Section 2 provides an overview of related work on haptic interfaces for multimodal content delivery, with an emphasis on haptic systems for simulation and training. Section 3 presents the theoretical background associated with Gyroscopic Precession (GP). Section 4 describes the implementation of the visual-haptic simulator, beginning with the motivation and goals for its development, followed by a description of the graphical and haptic user interfaces. Section 5 defines the experimental setup and the partitioning of participants. Section 6 presents the assessment methodology and the analysis of the experimental results, followed by the conclusion and closing remarks.
Haptic User Interfaces provide users with cutaneous feedback and/or kinesthetic (force) feedback during interaction with computer-generated virtual elements or during remote object manipulation (robotic tele-manipulation). Haptic devices come in a wide variety of forms and shapes, from vibrotactile systems to complex robotic arms that track the position and orientation of the user's arms.
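To make the notion of force feedback concrete, a common way such devices render contact with a virtual element is penalty-based rendering: when the tracked probe penetrates a virtual surface, the device pushes back with a spring-damper force proportional to the penetration depth. The following minimal sketch assumes a one-dimensional "virtual wall"; the function name, gains and geometry are illustrative, not part of any specific device's API:

```python
def wall_force(probe_pos, probe_vel=0.0, wall_pos=0.0, stiffness=800.0, damping=2.0):
    """Penalty-based virtual wall: F = k*depth - b*velocity while penetrating.

    probe_pos: probe position along the wall normal (wall occupies pos < wall_pos)
    Returns the force (N) the haptic device should exert on the user's hand.
    """
    penetration = wall_pos - probe_pos        # positive when the probe is inside the wall
    if penetration <= 0.0:
        return 0.0                            # free space: no force rendered
    # Spring pushes the probe out; damping term dissipates energy for stability.
    return stiffness * penetration - damping * probe_vel
```

In practice such a force law is evaluated in a high-rate control loop (commonly around 1 kHz) so that contact feels stiff rather than spongy; the stiffness and damping gains must be tuned to the specific device to keep the loop stable.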
Haptic systems development is primarily driven by the medical field (i.e., surgical simulators, complex medical procedures) and the entertainment industry (i.e., video gaming). In the video gaming industry, HUIs have been heavily employed to increase realism by adding the sense of touch. Game development companies (e.g., Electronic Arts) have invested heavily in haptic-controller technology to bring enhanced realism to gaming through "real-pain" sensations (Stone, 2018). Popular games, such as Half-Life 2, support the Novint Falcon (Novint, 2018) haptic device with a "pistol grip" accessory.
However, even before the spread of haptic systems in the video gaming industry
…(Full text truncated)…