Breathe with Me: Synchronizing Biosignals for User Embodiment in Robots
Embodiment of users within robotic systems has been explored in human-robot interaction, most often in telepresence and teleoperation. In these applications, synchronized visuomotor feedback can evoke a sense of body ownership and agency, contributing to the experience of embodiment. We extend this work by employing embreathment, the representation of the user’s own breath in real time, as a means for enhancing user embodiment experience in robots. In a within-subjects experiment, participants controlled a robotic arm, while its movements were either synchronized or non-synchronized with their own breath. Synchrony was shown to significantly increase body ownership, and was preferred by most participants. We propose the representation of physiological signals as a novel interoceptive pathway for human-robot interaction, and discuss implications for telepresence, prosthetics, collaboration with robots, and shared autonomy.
💡 Research Summary
The paper “Breathe with Me: Synchronizing Biosignals for User Embodiment in Robots” introduces a novel interoceptive pathway—“embreathment”—that links a user’s real‑time respiration to the motion of a robotic arm. While prior work on embodiment in human‑robot interaction (HRI) has focused on visual‑motor synchrony, vibrotactile feedback, or EMG‑based control, this study explores whether synchronizing an internal physiological signal (breathing) can enhance the sense of body ownership, agency, and overall embodiment.
System Design
The authors built a low‑cost prototype consisting of an Arduino Tinkerkit Braccio robotic arm, a BreathClip respiration sensor (based on an ESP32 development board), a PlayStation 5 controller, and a laptop running custom control software. The respiration sensor streams data over Wi‑Fi; the software processes the signal in a dedicated thread using a sliding‑window integration (N = 10 samples) and computes the first‑order difference to detect inhalation versus exhalation. These binary phases are mapped to the shoulder and elbow joints of the arm: the arm lifts during inhalation and lowers during exhalation, creating a “breathing” motion that mirrors the user’s own breath. In the asynchronous condition, pre‑recorded breathing traces from four participants are randomly concatenated into a 2‑minute loop, ensuring that the robot’s motion is decoupled from the participant’s current respiration. The controller simultaneously allows participants to manipulate the arm’s base rotation, individual joints, and gripper, preserving task functionality across both conditions.
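The phase-detection step described above—sliding‑window integration over N = 10 samples followed by a first‑order difference—can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors’ released code; the class name and the threshold‑free rising/falling comparison are assumptions.

```python
from collections import deque


class BreathPhaseDetector:
    """Sliding-window integration plus first-order difference,
    as described for the control software. Illustrative sketch:
    names and details are assumptions, not the paper's exact code."""

    def __init__(self, window=10):
        self.window = deque(maxlen=window)   # sliding window of samples
        self.prev_sum = None                 # previous window integral

    def update(self, sample):
        """Feed one respiration sample; return 'inhale', 'exhale',
        or None while the window is still filling."""
        self.window.append(sample)
        if len(self.window) < self.window.maxlen:
            return None
        current = sum(self.window)           # window integration
        phase = None
        if self.prev_sum is not None:
            # first-order difference: rising integral -> inhalation
            phase = "inhale" if current > self.prev_sum else "exhale"
        self.prev_sum = current
        return phase
```

Each detected phase would then be mapped to joint motion: a rising window sum lifts the shoulder and elbow joints, a falling sum lowers them.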
Experimental Protocol
A within‑subjects study with 24 adult participants was conducted. Each participant performed a series of simple manipulation tasks while controlling the arm with the gamepad under two conditions: (1) Synchronized (Sync) – live respiration drives the arm’s breathing motion; (2) Asynchronous (Async) – a pre‑recorded breathing trace drives the motion. After each block, participants completed standardized embodiment questionnaires assessing Body Ownership, Agency, and Self‑Location (7‑point Likert scales), NASA‑TLX for workload, and a preference survey.
Results
Statistical analysis (repeated‑measures ANOVA, post‑hoc t‑tests) revealed that the Sync condition significantly increased Body Ownership scores by an average of 1.8 points (p < 0.01, η² = 0.42) and Agency by 1.4 points (p < 0.05, η² = 0.28) compared with Async. Self‑Location showed no reliable difference. Workload and perceived stress were comparable across conditions, indicating that breath‑driven synchrony does not add cognitive burden. Subjectively, 78 % of participants preferred the synchronized experience, often describing a feeling of “the robot becoming part of my body.”
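For readers wanting to replicate the post‑hoc analysis, a paired‑samples t statistic for a within‑subjects comparison needs no external packages. The scores below are synthetic placeholders for illustration only, not the study’s data.

```python
import math
from statistics import mean, stdev


def paired_t(x, y):
    """Paired-samples t statistic for within-subject scores:
    t = mean(d) / (sd(d) / sqrt(n)), with d the per-participant
    score differences between the two conditions."""
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))


# Synthetic 7-point Body Ownership scores (Sync vs. Async),
# invented for this sketch -- not the paper's data.
sync_scores = [6, 5, 6, 7, 5, 6]
async_scores = [4, 4, 5, 5, 3, 5]
t_stat = paired_t(sync_scores, async_scores)
```

The reported analysis also includes a repeated‑measures ANOVA with effect sizes (η²), which a full replication would compute with a statistics package.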
Technical Contributions
The implementation demonstrates real‑time biosignal‑driven robot control with sub‑20 ms latency, achieved through a multithreaded architecture that isolates sensor acquisition, signal processing, controller input, and motor command generation. The use of inexpensive, open‑source hardware (Arduino, ESP32) and lightweight communication protocols (UDP for respiration, USB for controller, serial for motor commands) makes the system readily replicable and extensible to other platforms.
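The thread isolation described above can be sketched as a dedicated UDP listener feeding a queue that the motor‑command thread drains. The port number and the little‑endian float payload are assumptions for this sketch, not the paper’s actual wire format.

```python
import queue
import socket
import struct
import threading


def respiration_listener(sample_q, stop, port=5005):
    """Dedicated thread: receive respiration samples over UDP and
    push them onto a queue. Port and 4-byte little-endian float
    payload are illustrative assumptions."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    sock.settimeout(0.1)  # short timeout so the loop can observe stop
    while not stop.is_set():
        try:
            data, _ = sock.recvfrom(64)
        except socket.timeout:
            continue
        (sample,) = struct.unpack("<f", data[:4])
        sample_q.put(sample)
    sock.close()


def motor_commander(sample_q, stop):
    """Dedicated thread: consume samples and build joint commands.
    A real implementation would write these over serial to the
    Braccio; here the command is only formatted as a string."""
    while not stop.is_set():
        try:
            sample = sample_q.get(timeout=0.1)
        except queue.Empty:
            continue
        command = f"shoulder={sample:.2f}"  # placeholder mapping
```

Keeping the blocking `recvfrom` in its own thread means sensor jitter never stalls motor‑command generation, which is one way to keep end‑to‑end latency low.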
Implications and Future Directions
By showing that interoceptive synchrony can boost embodiment without increasing workload, the work opens new avenues for HRI design. Potential applications include telepresence (making remote robots feel like extensions of the operator’s body), prosthetic control (enhancing agency through breath‑linked actuation), collaborative robots (improving trust and perceived animacy), and shared‑autonomy systems where the robot’s assistance adapts to the operator’s physiological state. Limitations include the static laboratory setting, a homogeneous adult sample, and the focus on a simple 2‑DOF breathing motion. Future research could explore multi‑modal biosignals (heart rate, skin conductance), varied breathing patterns (e.g., paced vs. spontaneous), emotional state coupling, and deployment on mobile or humanoid robots to test generalizability.
Conclusion
The study provides the first empirical evidence that real‑time breath synchronization—embreathment—significantly enhances users’ sense of embodiment in a robot. It establishes a new interoceptive‑exteroceptive coupling paradigm that complements existing visual‑motor approaches and holds promise for a broad range of HRI applications where a seamless sense of “self‑extension” is desirable.