MOFU: Development of a MOrphing Fluffy Unit with Expansion and Contraction Capabilities and Evaluation of the Animacy of Its Movements

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the [Original Paper Viewer] below or the Original ArXiv Source.

Robots designed for therapy and social interaction aim to evoke a sense of animacy in humans. While many studies have focused on lifelike appearance or joint-based movements, the effect of whole-body volume-changing movements, commonly observed in living organisms, has received little attention. In this study, we developed MOFU (MOrphing Fluffy Unit), a mobile robot capable of whole-body expansion and contraction using a single motor, enclosed in a fluffy exterior. MOFU employs a Jitterbug geometric transformation mechanism that enables smooth diameter changes from approximately 210 mm to 280 mm with a single actuator, and is equipped with a differential two-wheel drive mechanism for locomotion. We conducted an online survey using videos of MOFU behaviors and evaluated perceived animacy using the Godspeed Questionnaire Series. First, we compared stationary conditions with and without expansion–contraction and with and without rotational motion. Both expansion–contraction and rotation independently increased perceived animacy. Second, we examined whether presenting two MOFUs simultaneously would further enhance animacy perception, but no significant difference was observed; exploratory analyses were also conducted across four dual-robot motion conditions. Third, when expansion–contraction was combined with locomotion, animacy ratings were higher than for locomotion alone. These results suggest that whole-body volume-changing movements enhance perceived animacy in robots, indicating that physical volume change is an important design element for future social and therapeutic robots.


💡 Research Summary

This paper introduces MOFU (MOrphing Fluffy Unit), a mobile robot that can change its whole‑body volume by expanding and contracting its diameter from roughly 210 mm to 280 mm using a single actuator. The core of the system is a “Jitterbug” geometric transformation mechanism, originally developed for modular robots, which converts the linear motion of a lead screw into a smooth, continuous change in overall body volume. Because the Jitterbug’s natural deformation also induces a small rotation, a differential two‑wheel drive is employed to counteract this effect, allowing the robot to expand and contract without any noticeable spin.
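The reported diameter range understates how large the shape change actually is: for a roughly spherical body (as the summary describes), volume scales with the cube of the diameter. A quick sanity check:

```python
# Sphere volume scales with the cube of the diameter, so MOFU's
# reported diameter change (~210 mm to ~280 mm) implies a much
# larger volumetric change than the linear numbers suggest.

D_MIN_MM = 210.0
D_MAX_MM = 280.0

linear_ratio = D_MAX_MM / D_MIN_MM   # ~1.33x in diameter
volume_ratio = linear_ratio ** 3     # ~2.37x in volume

print(f"diameter ratio: {linear_ratio:.2f}x")
print(f"volume ratio:   {volume_ratio:.2f}x")
```

So the body more than doubles in volume between the contracted and expanded states.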

Mechanical design details include a 20 mm lead screw driven by a direct‑drive motor, a quiet linear guide, and a TPU‑printed polyhedral frame that gives the robot a spherical shape. The exterior is covered with a soft, fluffy fabric to emulate animal fur and to reduce perceived mechanical harshness. The electrical architecture consists of three identical direct‑drive motors, motor drivers, a Raspberry Pi Zero 2 W controller, and a 20 Ah lithium‑ion battery. A PID controller (Kp = 5.0, Ki = 10.0, Kd = 0) regulates the actuator position. An infrared reflective distance sensor is mounted on the top surface for future contact‑based interactions, and wheel torque sensors enable simple collision detection. Noise measurements in a typical indoor environment show operation below 50 dB, comparable to a quiet conversation.
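The position control described above can be sketched as a standard discrete PID loop. Only the gains (Kp = 5.0, Ki = 10.0, Kd = 0) come from the paper; the sample time and output clamp below are illustrative assumptions, not reported values:

```python
from dataclasses import dataclass

@dataclass
class PID:
    """Discrete PID position controller. Gains follow the paper's
    reported values; dt and out_limit are assumptions for the sketch."""
    kp: float = 5.0
    ki: float = 10.0
    kd: float = 0.0
    dt: float = 0.01          # assumed 100 Hz control loop
    out_limit: float = 1.0    # assumed normalized motor command
    _integral: float = 0.0
    _prev_err: float = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        err = setpoint - measured
        self._integral += err * self.dt
        deriv = (err - self._prev_err) / self.dt
        self._prev_err = err
        out = self.kp * err + self.ki * self._integral + self.kd * deriv
        # Clamp to the actuator's assumed command range.
        return max(-self.out_limit, min(self.out_limit, out))
```

With Kd = 0 this reduces to a PI controller, which is a common choice for a lead-screw position loop where derivative action would amplify encoder noise.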

A mathematical model of the Jitterbug mechanism is derived, linking the vertical displacement Z to the rotation angle Θ of the top face. The model incorporates geometric parameters (R_A ≈ 56.6 mm, R_B ≈ 46.2 mm, dihedral angle ≈ 54.74°, clearance constant = 13 mm) and a clearance term C to account for manufacturing gaps. Because the closed‑form inverse is analytically intractable, the authors pre‑computed a lookup table of Θ versus Z at 45 evenly spaced points, enabling real‑time control by selecting the nearest Θ for a given target Z. Experimental validation using a webcam with ArUco markers and a precision jig confirmed that the predicted Z‑Θ relationship matches measured values within acceptable error margins.
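The lookup-table inversion the authors describe can be sketched as follows. The summary does not give the closed-form Z(Θ) expression, so `z_of_theta` below is a stand-in monotone function, not the real Jitterbug kinematics; only the table size (45 samples) and the nearest-neighbour selection scheme follow the paper:

```python
import numpy as np

# Geometric constants reported in the summary (the forward model
# using them here is a placeholder, not the paper's derivation).
R_A = 56.6   # mm
R_B = 46.2   # mm
N_SAMPLES = 45

def z_of_theta(theta_rad: float) -> float:
    """Stand-in forward model: a smooth, monotone Z(Theta).
    The real relation comes from the Jitterbug geometry."""
    return R_A * (1.0 - np.cos(theta_rad)) + R_B * np.sin(theta_rad)

# Precompute the table once, offline, over an assumed Theta range.
thetas = np.linspace(0.0, np.pi / 3, N_SAMPLES)
zs = np.array([z_of_theta(t) for t in thetas])

def theta_for_target_z(z_target: float) -> float:
    """Real-time inverse: pick the Theta whose precomputed Z is nearest."""
    return thetas[np.argmin(np.abs(zs - z_target))]
```

Because the table is small and Z(Θ) is monotone, the nearest-neighbour lookup is O(n) per query and trivially fast enough for a real-time loop on a Raspberry Pi Zero 2 W.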

To assess the perceptual impact of volume change, the authors conducted an online user study with videos of MOFU performing four motion categories: (1) stationary with expansion–contraction, (2) stationary with rotation, (3) dual‑robot presentations (four combinations of the above), and (4) locomotion combined with expansion–contraction. Participants rated each video using the Animacy subscale of the Godspeed Questionnaire Series. Statistical analysis (repeated‑measures ANOVA) revealed:

  • Expansion–contraction alone significantly increased animacy scores compared with a static, non‑changing robot.
  • Rotational motion also independently boosted animacy, and the two effects were additive but not synergistic; the combined condition did not exceed the sum of the individual effects.
  • Presenting two robots simultaneously did not produce higher animacy than a single robot, suggesting that volume change is the dominant cue and that simply adding more agents does not amplify the effect.
  • When expansion–contraction was paired with locomotion, animacy ratings were higher than for locomotion alone, indicating that volumetric dynamics enhance perceived lifelikeness even during movement.
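The repeated-measures ANOVA used above can be sketched for the simplest one-way case (each participant rates every condition). The data below are synthetic and the function is a textbook one-way implementation, not the authors' analysis script, which may have used a multi-factor design:

```python
import numpy as np

def rm_anova_oneway(data: np.ndarray) -> tuple[float, int, int]:
    """One-way repeated-measures ANOVA.
    data: shape (n_subjects, n_conditions), e.g. animacy ratings.
    Returns (F, df_conditions, df_error)."""
    n, k = data.shape
    grand = data.mean()
    # Partition total variance into condition, subject, and error terms.
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()
    ss_total = ((data - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    f = (ss_cond / df_cond) / (ss_err / df_err)
    return f, df_cond, df_err

# Synthetic example: 3 participants x 2 conditions (e.g. static vs.
# expansion-contraction); values are illustrative, not the paper's data.
ratings = np.array([[1.0, 2.0],
                    [2.0, 4.0],
                    [3.0, 3.0]])
f_stat, df1, df2 = rm_anova_oneway(ratings)
print(f"F({df1}, {df2}) = {f_stat:.2f}")
```

Removing the between-subject variance (`ss_subj`) from the error term is what distinguishes the repeated-measures test from an independent-groups ANOVA, and is why it is the appropriate choice when the same participants rate every video.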

The study’s contributions are threefold: (1) a novel hardware platform that achieves whole‑body volumetric deformation with a single motor, (2) empirical evidence that such deformation positively influences human perception of animacy, and (3) insights into the limits of multi‑robot presentations for this specific cue.

Limitations include the reliance on video‑based evaluation rather than embodied interaction, the lack of integration of other affective channels (color, sound, haptic feedback), and the fact that the contact sensor was not utilized in the reported experiments. Future work should explore multimodal expressive behaviors, real‑world therapeutic trials, and adaptive control strategies that modulate volume change in response to user affect.

Overall, the paper demonstrates that whole‑body volume modulation is a powerful, yet underexplored, design dimension for social and therapeutic robots. By providing a concrete implementation and systematic user evaluation, it opens new avenues for creating robots that are perceived as more “alive,” potentially improving engagement, trust, and therapeutic outcomes in human‑robot interaction contexts.

