Controlled Flight of an Insect-Scale Flapping-Wing Robot via Integrated Onboard Sensing and Computation


Aerial insects can effortlessly navigate dense vegetation, whereas similarly sized aerial robots typically depend on offboard sensors and computation to maintain stable flight. This disparity restricts insect-scale robots to operation within motion capture environments, substantially limiting their applicability to tasks such as search-and-rescue and precision agriculture. In this work, we present a 1.29-gram aerial robot capable of hovering and tracking trajectories with solely onboard sensing and computation. The combination of a sensor suite, estimators, and a low-level controller achieved centimeter-scale positional flight accuracy. Additionally, we developed a hierarchical controller in which a human operator provides high-level commands to direct the robot’s motion. In a 30-second flight experiment conducted outside a motion capture system, the robot avoided obstacles and ultimately landed on a sunflower. This level of sensing and computational autonomy represents a significant advancement for the aerial microrobotics community, further opening opportunities to explore onboard planning and power autonomy.


💡 Research Summary

The paper presents a breakthrough in insect‑scale aerial robotics by demonstrating a 1.29‑gram flapping‑wing robot that can hover, track trajectories, avoid obstacles, and land on a target using only onboard sensors and computation. The authors first identify the fundamental gap between biological insects, which rely on a compact, hierarchical sensory system, and existing micro‑air vehicles that depend on off‑board motion‑capture and processing. To close this gap, they adopt a holistic design methodology that simultaneously satisfies four high‑level criteria: (1) precise maneuverability after sensor integration, (2) sensing and computation power consumption below 10 % of total flight power, (3) onboard altitude and attitude accuracy sufficient for higher‑level tasks, and (4) extensibility for future sensors.

Hardware design builds on a prior four‑wing platform, scaling the dielectric elastomer actuators by 1.5× to increase lift per wing from 410 mg to 600 mg and achieve a lift‑to‑weight ratio of 1.9 while carrying a 244‑mg payload. The flight package integrates four ultra‑light components on custom flexible PCBs: a 14‑mg IMU (LSM6DSV80X), a 13‑mg time‑of‑flight (TOF) distance sensor (TMF8806), a 90‑mg optical‑flow camera (PAA3905E1‑Q), and a 7‑mg dual‑core microcontroller (CY8C6245FNI‑S3D41T). The total sensing‑and‑computation payload is 244 mg, roughly 19 % of the 1.29‑g robot. The IMU is mounted near the geometric center on a 0.4‑mm fiberglass substrate to mitigate vibration and thermal drift; the TOF and optical‑flow sensors face downward to measure altitude and lateral motion relative to the ground. At maximum sensing rates the electronics draw only 120 mW, about 6 % of the 2.04 W consumed by the flapping actuators, leaving ample headroom for future power‑autonomous solutions such as solar or laser charging.
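The quoted budget figures can be sanity-checked with back-of-the-envelope arithmetic. The sketch below uses only the numbers stated in the summary (four wings at 600 mg of lift each, a 1.29-g robot, 120 mW of sensing power against 2.04 W of actuation power); it is illustrative bookkeeping, not the paper's analysis.

```python
# Back-of-the-envelope check of the mass, lift, and power budgets quoted above.
# All input values are taken from the summary text.

robot_mass_mg = 1290          # 1.29 g total robot mass (payload included)
lift_per_wing_mg = 600        # lift per wing after 1.5x actuator scaling
num_wings = 4

sensing_power_mw = 120        # electronics at maximum sensing rates
actuation_power_mw = 2040     # 2.04 W drawn by the flapping actuators

total_lift_mg = lift_per_wing_mg * num_wings            # 2400 mg of lift
lift_to_weight = total_lift_mg / robot_mass_mg          # ~1.9, as quoted
power_fraction = sensing_power_mw / actuation_power_mw  # ~6 %, as quoted

print(f"lift-to-weight ratio:    {lift_to_weight:.2f}")
print(f"sensing power fraction:  {power_fraction:.1%}")
```

Both derived numbers land on the values reported in the text, which is a useful consistency check when summarizing hardware specifications.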

The estimation pipeline is deliberately split into three cascaded filters to respect the differing sensor update rates and to keep computational load low. First, a Mahony nonlinear complementary filter runs at 480 Hz on IMU data, providing a quaternion attitude estimate and bias‑corrected angular velocity from the gyroscope. Second, a Kalman filter fuses the attitude estimate with TOF range and the thrust command to produce altitude and vertical‑velocity estimates. Third, a separate Kalman filter combines attitude, altitude, and optical‑flow pixel velocities to estimate horizontal velocities, which are integrated to obtain horizontal position. Because each filter manipulates matrices no larger than 2×2, a single MCU core can complete the full estimation cycle in under 6 ms, allowing a control loop of over 100 Hz.
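To make the second stage concrete, here is a minimal sketch of a 2-state altitude Kalman filter of the kind described above: altitude and vertical velocity are propagated with a thrust-derived acceleration and corrected by TOF range measurements. The noise covariances, gains, and the direct thrust-to-acceleration input are illustrative assumptions, not the paper's parameters; the point is that every matrix involved is at most 2×2.

```python
import numpy as np

# 2-state model: x = [altitude z; vertical velocity vz].
# Predicted with a commanded vertical acceleration, corrected by TOF range.
dt = 0.01                      # 100 Hz propagation step (assumed)
F = np.array([[1.0, dt],       # state transition: z += vz * dt
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],   # control input: commanded acceleration
              [dt]])
H = np.array([[1.0, 0.0]])     # the TOF sensor measures altitude directly

Q = np.diag([1e-6, 1e-4])      # process noise covariance (assumed)
R = np.array([[4e-4]])         # TOF measurement noise covariance (assumed)

x = np.zeros((2, 1))           # state estimate [z; vz]
P = np.eye(2) * 1e-2           # state covariance

def predict(accel_cmd):
    """Propagate the state with the commanded vertical acceleration."""
    global x, P
    x = F @ x + B * accel_cmd
    P = F @ P @ F.T + Q

def update(tof_range):
    """Fuse one TOF range measurement; only 2x2 algebra is required."""
    global x, P
    y = tof_range - H @ x                   # innovation
    S = H @ P @ H.T + R                     # innovation covariance (1x1)
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain (2x1)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Toy example: a stationary hover at 5 cm with a clean sensor.
for _ in range(200):
    predict(0.0)
    update(0.05)
z, vz = x.ravel()
print(f"estimated altitude {z:.3f} m, vertical velocity {vz:.4f} m/s")
```

The horizontal-velocity filter in the third stage follows the same predict/update pattern, with optical-flow pixel velocities replacing the range measurement.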

Control is organized hierarchically. The low‑level controller runs at 100 Hz, using PID laws on the estimated states to generate thrust and wing‑phase commands that maintain stable hover and follow setpoints. A high‑level interface runs at ~1 Hz, accepting operator‑issued waypoints; a simple trajectory generator interpolates these commands for the low‑level loop. This separation enables a human operator to direct the robot in real time while the onboard controller handles fast dynamics.
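The two-rate structure above can be sketched as a toy loop: an operator waypoint arrives at the slow rate, a trajectory generator linearly interpolates it into 100 Hz setpoints, and a PID law tracks each setpoint. The gains, the linear interpolation, and the 1-D point-mass "plant" are illustrative assumptions; the paper's actual control laws and dynamics are more involved.

```python
# Two-rate hierarchy sketch: ~1 Hz operator waypoints -> interpolated
# 100 Hz setpoints -> PID tracking. Gains and dynamics are assumptions.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def interpolate(start, goal, n_steps):
    """Trajectory generator: linear ramp from start to goal over n_steps ticks."""
    return [start + (goal - start) * (i + 1) / n_steps for i in range(n_steps)]

dt = 0.01                                   # 100 Hz low-level loop
pid = PID(kp=8.0, ki=2.0, kd=4.0, dt=dt)    # illustrative gains

z, vz = 0.0, 0.0                            # toy 1-D vertical plant state
waypoint = 0.05                             # operator command: climb to 5 cm

# One second of ramp toward the waypoint, then two seconds holding it.
setpoints = interpolate(z, waypoint, n_steps=100) + [waypoint] * 200
for sp in setpoints:
    accel = pid.step(sp, z)                 # low-level thrust command
    vz += accel * dt                        # integrate toy double-integrator
    z += vz * dt

print(f"altitude after 3 s: {z:.3f} m")
```

Because the slow interface only supplies setpoints, the operator never has to reason about the fast flapping dynamics, which the onboard loop absorbs.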

Experimental validation consists of two phases. In the first, onboard sensing is used but state estimation and control are performed offboard on a desktop computer; motion‑capture (Vicon) provides ground truth. A 12‑second hover at 5 cm altitude yields RMS lateral position error of 3.24 cm and altitude error of 0.25 cm, with the estimator alone contributing 1.8 cm lateral and 0.1 cm altitude errors. In the second phase, the entire pipeline runs on the onboard MCU. A 30‑second flight outside the motion‑capture arena demonstrates autonomous obstacle avoidance and precise landing on a sunflower, with total drift remaining under 2 cm. RMS positional error across the flight is 3.96 cm, and attitude error stays within 1.8°.

The authors conclude that their integrated sensor‑estimator‑controller architecture achieves centimeter‑scale positioning and degree‑level attitude accuracy while consuming only a fraction of the robot's power budget. This is the first insect‑scale flapping‑wing platform to fly with entirely onboard sensing and computation, requiring no external infrastructure, and it opens the door to future work on onboard power generation, visual navigation, and mission‑level autonomy for applications such as pollination, search‑and‑rescue, and precision agriculture.

