Event Spectroscopy: Event-based Multispectral and Depth Sensing using Structured Light
Uncrewed aerial vehicles (UAVs) are increasingly deployed in forest environments for tasks such as environmental monitoring and search and rescue, which require safe navigation through dense foliage and precise data collection. Traditional sensing approaches, including passive multispectral and RGB imaging, suffer from latency, poor depth resolution, and strong dependence on ambient light - especially under forest canopies. In this work, we present a novel event spectroscopy system that simultaneously enables high-resolution, low-latency depth reconstruction with integrated multispectral imaging using a single sensor. Depth is reconstructed using structured light, and by modulating the wavelength of the projected structured light, our system captures spectral information in controlled bands between 650 nm and 850 nm. We demonstrate up to 60% improvement in RMSE over commercial depth sensors and validate the spectral accuracy against a reference spectrometer and commercial multispectral cameras, demonstrating comparable performance. A portable version limited to RGB (3 wavelengths) is used to collect real-world depth and spectral data from the Masoala Rainforest. We demonstrate the use of this prototype for color image reconstruction and material differentiation between leaves and branches using spectral and depth data. Our results show that adding depth (available at no extra effort with our setup) to material differentiation improves accuracy by over 30% compared to a color-only method. Our system, tested in both lab and real-world rainforest environments, shows strong performance in depth estimation, RGB reconstruction, and material differentiation - paving the way for lightweight, integrated, and robust UAV perception and data collection in complex natural environments.
💡 Research Summary
The paper introduces “Event Spectroscopy,” a novel perception system that simultaneously delivers high‑resolution, low‑latency depth maps and multispectral reflectance images using a single event‑camera–projector pair. The motivation stems from the growing use of UAVs in forested environments for environmental monitoring and search‑and‑rescue, where traditional passive RGB or multispectral cameras suffer from high latency, poor depth accuracy, and strong dependence on ambient illumination—especially under dense canopy.
System Architecture
An event camera (Prophesee Gen3, 640 × 480) records asynchronous brightness‑change events. A structured‑light projector emits a point‑scanning laser pattern that sequentially illuminates each point in the scene. Each laser dot induces a sharp brightness change, so an event is generated at the corresponding camera pixel with a precise timestamp. By matching the known projector coordinates to the event timestamps, standard stereo geometry yields a depth estimate for every illuminated point. As shown in prior work, this depth reconstruction is largely independent of surface reflectivity, enabling reliable depth even on dark foliage.
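The timestamp‑to‑depth correspondence described above can be sketched in a few lines, assuming a rectified camera–projector pair, a raster scan with a fixed per‑point dwell time, and known calibration. All parameter names (`dwell_us`, `baseline_m`, etc.) are illustrative, not from the paper.

```python
import numpy as np

def projector_column(event_ts_us, scan_start_us, dwell_us, proj_width):
    """Map an event timestamp to the projector column being illuminated.

    Assumes a row-major raster scan that dwells `dwell_us` on each point,
    wrapping to the next row every `proj_width` points (illustrative model).
    """
    idx = int((event_ts_us - scan_start_us) // dwell_us)
    return idx % proj_width

def depth_m(x_cam_px, x_proj_px, focal_px, baseline_m):
    """Rectified stereo triangulation: Z = f * B / disparity."""
    disparity = x_cam_px - x_proj_px
    return focal_px * baseline_m / disparity

# Example: 500 px focal length, 12 cm baseline, 30 px disparity -> about 2.0 m
z = depth_m(130.0, 100.0, 500.0, 0.12)
```

Because each event carries its own timestamp, every illuminated point gets an independent depth sample without any frame-level exposure or search over candidate matches.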
Multispectral Acquisition
Instead of filtering the incoming light, the system actively changes the wavelength of the projected light (650 nm – 850 nm). For each wavelength band, the camera's contrast threshold (or source‑follower gain) is gradually increased. The event camera's source‑follower circuit exhibits a bandwidth that grows with incident light intensity; darker or less reflective regions respond more slowly and may fail to generate an event when the threshold is high. By recording, for each pixel, the highest threshold at which an event still fires, the method derives a lower‑bound estimate of the pixel's relative reflectivity at that wavelength. Repeating this sweep across multiple wavelengths produces a co‑registered multispectral reflectance image aligned with the depth map, without any additional optics or filters.
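One plausible reading of this threshold sweep, sketched below under stated assumptions: `event_maps[i]` is a boolean array marking which pixels fired at contrast threshold `thresholds[i]`, and the highest threshold at which a pixel still fires is taken as its reflectivity proxy. The function name and data layout are hypothetical, not from the paper.

```python
import numpy as np

def reflectivity_proxy(event_maps, thresholds):
    """Per-pixel highest contrast threshold at which an event still fired.

    event_maps: list of HxW boolean arrays, one per threshold setting.
    thresholds: matching list of threshold values.
    Returns an HxW float array (NaN where no event fired at any setting).
    """
    h, w = event_maps[0].shape
    proxy = np.full((h, w), np.nan)
    for i in np.argsort(thresholds):          # sweep low -> high
        proxy[event_maps[i]] = thresholds[i]  # higher thresholds overwrite
    return proxy
```

Running this once per projected wavelength yields one reflectance channel per band, all in the camera's pixel grid and therefore pixel-aligned with the depth map.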
Experimental Validation – Lab
The authors built a bench‑top prototype and evaluated depth accuracy against commercial RGB‑D sensors. Across a variety of textures and lighting conditions, the event‑based depth achieved up to 60 % reduction in root‑mean‑square error (RMSE). Spectral accuracy was assessed by comparing the per‑pixel reflectance curves to measurements from a calibrated spectrometer and a commercial multispectral camera. The average spectral error was 2–3 % across the eight tested wavelengths, matching or slightly surpassing the commercial device.
Field Demonstration – Rainforest
A lightweight “RGB‑only” version of the system (three projector wavelengths) was mounted on a UAV and flown in the Masoala Rainforest. The platform collected synchronized depth and RGB data while navigating beneath the canopy. Using the depth map together with the RGB image, the authors performed material segmentation (leaf vs. branch). Incorporating depth raised the segmentation accuracy by more than 30 % compared to using color alone, illustrating the complementary nature of geometry and spectral cues in cluttered natural scenes.
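As a toy illustration of why appending depth to the spectral features can help, the sketch below classifies synthetic "leaf" and "branch" pixels with a nearest‑centroid rule over a four‑dimensional feature (three reflectance channels plus depth). This is not the paper's classifier, and the feature statistics are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic training pixels: [R, G, B, depth_m] per row (values are invented).
leaf   = rng.normal([0.2, 0.6, 0.3, 1.5], 0.1, size=(200, 4))  # greener, closer
branch = rng.normal([0.4, 0.3, 0.2, 2.5], 0.1, size=(200, 4))  # browner, deeper

def nearest_centroid(class_a, class_b, x):
    """Assign feature vector x to the class with the closer mean (Euclidean)."""
    ca, cb = class_a.mean(axis=0), class_b.mean(axis=0)
    return "leaf" if np.linalg.norm(x - ca) < np.linalg.norm(x - cb) else "branch"

label = nearest_centroid(leaf, branch, np.array([0.2, 0.6, 0.3, 1.5]))
```

When the color distributions of the two materials overlap (e.g. shadowed leaves vs. lit bark), the depth dimension supplies an extra axis of separation, which is consistent with the reported accuracy gain.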
Key Contributions
- Unified sensing – First demonstration of a single event‑camera setup that provides both depth and active multispectral imaging.
- Exploitation of sensor non‑idealities – A novel method that leverages the illumination‑dependent bandwidth of the event camera’s source‑follower to infer relative reflectivity without external filters.
- Quantitative superiority – Depth RMSE improvement of up to 60 % over off‑the‑shelf depth sensors; spectral error within 2–3 % of ground‑truth spectrometer measurements.
- Real‑world validation – Successful UAV flights in a dense tropical forest, showing that depth‑enhanced material classification outperforms color‑only approaches by >30 %.
Future Directions
The authors suggest extending the wavelength range to cover the full visible spectrum (400 – 700 nm), integrating event‑based deep‑learning pipelines for real‑time vegetation health assessment, and optimizing projector power consumption for longer UAV missions. Multi‑UAV collaborative mapping and tighter sensor‑fusion with inertial navigation are also identified as promising avenues.
In summary, Event Spectroscopy offers a compact, low‑latency, and robust perception solution for UAVs operating in challenging natural environments, delivering depth and multispectral information from a single lightweight sensor pair and opening new possibilities for autonomous forest monitoring and rescue operations.