UAV-Supported Maritime Search System: Experience from Valun Bay Field Trials


This paper presents the integration of flow field reconstruction, dynamic probabilistic modeling, search control, and machine vision detection in a system for autonomous maritime search operations. Field experiments conducted in Valun Bay (Cres Island, Croatia) combined real-time drifter data acquisition, surrogate flow-model fitting based on computational fluid dynamics and numerical optimization, multi-UAV search control with onboard vision sensing, and deep-learning-based object detection. The results demonstrate that a tightly coupled approach enables reliable detection of floating targets under realistic uncertainties and complex environmental conditions, providing concrete insights for future autonomous maritime search and rescue applications.


💡 Research Summary

This paper presents an integrated UAV‑supported maritime search system that combines real‑time surface flow reconstruction, dynamic probabilistic search planning, and deep‑learning‑based visual detection, and validates the approach through field trials in Valun Bay, Croatia. The authors first address the challenge of reconstructing the near‑surface ocean velocity field using a surrogate model that fuses two two‑dimensional CFD simulations: a bounded flow model that captures coastal boundaries and an open flow model that represents the larger offshore environment. Boundary conditions (inlet/outlet pressure and tangential velocity) are treated as optimization variables and are tuned with Particle Swarm Optimization (PSO) to minimize the root‑mean‑square error between simulated velocities and measurements from a network of drifting buoys (drifters). Updates are performed every ten minutes, assuming quasi‑steady flow, which balances computational load with the slowly varying nature of most coastal currents.
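The fitting loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the CFD surrogate is replaced by a toy parameterized flow (`surrogate_velocity`), and the PSO hyperparameters (inertia 0.7, acceleration 1.5) are common textbook defaults rather than values from the paper.

```python
import numpy as np

def surrogate_velocity(params, points):
    """Toy stand-in for the CFD surrogate: a uniform flow plus a linear
    shear, controlled by three "boundary condition" parameters. The paper
    instead tunes inlet/outlet pressure and tangential velocity of two
    2-D CFD models; only the interface is mimicked here."""
    u0, v0, shear = params
    u = u0 + shear * points[:, 1]          # x-velocity varies with y
    v = np.full(len(points), v0)
    return np.column_stack([u, v])

def rmse(params, points, measured):
    """Root-mean-square error between surrogate and drifter velocities."""
    diff = surrogate_velocity(params, points) - measured
    return np.sqrt(np.mean(np.sum(diff**2, axis=1)))

def pso_fit(points, measured, n_particles=30, n_iter=100, seed=0):
    """Minimal global-best PSO over the boundary-condition parameters."""
    rng = np.random.default_rng(seed)
    dim = 3
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest = x.copy()
    pcost = np.array([rmse(p, points, measured) for p in x])
    gbest = pbest[np.argmin(pcost)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        cost = np.array([rmse(p, points, measured) for p in x])
        better = cost < pcost
        pbest[better], pcost[better] = x[better], cost[better]
        gbest = pbest[np.argmin(pcost)].copy()
    return gbest, rmse(gbest, points, measured)

# Synthetic "drifter" velocity measurements from a known flow, then recover it.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, (6, 2))                    # six drifter positions
measured = surrogate_velocity([0.3, -0.1, 0.002], pts)
best, err = pso_fit(pts, measured)
```

In the field system this optimization would rerun every ten-minute update window, warm-started from the previous boundary conditions.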

To assess the fidelity of the reconstructed flow, the system integrates a drift‑error estimator. After each update interval, the reconstructed velocity field is used to integrate virtual drifter trajectories, which are then compared to the actual drifter positions. The average positional error S(t) provides a feedback signal that can trigger re‑optimization of the boundary conditions, ensuring that the surrogate flow remains synchronized with the environment.
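The estimator reduces to forward-integrating virtual drifters through the reconstructed field and comparing end positions to the real drifters. The sketch below assumes forward-Euler integration and a 5 m re-optimization threshold; both are illustrative choices, not values stated in the paper.

```python
import numpy as np

def advect(positions, flow, dt, n_steps):
    """Integrate virtual drifter trajectories through a (quasi-steady)
    flow field with forward-Euler steps. flow(p) returns velocities in m/s."""
    p = positions.copy()
    for _ in range(n_steps):
        p = p + dt * flow(p)
    return p

def drift_error(virtual_end, actual_end):
    """S(t): mean Euclidean distance between virtual and real drifters."""
    return float(np.mean(np.linalg.norm(virtual_end - actual_end, axis=1)))

# Illustrative check: a reconstructed flow that is 0.05 m/s off in x.
true_flow = lambda p: np.tile([0.25, 0.0], (len(p), 1))
model_flow = lambda p: np.tile([0.20, 0.0], (len(p), 1))

start = np.zeros((6, 2))            # six drifters, common release point
dt, n = 1.0, 600                    # one ten-minute update interval
actual = advect(start, true_flow, dt, n)
virtual = advect(start, model_flow, dt, n)
S = drift_error(virtual, actual)    # 0.05 m/s error over 600 s -> 30 m

REOPT_THRESHOLD_M = 5.0             # assumed threshold, for illustration only
needs_reoptimization = S > REOPT_THRESHOLD_M
```

When `S` exceeds the threshold, the PSO boundary-condition fit is triggered again, keeping the surrogate flow synchronized with the environment.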

Target detection is handled by a YOLOv8 object detector that has been fine‑tuned on a custom aerial maritime dataset consisting of 522 high‑resolution images captured at 60–100 m altitude. The dataset includes three classes (sea targets, drifters, boats) and is split 80/10/10 for training, validation, and testing. After 100 epochs of training, the model achieves a mean average precision (mAP) of 0.723 at IoU = 0.5, a precision of 0.861, and a recall of 0.68. The recall directly informs the sensing function Γ in the probabilistic search model, linking detection performance to the evolution of the belief distribution.
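One plausible way to turn the detector's per-pass recall into the instantaneous rate Γ used by the PDE (the paper does not spell out this mapping, so the exponential-decay assumption and the 2 s dwell time below are ours) is to require that the surviving undetected mass after one camera pass equals 1 − recall:

```python
import math

def detection_rate(recall, dwell_time_s):
    """Map per-pass detection probability (the detector's recall) to an
    instantaneous rate Gamma, assuming exponential decay of undetected
    probability mass: exp(-Gamma * tau) = 1 - recall."""
    return -math.log(1.0 - recall) / dwell_time_s

gamma = detection_rate(0.68, 2.0)    # recall from the paper; 2 s dwell assumed
survival = math.exp(-gamma * 2.0)    # undetected fraction after one pass
```

With recall 0.68, exactly 32% of the probability mass in the footprint survives one pass, so areas must be revisited for high cumulative detection probability.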

The search component models the undetected target probability density m(x,t) with an advection‑diffusion‑detection partial differential equation: ∂m/∂t = D∇²m − w·∇m − Γ·m, where w is the reconstructed flow field, D is a diffusion coefficient representing uncertainty, and Γ is the instantaneous detection probability derived from the vision system. The authors employ the Heat‑Equation‑Driven Area Coverage (HEDAC) ergodic control algorithm to drive multiple UAVs. Each UAV’s camera field of view (Ω_FOV) defines the region where m(x,t) is multiplicatively reduced at each sensing step, effectively “painting” probability mass away from inspected areas. The control law is distributed, allowing each UAV to compute its trajectory locally while still achieving coordinated coverage, which is crucial given limited communication ranges.
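The belief evolution (though not the HEDAC controller itself) can be illustrated with a small explicit finite-difference step. This is a sketch under assumed conditions: a periodic grid, central differences for diffusion, first-order upwind advection, and made-up grid size, coefficients, and FOV placement.

```python
import numpy as np

def step_density(m, w_x, w_y, gamma_mask, D, dx, dt):
    """One explicit step of dm/dt = D lap(m) - w.grad(m) - Gamma*m on a
    periodic grid (central diffusion, first-order upwind advection)."""
    lap = (np.roll(m, 1, 0) + np.roll(m, -1, 0) +
           np.roll(m, 1, 1) + np.roll(m, -1, 1) - 4 * m) / dx**2
    # Upwind gradients: backward difference where flow is positive.
    gx = np.where(w_x > 0, m - np.roll(m, 1, 0), np.roll(m, -1, 0) - m) / dx
    gy = np.where(w_y > 0, m - np.roll(m, 1, 1), np.roll(m, -1, 1) - m) / dx
    return m + dt * (D * lap - w_x * gx - w_y * gy - gamma_mask * m)

n, dx, dt, D = 32, 1.0, 0.1, 0.5
m = np.ones((n, n)) / n**2              # uniform initial belief, total mass 1
w_x = np.full((n, n), 0.3)              # reconstructed flow (constant here)
w_y = np.zeros((n, n))
gamma = np.zeros((n, n))
gamma[10:14, 10:14] = 2.0               # Omega_FOV of one UAV, fixed for demo

for _ in range(50):
    m = step_density(m, w_x, w_y, gamma, D, dx, dt)
```

Only the Γ term removes mass: diffusion and advection redistribute it, while cells inside the camera footprint are progressively "painted" down, which is exactly the signal the ergodic controller steers the UAVs toward.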

Field experiments deployed three UAVs equipped with RGB cameras and six drifters in Valun Bay. The surrogate flow model achieved an average velocity error below 0.12 m s⁻¹, and the drift‑error estimator reported mean positional errors under 5 m, confirming the model’s real‑time accuracy. Over a 30‑minute mission, the system detected a floating target (a synthetic buoy) with a 92 % success rate, effectively halving the search time compared with traditional manual patrols. The integrated approach demonstrated robustness to complex environmental conditions, including variable currents and wind, by continuously updating the flow model and adapting UAV trajectories accordingly.

The authors discuss limitations such as the two‑dimensional nature of the flow model (which cannot capture vertical shear or wave‑induced motions), potential sampling bias due to drifter placement, and UAV endurance constraints. Future work is suggested on extending the surrogate model to three dimensions, employing adaptive drifter deployment strategies, and integrating solar‑powered UAV platforms for longer missions. In conclusion, the tightly coupled system of flow reconstruction, probabilistic ergodic search, and deep‑learning detection proves effective for autonomous maritime SAR operations, offering a scalable blueprint for next‑generation search‑and‑rescue deployments.

