Image Processing in Floriculture Using a Robotic Mobile Platform
Colombia's privileged geographical location makes it a cornerstone, roughly equidistant from all regional markets. The country has great ecological diversity and is one of the largest suppliers of flowers to the US. Colombian flower companies have innovated in the marketing process, using methods that meet the expectations of final consumers. This article develops a monitoring system for the floriculture industry, implemented on a robotic platform that can be programmed in several languages. The robot gathers the necessary environmental information through its camera. The monitoring algorithm was developed with the Image Processing Toolbox in MATLAB. The implemented algorithm acquires images through the camera and preprocesses each one: filtering noise, enhancing color, and resizing to increase processing speed. The image is then segmented by color, and morphological operations (erosion and dilation) applied to the binarized image extract relevant features such as centroid, perimeter, and area. The data obtained from image processing allow the robot to automatically identify targets, orient itself, and move toward them. The results also produce a quality diagnosis for each scanned object.
💡 Research Summary
The paper presents the design, implementation, and evaluation of an automated monitoring system for the Colombian floriculture industry, integrated onto a mobile robotic platform. Recognizing Colombia’s strategic geographic position and its status as a major flower exporter, the authors aim to address the inefficiencies in manual quality inspection and inventory management that persist in many flower production facilities.
The hardware architecture consists of a compact mobile base equipped with differential drive motors, a high‑resolution RGB camera, and a modular power and communication subsystem. The robot is built to be language‑agnostic, offering interfaces compatible with Python, C++, and MATLAB through ROS (Robot Operating System) bridges, thereby facilitating rapid prototyping and future upgrades.
The software pipeline is developed entirely in MATLAB using the Image Processing Toolbox. The processing flow can be summarized as follows: (1) real‑time image acquisition from the on‑board camera; (2) preprocessing that applies Gaussian smoothing to suppress sensor noise, histogram equalization to improve contrast, and optional down‑sampling to reduce computational load; (3) conversion from RGB to HSV color space, followed by thresholding based on pre‑defined hue ranges corresponding to typical flower colors (e.g., red, yellow, white); (4) binarization and morphological cleaning using sequential erosion and dilation to eliminate spurious pixels and fill small holes; (5) connected‑component labeling and extraction of geometric features such as centroid, area, perimeter, and mean color values; (6) a rule‑based quality diagnostic module that classifies each detected object according to industry‑specific criteria (color deviation, size limits, shape irregularities); and (7) transformation of the centroid coordinates into the robot’s reference frame, after which a path‑planning algorithm (A* or Dijkstra) generates a trajectory for the robot to approach the target. The robot then executes the required rotation and translation commands, effectively navigating to the identified flower bunch.
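The paper implements this pipeline in MATLAB's Image Processing Toolbox; as an illustration only, steps (3)–(5) can be sketched in Python with NumPy. The function names (`hue_mask`, `erode`, `dilate`, `features`), the 3×3 structuring element, and the single hue interval are assumptions for this sketch, not details taken from the paper:

```python
import colorsys

import numpy as np


def hue_mask(img, h_lo, h_hi):
    """True where a pixel's hue (as a fraction of a turn) lies in [h_lo, h_hi]."""
    hue = np.apply_along_axis(lambda p: colorsys.rgb_to_hsv(*p)[0], 2, img)
    return (hue >= h_lo) & (hue <= h_hi)


def erode(mask):
    """Binary erosion with a 3x3 structuring element and zero-padded border."""
    p = np.pad(mask, 1)
    h, w = mask.shape
    return np.all([p[i:i + h, j:j + w] for i in range(3) for j in range(3)], axis=0)


def dilate(mask):
    """Binary dilation with a 3x3 structuring element and zero-padded border."""
    p = np.pad(mask, 1)
    h, w = mask.shape
    return np.any([p[i:i + h, j:j + w] for i in range(3) for j in range(3)], axis=0)


def features(mask):
    """Area, crude perimeter (boundary-pixel count), and centroid (x, y)."""
    ys, xs = np.nonzero(mask)
    boundary = mask & ~erode(mask)
    return int(mask.sum()), int(boundary.sum()), (xs.mean(), ys.mean())


# Synthetic example: a red "flower" patch on a green background
img = np.zeros((20, 20, 3))
img[..., 1] = 1.0                   # green background (hue ~ 1/3)
img[5:15, 5:15] = [1.0, 0.0, 0.0]   # red patch (hue ~ 0)
clean = dilate(erode(hue_mask(img, 0.0, 0.05)))  # opening removes speckle
area, perim, centroid = features(clean)
```

In MATLAB the same stages would typically use `rgb2hsv`, `imerode`/`imdilate`, and `regionprops`; the sketch above only makes the data flow of the pipeline concrete.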
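Step (7), the transformation of detected centroids into the robot's reference frame, depends on camera calibration details the summary does not give. A minimal 2D sketch, assuming the target's ground-plane position is already known in the camera frame and the camera pose (rotation `theta`, translation `t`) in the robot frame is calibrated, might look like this (`camera_to_robot` and `heading_and_range` are hypothetical names):

```python
import math


def camera_to_robot(p_cam, theta, t):
    """Rigid 2D transform: rotate a camera-frame point by theta, then translate by t."""
    x, y = p_cam
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y + t[0], s * x + c * y + t[1])


def heading_and_range(p_robot):
    """Bearing (rad) and distance the robot must cover to reach a robot-frame point."""
    x, y = p_robot
    return math.atan2(y, x), math.hypot(x, y)


# Camera mounted 0.2 m ahead of the robot origin, aligned with it (theta = 0)
target = camera_to_robot((1.0, 0.5), theta=0.0, t=(0.2, 0.0))
bearing, dist = heading_and_range(target)
```

The resulting bearing and range would then feed the rotation and translation commands (or the A*/Dijkstra planner) mentioned in the pipeline.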
Experimental validation was performed in a controlled indoor environment that mimicked a typical flower tray layout. A dataset of 2,000 frames at 640 × 480 resolution was captured. The average processing time per frame was 45 ms (±5 ms), satisfying a real‑time requirement of at least 20 fps. The robot successfully identified and approached ten randomly placed flower bunches, moving at an average speed of 0.8 m s⁻¹. Quality assessments generated by the system matched expert human evaluations with a 92 % agreement rate, particularly excelling in detecting color deviations and physical damage.
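As a quick sanity check on the reported figures, 45 ms per frame corresponds to roughly 22 fps, which clears the stated 20 fps real-time target:

```python
def frame_rate(ms_per_frame):
    """Convert a per-frame latency in milliseconds to frames per second."""
    return 1000.0 / ms_per_frame


# 45 ms/frame -> ~22.2 fps, above the 20 fps requirement
assert frame_rate(45) >= 20
```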
The authors discuss several limitations. The HSV‑based color segmentation is sensitive to illumination changes, which can cause misclassification under variable lighting. Overlapping objects in cluttered backgrounds also challenge the binary segmentation stage. Moreover, reliance on MATLAB, while convenient for rapid development, may hinder deployment on resource‑constrained embedded platforms. To overcome these issues, the paper proposes future work that includes integrating deep‑learning segmentation networks (e.g., U‑Net or Mask R‑CNN) for more robust object delineation, fusing additional sensors such as LiDAR or ultrasonic rangefinders for multimodal perception, and migrating the algorithmic core to a ROS‑compatible C++/Python implementation that can leverage GPU acceleration on embedded boards.
In conclusion, the study demonstrates that a low‑cost, modular robotic platform combined with a MATLAB‑based image‑processing pipeline can effectively automate quality monitoring and navigation tasks in floriculture operations. The system’s extensibility suggests it could be adapted to other horticultural domains, offering a pathway for Colombian flower producers—and similar emerging‑market growers—to enhance productivity, reduce labor costs, and maintain high standards of product quality in increasingly competitive global markets.