FloraForge: LLM-Assisted Procedural Generation of Editable and Analysis-Ready 3D Plant Geometric Models For Agricultural Applications

Reading time: 5 minutes

📝 Original Info

  • Title: FloraForge: LLM-Assisted Procedural Generation of Editable and Analysis-Ready 3D Plant Geometric Models For Agricultural Applications
  • ArXiv ID: 2512.11925
  • Date: 2025-12-11
  • Authors: Mozhgan Hadadi, Talukder Z. Jubery, Patrick S. Schnable, Arti Singh, Bedrich Benes, Adarsh Krishnamurthy, Baskar Ganapathysubramanian

📝 Abstract

Accurate 3D plant models are crucial for computational phenotyping and physics-based simulation; however, current approaches face significant limitations. Learning-based reconstruction methods require extensive species-specific training data and lack editability. Procedural modeling offers parametric control but demands specialized expertise in geometric modeling and an in-depth understanding of complex procedural rules, making it inaccessible to domain scientists. We present FloraForge, an LLM-assisted framework that enables domain experts to generate biologically accurate, fully parametric 3D plant models through iterative natural language Plant Refinements (PR), minimizing the need for programming expertise. Our framework leverages LLM-enabled co-design to refine Python scripts that generate parameterized plant geometries as hierarchical B-spline surface representations with botanical constraints, explicit control points, and parametric deformation functions. This representation can be tessellated into polygonal meshes at arbitrary precision, ensuring compatibility with functional-structural plant analysis workflows such as light simulation, computational fluid dynamics, and finite element analysis. We demonstrate the framework on maize, soybean, and mung bean, fitting procedural models to empirical point cloud data through manual refinement of Plant Descriptor (PD) files, which are human-readable. The pipeline generates dual outputs: triangular meshes for visualization, and triangular meshes with additional parametric metadata for quantitative analysis. This approach uniquely combines LLM-assisted template creation, mathematically continuous representations enabling both phenotyping and rendering, and direct parametric control through the PD. The framework democratizes sophisticated geometric modeling for plant science while maintaining mathematical rigor.

💡 Deep Analysis

[Figure 1]

📄 Full Content

Accurate three-dimensional (3D) plant geometric models are foundational to modern plant science, enabling computational phenotyping, physics-based simulation of agroecosystems, and functional-structural modeling of growth processes [1][2][3][4]. However, generating such models from scratch (forward modeling) or finding a 3D model representation for captured data (reconstruction) remains a significant bottleneck: learning-based reconstruction methods require extensive species-specific training datasets and produce representations that resist direct editing, while procedural modeling systems demand specialized expertise in geometric programming and rule-based algorithms that few plant scientists possess. So-called inverse procedural models often represent only a class of objects and fail on detailed geometries [5][6][7]. This fundamental tension between accessibility and analytical utility has constrained the adoption of sophisticated 3D modeling in plant phenotyping, breeding, and agronomic research.

Current approaches face three fundamental limitations that prevent widespread adoption. First, learning-based reconstructions [8][9][10] yield mesh or latent representations that lack parametric editability, thereby preventing researchers from modifying biologically meaningful traits, such as leaflet pitch, internode elongation, or organ curvature, for hypothesis-driven studies. Second, procedural modeling frameworks such as L-systems [11,12] and template-based systems [13,14] require deep expertise in grammar programming and geometric modeling, making them inaccessible to domain scientists without specialized training in 3D graphics. Moreover, existing procedural models are complex, non-linear systems that are difficult to control. Third, most existing reconstruction methods are suited for artificial objects, such as CAD models (e.g., [15,16]), and fail on long, thin geometries, which are typical of vegetation. These limitations force researchers to choose between reconstructions that are accessible but approximate and uneditable, and procedural tools that are powerful but inaccessible, a choice that ultimately hinders scientific progress.
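The parametric editability argued for here can be illustrated with a hedged sketch: suppose the biologically meaningful traits named above (leaflet pitch, internode elongation, organ curvature) were exposed as plain key-value parameters, the role the paper's Plant Descriptor (PD) plays. The actual PD schema is not shown in this summary, so every key and function name below is an invented illustration, not the FloraForge format.

```python
# Hypothetical sketch of a PD-style trait dictionary; the real FloraForge
# Plant Descriptor schema is not shown here, so all keys are invented.
DEFAULT_PD = {
    "species": "soybean",
    "leaflet_pitch_deg": 35.0,    # leaflet angle relative to the petiole
    "internode_length_cm": 4.2,   # elongation between successive nodes
    "organ_curvature": 0.15,      # dimensionless midrib bending factor
}

def refine(pd, **overrides):
    """Return an edited copy of a PD, rejecting unknown trait names."""
    unknown = set(overrides) - set(pd)
    if unknown:
        raise KeyError(f"unknown PD traits: {sorted(unknown)}")
    return {**pd, **overrides}

# Hypothesis-driven edit: elongate internodes, leaving all other traits fixed.
tall_variant = refine(DEFAULT_PD, internode_length_cm=6.0)
```

The point of the sketch is the workflow, not the schema: a single trait is edited by name, the original descriptor is left untouched, and typos in trait names fail loudly rather than silently adding parameters.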

Recent advances in large language models (LLMs) have demonstrated exceptional capabilities in code synthesis, multimodal reasoning, and tool utilization for 3D content creation [17][18][19][20]. These models demonstrate remarkable proficiency in generating correct object attributes, parsing textual descriptions, correctly interpreting code functions, and facilitating efficient human-computer interaction through natural language dialog. Despite these advances, existing LLM-driven geometric modeling systems primarily target general objects, such as CAD models or architectural applications, without addressing the unique challenges of botanical morphology, including hierarchical organ structures, species-specific features and growth constraints, phyllotactic arrangements, and the need for biologically interpretable parameterizations that directly map to phenotypic traits. Moreover, these systems typically generate assets for visualization rather than scientific analysis, usually lacking the mathematical rigor required for quantitative phenotyping and computational simulation. A critical gap lies at the intersection of accessibility and analytical rigor. Plant scientists need tools that are as accessible as learning-based reconstruction methods yet provide the parametric control and mathematical continuity of procedural modeling, without requiring programming expertise or species-specific training data. Bridging this gap requires three synergistic capabilities: (1) automated generation of procedural templates incorporating botanical domain knowledge, (2) mathematically continuous parametric geometric representations suitable for both quantitative analysis and realistic visualization, and (3) human-readable, high-level, biologically meaningful parameter interfaces enabling direct editing by domain experts.

We present FloraForge (see Figure 1), an LLM-assisted framework that bridges this gap through three synergistic innovations. First, we leverage LLMs to automate the creation of procedural modeling templates from botanical descriptions, eliminating the need for manual geometric programming. Through structured iterative dialog, domain experts progressively refine Python scripts that implement continuous hierarchical B-spline surface representations with botanical constraints, without requiring expertise in geometric modeling. Second, we represent plant organs as continuous B-spline surfaces that provide both the mathematical properties required for quantitative analysis (smoothness, differentiability, and exact curvature computation) and the realism needed for visualization. The Non-Uniform Rational B-spline (NURBS) representations ensure compatibility with a variety of analysis pipelines: radiation simulations, computational fluid dynamics, and finite element analysis.
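The core of the representation described above, a B-spline surface patch evaluated from a control-point grid and tessellated into triangles, can be sketched in a few lines. This is a minimal, generic illustration under stated assumptions, not the FloraForge implementation: a single clamped (non-rational) cubic patch, evaluated with the Cox-de Boor recursion and sampled on a regular parameter grid. All function names and the example control grid are invented for illustration.

```python
def basis(i, p, u, knots):
    """Cox-de Boor recursion: i-th B-spline basis function of degree p at u."""
    if p == 0:
        # Half-open spans, except the right end of the curve is included.
        at_end = u == knots[-1] and knots[i] < knots[i + 1] == knots[-1]
        return 1.0 if (knots[i] <= u < knots[i + 1]) or at_end else 0.0
    out = 0.0
    d = knots[i + p] - knots[i]
    if d > 0.0:
        out += (u - knots[i]) / d * basis(i, p - 1, u, knots)
    d = knots[i + p + 1] - knots[i + 1]
    if d > 0.0:
        out += (knots[i + p + 1] - u) / d * basis(i + 1, p - 1, u, knots)
    return out

def clamped_knots(n, p):
    """Open (clamped) uniform knot vector for n control points, degree p."""
    interior = n - p - 1
    middle = [(k + 1) / (interior + 1) for k in range(interior)]
    return [0.0] * (p + 1) + middle + [1.0] * (p + 1)

def surface_point(u, v, ctrl, p=3, q=3):
    """Evaluate the tensor-product B-spline surface at parameters (u, v)."""
    nu, nv = len(ctrl), len(ctrl[0])
    ku, kv = clamped_knots(nu, p), clamped_knots(nv, q)
    x = y = z = 0.0
    for i in range(nu):
        bu = basis(i, p, u, ku)
        if bu == 0.0:
            continue
        for j in range(nv):
            w = bu * basis(j, q, v, kv)
            x += w * ctrl[i][j][0]
            y += w * ctrl[i][j][1]
            z += w * ctrl[i][j][2]
    return (x, y, z)

def tessellate(ctrl, res=16, p=3, q=3):
    """Sample a (res+1) x (res+1) grid and emit two triangles per quad."""
    verts = [surface_point(iu / res, iv / res, ctrl, p, q)
             for iu in range(res + 1) for iv in range(res + 1)]
    tris = []
    for iu in range(res):
        for iv in range(res):
            a = iu * (res + 1) + iv            # this row
            b, c, d = a + 1, a + res + 1, a + res + 2  # neighbors
            tris += [(a, b, d), (a, d, c)]
    return verts, tris

# Example: a gently arched 4x4 control grid (e.g. a leaf-blade-like patch),
# tessellated at the requested resolution.
ctrl = [[(float(i), float(j), 0.25 * i * (3 - i)) for j in range(4)]
        for i in range(4)]
verts, tris = tessellate(ctrl, res=16)
```

Because the patch is clamped, the surface interpolates its corner control points, and the same continuous definition can be resampled at any `res`, which is what makes "tessellation at arbitrary precision" cheap for such representations. A rational (NURBS) variant would additionally carry a per-control-point weight in the weighted sum.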


Reference

This content is AI-processed based on open access ArXiv data.
