Automated identification of Ichneumonoidea wasps via YOLO-based deep learning: Integrating HiresCam for Explainable AI ⋆

João Manoel Herrera Pinheiro a, Gabriela do Nascimento Herrera b, Alvaro Doria dos Santos c, Luciana Bueno dos Reis Fernandes b, Ricardo V. Godoy a, Eduardo A. B. Almeida d, Helena Carolina Onody e, Marcelo Andrade da Costa Vieira a, Angélica Maria Penteado-Dias b and Marcelo Becker a,∗

a São Carlos School of Engineering, University of São Paulo, São Carlos, 13566590, São Paulo, Brazil
b Department of Ecology and Evolutionary Biology, Federal University of São Carlos, São Carlos, 13565905, São Paulo, Brazil
c Federal University of Tocantins, Porto Nacional, 77500000, Brazil
d Department of Biology, University of São Paulo, Ribeirão Preto, 14040901, Brazil
e State University of Piauí, Deputado Jesualdo Cavalcanti Campus, Corrente, 64980000, Brazil

ARTICLE INFO

Keywords: Arthropod; Biodiversity; Convolutional neural network; Computer vision; Entomology; Hymenoptera; Taxonomic identification; XAI

ABSTRACT

Accurate taxonomic identification of parasitoid wasps within the superfamily Ichneumonoidea is essential for biodiversity assessment, ecological monitoring, and biological control programs. However, morphological similarity, small body size, and fine-grained interspecific variation make manual identification labor-intensive and expertise-dependent. This study proposes a deep learning–based framework for the automated identification of Ichneumonoidea wasps using a YOLO-based architecture integrated with High-Resolution Class Activation Mapping (HiResCAM) to enhance interpretability. The proposed system identifies wasp families from high-resolution images. The dataset comprises 3,556 high-resolution images of Hymenoptera specimens.
The taxonomic distribution is primarily concentrated among the families Ichneumonidae (n = 786), Braconidae (n = 648), Apidae (n = 466), and Vespidae (n = 460). Extensive experiments were conducted using a curated dataset, with model performance evaluated through precision, recall, F1-score, and accuracy. The results demonstrate high accuracy of over 96% and robust generalization across morphological variations. HiResCAM visualizations confirm that the model focuses on taxonomically relevant anatomical regions, such as wing venation, antennae segmentation, and metasomal structures, thereby validating the biological plausibility of the learned features. The integration of explainable AI techniques improves transparency and trustworthiness, making the system suitable for entomological research to accelerate biodiversity characterization in an under-described parasitoid superfamily.

1. Introduction

Imagine inhabiting a world in which more than 80% of species remain entirely unknown to science. This is the current state of our knowledge regarding Class Insecta (Mora et al., 2011; Stork, 2018). Although insects represent the most species-rich group of animals and account for over half of all described species (May, 1986; Resh and Cardé, 2009), our inventory of this diversity is still far from complete. Approximately one million species have been formally described, and scientists estimate that an additional 5.5 million species remain undiscovered and undescribed (Stork, 2018; Eggleton, 2020). We are currently facing a significant gap in insect taxonomy (Slade and Ong, 2023; Ong et al., 2025), a

⋆ Source code: https://github.com/joaomh/identification-of-Ichneumonoidea-waps-YOLO-2026 Dataset: https://zenodo.org/records/18501018
∗ Corresponding author: joao.manoel.pinheiro@usp.br (J.M.H. Pinheiro); becker@sc.usp.br (M. Becker)
ORCID(s): 0009-0001-6192-7374 (J.M.H.
Pinheiro); 0009-0000-0371-3012 (G.d.N. Herrera); 0000-0002-7997-4195 (A.D.d. Santos); 0009-0008-9329-4509 (L.B.d.R. Fernandes); 0000-0002-5323-9299 (R.V. Godoy); 0000-0001-6017-6364 (E.A.B. Almeida); 0000-0003-3570-8183 (H.C. Onody); 0000-0002-6038-7740 (M.A.d.C. Vieira); 0000-0002-8371-5591 (A.M. Penteado-Dias); 0000-0002-7508-5817 (M. Becker)

J.M.H. Pinheiro et al.: Preprint submitted to Elsevier. Copyright may be transferred without notice.

problem exacerbated by the ongoing global decline in insect species (Wagner et al., 2021; Fenoglio et al., 2021). This decline has direct impacts on human well-being (Schowalter et al., 2018), as insects are a cornerstone of global biodiversity (Cardoso et al., 2020) and perform crucial ecosystem functions. These functions include pollination (Gabriel and Tscharntke, 2007), maintaining the health of agricultural ecosystems (Jankielsohn, 2018), natural pest control (Pardo and Borges, 2020), and decomposition (Eggleton, 2020). Consequently, the accurate identification of insect species is vital for effective biodiversity monitoring and ecological research. Furthermore, precise classification is essential to distinguish agricultural pests from beneficial organisms. Contrary to common perception, the vast majority of insects are not harmful to humans (Allison et al., 2023).

The order Hymenoptera comprises ants, bees, and wasps and represents one of the most species-rich insect orders (Forbes et al., 2018). Members of this order play essential ecological roles, particularly as pollinators (Barbizan Sühs et al., 2009; Beggs et al., 2011). Among Hymenoptera, the Ichneumonoidea superfamily is one of the most diverse in the Neotropics (Quicke, 2015; Yu et al., 2016, 2012). These wasps primarily parasitize larvae and pupae of holometabolous insects, although some groups can parasitize adult arthropods and arachnid oothecae, contributing
to the maintenance of ecological balance (Quicke, 2015). The Ichneumonoidea superfamily comprises two major families, Ichneumonidae and Braconidae.

The Ichneumonidae, commonly known as Darwin wasps (Klopfstein et al., 2019), is a hyper-diverse family of parasitoid wasps, with over 25,000 described species across 37 subfamilies and 1,450 genera (Yu et al., 2012; Quicke, 2015). Of these, 4,419 species have been described in the Neotropical region, and 955 have been recorded in Brazil. They are parasitoids of larvae and pupae of holometabolous insects, such as Coleoptera, Lepidoptera, and Hymenoptera, as well as other arthropods (Gauld and Bolton, 1988; Hanson and Gauld, 1995). Ichneumonidae have been comparatively less utilized in biological control programs, although their parasitoid behavior can effectively regulate the abundance of other insects, including agricultural pests (Quicke, 2015). Taxonomically, the group poses significant difficulties, as recognition of subfamilies is complex, particularly compared to that of the Braconidae. Identification is often restricted to females, as males frequently lack distinctive diagnostic features (Butcher and Quicke, 2023).

The Braconidae constitutes the second most diverse family within the Hymenoptera. This family includes over 21,000 described species across more than 1,100 genera, though these numbers represent only a fraction of their true global diversity (Yu et al., 2012; Quicke, 2015; Chen and van Achterberg, 2019). Due to their prevalence as parasitoids of other insects (Matthews, 1974), braconids play a pivotal role in terrestrial ecosystems and are extensively utilized as agents in biological control programs (Shaw and Huddleston, 1991).
Taxonomically, the most reliable distinction is found in the wing venation: braconids almost invariably lack the second recurrent vein (2m-cu) in the fore wing, a vein that is typically present in ichneumonids (Quicke, 2015).

Traditionally, insect identification has been the domain of expert entomologists, relying heavily on morphological examination under microscopes, detailed dichotomous keys, and extensive reference collections (Wipfler et al., 2016). This classical approach, while foundational to our understanding of insect diversity, is inherently labor-intensive, time-consuming, and demands highly specialized training and years of experience (Magni et al., 2023).

Deep learning, a rapidly evolving field within artificial intelligence, utilizes computational models composed of multiple processing layers to learn abstract data representations (Goodfellow et al., 2016; Bishop and Bishop, 2023). This distinguishes deep learning from traditional statistical prediction approaches (Sarker, 2021). These methods have significantly advanced various domains, including image classification, semantic segmentation, object detection, and speech recognition (Shinde and Shah, 2018; Sharifani and Amini, 2023). The core principle involves discovering intricate structures in large datasets through the backpropagation algorithm, which dictates how a machine adjusts its internal parameters to compute representations across layers (LeCun et al., 2015; Zhao et al., 2024). Unlike traditional machine learning that relies on carefully engineered feature extractors, deep learning automatically discovers the necessary representations from raw data (O'Mahony et al., 2020; Indolia et al., 2018) and has shown promising results across several application domains (Alzubaidi et al., 2021; Bhatt et al., 2021).
While deep learning has received significant attention in other domains, its application in invertebrate monitoring and biodiversity research has been slow to develop (Christin et al., 2019). However, this has changed over the past decade, as deep learning has begun to revolutionize the fields of entomology and ecology (Weinstein, 2018; Li et al., 2021; Høye et al., 2021). Deep learning and computer vision offer potential solutions to the long-standing challenges of inefficient and labor-intensive insect identification (Ärje et al., 2020; De Cesaro Júnior and Rieder, 2020; Teixeira et al., 2023; Gao et al., 2024), monitoring (Bilik et al., 2024), and pest detection (Wu et al., 2019; Barbedo, 2020; Batz et al., 2023; Passias et al., 2024). However, a critical gap remains in the literature regarding the taxonomic complexity of the Ichneumonoidea superfamily.

In the context of evaluating automated identification systems for hyper-diverse taxa, it is notable that some studies, such as the DiversityScanner (Wührl et al., 2022, 2024), have detected and identified 14 families for robot handling with a precision of 91.4%. However, in their evaluation of the Ichneumonoidea superfamily, only 246 images were used to assess the identification model's performance. Furthermore, while the system employed class activation maps to visualize the features the neural network prioritized during identification, these heatmaps were primarily used for internal model validation rather than being systematically compared against established morphological keys.

In the domain of parasitoid wasps, (Shirali et al.
, 2024) demonstrated the efficacy of deep learning for identifying the highly diverse and cryptic Diapriidae family using a dataset of 2,257 images. Their study compared three architectures, with the BEiT v2 transformer model achieving the highest accuracy of 96% for genus-level identification and 97% for sex determination, significantly outperforming YOLOv8 and ConvNeXt.

In this study, we present a novel deep learning framework specifically designed for the automated identification of the hyper-diverse Ichneumonoidea superfamily, leveraging transfer learning and benchmarking state-of-the-art architectures, including YOLOv12 and YOLOv26. A critical component of this approach is the integration of Explainable Artificial Intelligence (XAI) techniques, specifically HiResCAM, which provides high-resolution visual interpretations of the model's internal decision-making process. These visualizations enable the identification of morphologically relevant regions, such as wing venation and metasomal structures, that align with traditional taxonomic criteria, thereby enhancing the transparency and biological plausibility of the predictions. The dataset and source code are publicly available to ensure reproducibility and to support further research in biodiversity informatics.

Figure 1: Specimens were retrieved from the DCBU collection (Penteado-Dias and Fernandes, 2025). Imaged from (Pinheiro et al., 2026).

2. Materials and methods

2.1. Data collection

The biological material for this study, focusing on the Ichneumonoidea, was sourced from the DCBU taxonomic collection at UFSCar.
Following specimen retrieval, morphological documentation was conducted using a Leica M205C stereomicroscope paired with a K5C digital camera. The acquisition process was managed via LAS X software, while the final high-depth-of-field composites were generated through digital image stacking in Helicon Focus. Figure 1 illustrates the photographic workflow from specimen collection to final image processing.

The Dataset of Parasitoid Wasps and Associated Hymenoptera (DAPWH) (Herrera Pinheiro et al., 2026) comprises high-resolution images of Hymenoptera specimens, with a primary focus on the families Ichneumonidae and Braconidae. The dataset contains a total of 3,556 images, of which more than 40% correspond to Ichneumonoidea wasps, as detailed in Table 1. Figure 2 shows some samples of these wasps.

Table 1
Distribution of images per family in DAPWH.

Family          Images
Ichneumonidae   786
Braconidae      648
Apidae          466
Vespidae        460
Megachilidae    298
Chrysididae     244
Andrenidae      244
Pompilidae      190
Bethylidae      94
Halictidae      75
Colletidae      51
Total           3,556

2.2. Model architecture

For the automated identification of Ichneumonoidea, we selected the YOLOv12 (Tian et al., 2025) and YOLOv26 (Sapkota et al., 2026) architectures. These models represent the current state-of-the-art in object detection and classification, offering peak performance for complex biological datasets. Their selection was essential for processing the intricate morphological data found in the DAPWH dataset, as experimental tests revealed that the YOLO framework provided the fastest training times among the evaluated architectures (Pinheiro and Becker, 2026). We implemented the nano variants of both architectures, specifically yolov12n-cls and yolo26n-cls.

2.3. Data splitting and imaging rescale

For the model development phase, the dataset was partitioned into three distinct subsets to ensure robust training and unbiased evaluation.
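As a quick consistency check of Table 1, the per-family counts can be summed and the "more than 40% Ichneumonoidea" claim verified directly (a minimal sketch; the dictionary below simply transcribes the table):

```python
# Per-family image counts transcribed from Table 1 of the DAPWH dataset
counts = {
    "Ichneumonidae": 786, "Braconidae": 648, "Apidae": 466, "Vespidae": 460,
    "Megachilidae": 298, "Chrysididae": 244, "Andrenidae": 244,
    "Pompilidae": 190, "Bethylidae": 94, "Halictidae": 75, "Colletidae": 51,
}
total = sum(counts.values())
# Ichneumonoidea comprises the families Ichneumonidae and Braconidae
ichneumonoidea = counts["Ichneumonidae"] + counts["Braconidae"]
print(total)                             # → 3556
print(round(ichneumonoidea / total, 3))  # → 0.403, i.e. more than 40%
```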
Following established methodological conventions (Raschka, 2020), we allocated 70% of the total images to the training set, while the remaining 30% was divided equally, with 15% dedicated to validation during training and 15% reserved as an independent test hold-out set. This distribution ensures that the final performance metrics represent the model's ability to generalize to unseen Ichneumonoidea specimens. The final partitioning of the dataset into training, validation, and test subsets is detailed in Table 2.

Table 2
Dataset distribution by family after splitting.

Family          Train   Val   Test   Total
Andrenidae      170     36    38     244
Apidae          326     69    71     466
Bethylidae      65      14    15     94
Braconidae      453     97    98     648
Chrysididae     170     36    38     244
Colletidae      35      7     9      51
Halictidae      52      11    12     75
Ichneumonidae   550     117   119    786
Megachilidae    208     44    46     298
Pompilidae      133     28    29     190
Vespidae        322     69    69     460
Total           2,484   528   544    3,556

Given the high-fidelity nature of the original stacked images acquired with the Leica M205C system, spatial downsampling was required to align with the neural network's computational constraints. All images were rescaled to a fixed input dimension of 512 × 512 pixels for YOLO training.

2.4. Training and evaluation

The training and evaluation of the models were performed on a high-performance workstation running Linux. The hardware configuration consisted of an AMD Ryzen 9 7900 CPU, 64 GB of DDR5 RAM, and an NVIDIA RTX 4090 GPU with 24 GB of VRAM, with CUDA 13.1.

Figure 2: Examples of samples in the DAPWH dataset (Herrera Pinheiro et al., 2026). (a)–(f) Braconidae; (g)–(l) Ichneumonidae.
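The per-family 70/15/15 protocol of Section 2.3 can be sketched in a few lines. This is an illustrative re-implementation, not the authors' code; `stratified_split` and the toy labels are assumptions, and integer arithmetic is used so the subset sizes are deterministic:

```python
import random

def stratified_split(labels, seed=0):
    """Per-family 70/15/15 split. `labels` maps image id -> family name;
    returns (train, val, test) lists of image ids."""
    by_family = {}
    for img_id, family in labels.items():
        by_family.setdefault(family, []).append(img_id)
    rng = random.Random(seed)  # fixed seed for a reproducible partition
    train_ids, val_ids, test_ids = [], [], []
    for ids in by_family.values():
        rng.shuffle(ids)
        n_train = (len(ids) * 70) // 100   # integer math avoids float rounding
        n_val = (len(ids) * 15) // 100
        train_ids += ids[:n_train]
        val_ids += ids[n_train:n_train + n_val]
        test_ids += ids[n_train + n_val:]  # remainder becomes the test set
    return train_ids, val_ids, test_ids

# toy run: two equally sized families of 40 images each
labels = {f"img{i}": ("Ichneumonidae" if i % 2 else "Braconidae") for i in range(80)}
tr, va, te = stratified_split(labels)
print(len(tr), len(va), len(te))  # → 56 12 12
```

Stratifying per family keeps the class proportions of Table 2 roughly constant across the three subsets, which matters for the minority families such as Colletidae.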
To quantify the classification performance of the developed models, we employed a suite of standard evaluation metrics: Accuracy, Precision, Recall, and the F1-score, defined by Eqs. (1), (2), (3), and (4), respectively. These indicators are widely recognized as benchmarks for both image classification and object detection tasks (Lin et al., 2015).

    Accuracy = (TP + TN) / (TP + TN + FP + FN),          (1)
    Precision = TP / (TP + FP),                          (2)
    Recall = TP / (TP + FN),                             (3)
    F1 = 2 · Precision · Recall / (Precision + Recall),  (4)

where TP, TN, FP, and FN represent true positives, true negatives, false positives, and false negatives, respectively.

2.5. Model interpretability

While quantitative metrics such as accuracy, precision, recall, and F1-score are statistical measures of performance, they do not reveal the neural network's decision-making process. To ensure the taxonomic validity of the model's predictions, we employed Explainable AI (XAI) techniques, specifically High-Resolution Class Activation Mapping (HiResCAM) (Draelos and Carin, 2021). HiResCAM computes element-wise importance scores to produce visualization maps that are strictly faithful to the model's computations. This higher spatial precision allows us to verify whether the model is focusing on relevant morphological diagnostic traits, such as specific wing venation patterns, rather than learning spurious correlations from the background.

2.6. Research workflow

Figure 3 shows the overview of the proposed explainable identification framework. High-resolution images of Ichneumonidae specimens are provided as input to the YOLOv26 model.
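The HiResCAM computation of Section 2.5 amounts to an element-wise product of the class-score gradients with the feature activations, summed over channels (unlike Grad-CAM, no spatial averaging of the gradients takes place, which is what makes the map faithful to the model's computation). A minimal pure-Python sketch on toy nested lists — the real implementation operates on the network's final convolutional feature tensors; `hirescam`, `acts`, and `grads` are illustrative names:

```python
def hirescam(activations, gradients):
    """HiResCAM map: sum over channels of gradient * activation, element-wise.
    `activations` and `gradients` are C x H x W nested lists."""
    C, H, W = len(activations), len(activations[0]), len(activations[0][0])
    cam = [[0.0] * W for _ in range(H)]
    for c in range(C):
        for i in range(H):
            for j in range(W):
                cam[i][j] += gradients[c][i][j] * activations[c][i][j]
    return cam

# toy example: 2 channels of 2x2 activation maps and their gradients
acts  = [[[1.0, 0.0], [0.0, 2.0]], [[0.0, 3.0], [1.0, 0.0]]]
grads = [[[0.5, 1.0], [1.0, 0.5]], [[1.0, 0.5], [0.5, 1.0]]]
print(hirescam(acts, grads))  # → [[0.5, 1.5], [0.5, 1.0]]
```

The resulting H × W map is then upsampled to the input resolution and overlaid on the specimen image as a heatmap.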
The network extracts hierarchical feature representations that capture discriminative patterns across convolutional layers. Finally, HiResCAM is applied to generate class-discriminative activation maps that highlight biologically relevant regions, such as wing venation and thoracic structures, to support transparent and interpretable predictions.

3. Results and discussion

3.1. Model performance

The performance evaluation conducted on the DAPWH test set indicates that both architectures achieve high levels of taxonomic discrimination for the Ichneumonoidea superfamily. As summarized in Table 3, the YOLOv26 model demonstrated superior performance across all evaluated metrics. Specifically, YOLOv26 achieved a Top-1 Accuracy of 96.14%, representing a significant improvement over the 94.85% attained by the YOLOv12 variant. Regarding the model's reliability in identifying complex morphological features, YOLOv26 reached a Precision of 93.43% and a robust Recall of 97.04%. The resulting F1-score of 95.20% further confirms the model's effectiveness in balancing false positives and negatives.

Table 3
Performance comparison of YOLO classification models on the DAPWH test set.

Model      Accuracy   Precision   Recall   F1
YOLOv12    0.9485     0.9132      0.9429   0.9278
YOLOv26    0.9614     0.9343      0.9704   0.9520

Both models exhibited stable convergence over the 150 training epochs, with a rapid reduction in training loss during the initial iterations followed by gradual stabilization. For YOLOv12, the training loss decreased sharply within the first 20–30 epochs and asymptotically approached near-zero values, while the validation loss stabilized around 0.20 after early fluctuations. The Top-1 accuracy increased consistently, surpassing 0.95 in later epochs, whereas Top-5 accuracy rapidly saturated, remaining close to 1.00 throughout most of the training process.
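As a sanity check, the F1 column of Table 3 follows directly from Eq. (4) applied to the reported Precision and Recall columns:

```python
def f1_score(precision, recall):
    """F1 as the harmonic mean of precision and recall (Eq. 4)."""
    return 2 * precision * recall / (precision + recall)

# re-derive the F1 column of Table 3 from Precision and Recall
print(round(f1_score(0.9132, 0.9429), 4))  # YOLOv12 → 0.9278
print(round(f1_score(0.9343, 0.9704), 4))  # YOLOv26 → 0.9520
```

Both values match the table to four decimal places, confirming the metrics were computed consistently.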
These trends indicate efficient feature learning and strong generalization capacity without evident signs of overfitting.

Similarly, YOLOv26 (Fig. 5) demonstrated fast convergence and improved stability during validation. The validation loss exhibited slightly lower variance than YOLOv12 and converged to marginally lower values. Top-1 accuracy steadily improved to approximately 0.97 in the final epochs, indicating robust ranking performance. Overall, both architectures achieved high classification accuracy; however, YOLOv26 presented smoother validation behavior and slightly superior generalization performance.

Figure 3: Research workflow.

The normalized confusion matrices (Fig. 6) demonstrate strong class-level discrimination for both models, with dominant diagonal values indicating high per-family accuracy. For YOLOv12, most families achieved correct identification rates above 93%, including Bethylidae, Vespidae, Megachilidae, Apidae, Chrysididae, and Ichneumonidae. Moderate confusion was observed for Colletidae and Halictidae, suggesting greater morphological similarity or class imbalance effects. Limited cross-family misclassification occurred primarily between taxonomically related groups, such as Andrenidae and Halictidae, and between Megachilidae and Colletidae.

YOLOv26 showed improved overall discrimination across Apidae, Bethylidae, Halictidae, and Vespidae. Ichneumonidae achieved 97% accuracy, while Braconidae and Chrysididae remained above 94%. Although Colletidae remained less separable, cross-class confusion was generally lower than with YOLOv12. The concentration of high diagonal values
and the reduction of off-diagonal errors indicate enhanced generalization and more consistent inter-family boundary learning in YOLOv26.

Figure 4: Training and validation performance of the YOLOv12 model over 150 epochs.

Figure 5: Training and validation performance of the YOLOv26 model over 150 epochs.

3.2. Model interpretability

The qualitative analysis of the learned representations and attention maps provides further insight into the model's internal decision-making process. The feature activation maps extracted from intermediate convolutional layers reveal that the network progressively encodes discriminative morphological patterns, emphasizing structural contours while suppressing background information. The diversity of activation responses across channels indicates hierarchical feature abstraction, ranging from low-level edge detection to higher-level morphological descriptors.

In Fig. 7, the visualizations demonstrate that the model emphasizes critical structural contours, such as the wing segmentation for the family Ichneumonidae. For the family Braconidae, the visualization of intermediate convolutional layers demonstrates that the model also effectively suppresses background noise. As shown in Fig. 8, the hierarchical encoding process prioritizes diagnostically relevant anatomical regions, such as the metasomal segmentation. By capturing these multi-scale features, the convolutional layers enable the model to achieve high per-family accuracies of 97% for Ichneumonidae and 94% for Braconidae, as shown in the normalized confusion matrix (Fig. 6).

3.3. Ichneumonidae

For the identification of Ichneumonidae, two situations were observed. In the first, the model likely relied on traditional morphological characters used to distinguish the family, particularly those of the fore wing (Fig. 9 and Fig. 10).
For example, the presence of the fore wing vein 2m-cu is a crucial character for identifying Ichneumonidae. This corresponds to step 2 in the key for the separation of British and Irish Braconidae and Ichneumonidae (Broad et al., 2018). Another important wing character is the absence of vein RS+M forming the discosubmarginal cell, which is used in step 3 of the same key. Additionally, facial features were also captured, such as the convex face typical of Ichneumonidae (Fig. 11).

In the second situation, however, the model appeared to rely on non-traditional diagnostic characteristics (Fig. 12). Instead of focusing on explicit structural features commonly used in taxonomy, the model based its decisions on broader morphological patterns or overall visual similarity.

Figure 6: Normalized confusion matrices. (a) YOLOv12; (b) YOLOv26.

Figure 7: Representative feature map samples extracted from intermediate convolutional layers of YOLOv26, illustrating the hierarchical encoding of morphological structures and texture patterns for Ichneumonidae. (a) Habitus, lateral; (b) Head, frontal.
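The two key steps described above can be caricatured as a tiny dichotomous-key lookup. This is purely illustrative: the boolean character names (`"2m_cu_present"`, `"rs_m_absent"`) are hypothetical labels for the characters in Broad et al. (2018), not an encoding of the actual key:

```python
def keystep_ichneumonoidea(characters):
    """Toy two-step separation of Ichneumonidae from Braconidae based on
    fore wing venation; `characters` maps character name -> bool."""
    if characters.get("2m_cu_present"):  # step 2: fore wing vein 2m-cu present
        return "Ichneumonidae"
    if characters.get("rs_m_absent"):    # step 3: vein RS+M absent, forming the
        return "Ichneumonidae"           #         discosubmarginal cell
    return "Braconidae"                  # neither character observed

print(keystep_ichneumonoidea({"2m_cu_present": True}))   # → Ichneumonidae
print(keystep_ichneumonoidea({"2m_cu_present": False}))  # → Braconidae
```

Framing the key this way makes explicit what the HiResCAM maps suggest: the network attends to the same wing regions a taxonomist would inspect at these steps.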
Figure 8: Representative feature map samples extracted from intermediate convolutional layers of YOLOv26, illustrating the hierarchical encoding of morphological structures and texture patterns for Braconidae. (a) Habitus, lateral; (b) Head, frontal.

Figure 9: HiResCAM visualizations for Ichneumonidae. The heatmaps demonstrate that the YOLOv26 architecture prioritizes wing venation patterns, notably the discosubmarginal cell, aligning with established entomological keys.

Figure 10: HiResCAM visualizations for Ichneumonidae. The heatmaps demonstrate that the YOLOv26 architecture prioritizes wing venation patterns, particularly the second recurrent vein (2m-cu), aligning with established entomological keys.

Figure 11: HiResCAM visualizations for Ichneumonidae. The heatmaps demonstrate that the YOLOv26 architecture prioritizes the convex face, aligning with established entomological keys.

This behavior highlights an opportunity to explore non-conventional or underemphasized characters that, while not formally incorporated into identification keys, could contain diagnostically significant information.

Figure 12: HiResCAM visualizations for Ichneumonidae reveal that the YOLOv26 architecture identifies and prioritizes alternative morphological structures beyond those traditionally emphasized in dichotomous taxonomic keys.

3.4. Braconidae

The family Braconidae is characterized by several distinct morphological features successfully captured by the
YOLOv26 architecture. The most reliable taxonomic distinction for this family is found in the wing venation (Athey et al., 2023). Unlike ichneumonids, braconids almost invariably lack the second recurrent vein (2m-cu) and the areolet in the fore wing (Fig. 13).

Figure 13: HiResCAM visualizations for Braconidae reveal that the YOLOv26 architecture identifies and prioritizes the absence of the areolet and 2m-cu, aligning with established entomological keys.

Furthermore, an essential diagnostic trait is the fusion of metasomal tergites 2 and 3, creating a rigid structural unit that is clearly visible in the lateral profiles of the specimens (Fig. 14). At the subfamily level, specialized mandibular structures serve as additional key identifiers. In specific groups, the mandibles are characteristically open and non-overlapping (Athey et al., 2023; Butcher and Quicke, 2023), as documented in the frontal view provided in Fig. 15.

Figure 14: HiResCAM visualizations for Braconidae reveal that the YOLOv26 architecture identifies and prioritizes the fused metasomal tergites, aligning with established entomological keys.

Figure 15: HiResCAM visualizations for Braconidae reveal that the YOLOv26 architecture identifies and prioritizes the open mandibles, aligning with established entomological keys.

3.5. Apidae

For Apidae identification, the model's performance was evaluated in recognizing body regions and structures that are traditional diagnostic features crucial for the taxonomic classification of various bee tribes. A key example is the variation in wing venation; the model accurately differentiated between the fully developed, complex venation typical of most apid groups (e.g., Fig.
16) and the significantly reduced or simplified patterns characteristic of stingless bees (e.g., Fig. 17) (Michener, 2007). Additionally, the model captured important head features, particularly the medial margin of the compound eyes and the relative proportions of the different regions (e.g., Fig. 18), which are diagnostic at the genus level. Another key area highlighted by the model was the hind leg, with a focus on specialized structures involved in pollen transport (Fig. 20). The presence of either a scopa or a corbicula (Fig. 19) is a decisive trait for distinguishing tribes of Apidae (Michener, 2007).

Figure 16: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the fully developed, complex venation.

Figure 17: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the reduced or simplified venation characteristic of stingless bees.

Figure 18: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the medial margin of the compound eyes.

4. Conclusion

This study presented a YOLO-based deep learning framework for automated identification of Ichneumonoidea wasps. The proposed system achieved strong classification performance while maintaining computational efficiency suitable for real-time or near real-time applications.

The incorporation of HiResCAM provided critical insight into the model's decision-making process. Visualization results indicate that the network consistently attends to biologically meaningful morphological features, including wing venation patterns, antennal morphology, and metasomal segmentation. This alignment between learned representations and taxonomic traits enhances model credibility and supports its applicability in scientific workflows.
Figure 19: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the presence of either a scopa or a corbicula.

Figure 20: HiResCAM visualizations for Apidae reveal that the YOLOv26 architecture identifies and prioritizes the specialized hind-leg structures involved in pollen transport.

From a practical standpoint, the framework reduces dependency on expert taxonomists for routine identification tasks and offers scalable support for biodiversity surveys, ecological research, and biological control initiatives. Moreover, the explainability component addresses a major limitation of black-box deep learning systems by enabling qualitative validation of predictions.

Future work may extend this framework to subfamily or genus identification and incorporate larger, more diverse datasets to improve generalization. Integration into mobile or field-deployable systems would further increase accessibility and real-world utility.

Overall, the study demonstrates that combining modern object detection models with explainable AI techniques provides a robust and interpretable solution for automated insect taxonomy.

CRediT authorship contribution statement

João Manoel Herrera Pinheiro: Writing, original draft, methodology, developed the dataset, formal analysis, software. Gabriela do Nascimento Herrera: Writing, original draft, methodology, developed the dataset, formal analysis, reviewed the taxonomy of the insects. Alvaro Doria dos Santos: Provided images of Ichneumonoidea and Vespidae, writing, review and editing. Luciana Bueno dos Reis Fernandes: Provided images of Ichneumonoidea, writing, review and editing. Ricardo V. Godoy: Writing, formal analysis, review and editing. Eduardo A. B. Almeida: Provided images of Apidae, Andrenidae and Halictidae, writing, review and editing. Helena Carolina Onody: Provided images of Ichneumonoidea and Vespidae, writing, review and editing. Marcelo Andrade da Costa Vieira: Writing, formal analysis, supervision, review and editing. Angélica Maria Penteado-Dias: Provided images of Ichneumonoidea, funding acquisition, data curation, writing, original draft, review, editing and supervision. Marcelo Becker: Writing, original draft, data curation, funding acquisition, review, editing and supervision.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Data availability

The dataset is available at Zenodo (Herrera Pinheiro et al., 2026) and the source code used is available in the GitHub repository.

Acknowledgment

This work was supported by Fundação de Apoio à Física e à Química (FAFQ); Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), grants nº 88887.002221/2024-00 and nº 88887.975224/2024-00; Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP), grants nº 2014/50940-2, 2019/09215-6 and 2022/11451-2; Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), grant nº 465562/2014-0; and Instituto Nacional de Ciência e Tecnologia dos Hymenoptera Parasitoides (INCT-HYMPAR).

References

Allison, J.D., Paine, T.D., Slippers, B., Wingfield, M.J. (Eds.), 2023. Forest Entomology and Pathology. 1st ed., Springer Cham. URL: https://doi.org/10.1007/978-3-031-11553-0, doi:10.1007/978-3-031-11553-0.

Alzubaidi, L., Zhang, J., Humaidi, A.J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., Santamaría, J., Fadhel, M.A., Al-Amidie, M., Farhan, L., 2021.
Review of deep learning: concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data 8, 53. URL: https://doi.org/10.1186/s40537-021-00444-8, doi:10.1186/s40537-021-00444-8.

Athey, K., Fernandez-Triana, J., Penteado-Dias, A., Quicke, D., Sharkey, M., 2023. 2023 key to the New World subfamilies of the family Braconidae (Hymenoptera). Canadian Journal of Arthropod Identification. doi:10.3752/cjai.2023.49.

Barbedo, J.G.A., 2020. Detecting and classifying pests in crops using proximal images and machine learning: A review. AI 1, 312–328. URL: https://www.mdpi.com/2673-2688/1/2/21, doi:10.3390/ai1020021.

Barbizan Sühs, R., Somavilla, A., Köhler, A., Putzke, J., 2009. Vespídeos (Hymenoptera, Vespidae) vetores de pólen de Schinus terebinthifolius Raddi (Anacardiaceae), Santa Cruz do Sul, RS, Brasil. Brazilian Journal of Biosciences 7, 138–143.

Batz, P., Will, T., Thiel, S., Ziesche, T.M., Joachim, C., 2023. From identification to forecasting: the potential of image recognition and artificial intelligence for aphid pest monitoring. Frontiers in Plant Science 14. URL: https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2023.1150748, doi:10.3389/fpls.2023.1150748.

Beggs, J.R., Brockerhoff, E.G., Corley, J.C., Kenis, M., Masciocchi, M., Muller, F., Rome, Q., Villemant, C., 2011. Ecological effects and management of invasive alien Vespidae. BioControl 56, 505–526. URL: https://doi.org/10.1007/s10526-011-9389-z, doi:10.1007/s10526-011-9389-z.

Bhatt, D., Patel, C., Talsania, H., Patel, J., Vaghela, R., Pandya, S., Modi, K., Ghayvat, H., 2021.
CNN variants for computer vision: History, architecture, application, challenges and future scope. Electronics 10. URL: https://www.mdpi.com/2079-9292/10/20/2470, doi:10.3390/electronics10202470.

Bilik, S., Zemcik, T., Kratochvila, L., Ricanek, D., Richter, M., Zambanini, S., Horak, K., 2024. Machine learning and computer vision techniques in continuous beehive monitoring applications: A survey. Computers and Electronics in Agriculture 217, 108560. URL: https://www.sciencedirect.com/science/article/pii/S0168169923009481, doi:10.1016/j.compag.2023.108560.

Bishop, C., Bishop, H., 2023. Deep Learning: Foundations and Concepts. Springer International Publishing.

Butcher, B., Quicke, D., 2023. The Parasitoid Wasps of South East Asia. CAB International.

Cardoso, P., Barton, P.S., Birkhofer, K., Chichorro, F., Deacon, C., Fartmann, T., Fukushima, C.S., Gaigher, R., Habel, J.C., Hallmann, C.A., Hill, M.J., Hochkirch, A., Kwak, M.L., Mammola, S., Ari Noriega, J., Orfinger, A.B., Pedraza, F., Pryke, J.S., Roque, F.O., Settele, J., Simaika, J.P., Stork, N.E., Suhling, F., Vorster, C., Samways, M.J., 2020. Scientists' warning to humanity on insect extinctions. Biological Conservation 242, 108426. URL: https://www.sciencedirect.com/science/article/pii/S0006320719317823, doi:10.1016/j.biocon.2020.108426.

Chen, X.x., van Achterberg, C., 2019. Systematics, phylogeny, and evolution of braconid wasps: 30 years of progress. Annual Review of Entomology 64, 335–358. URL: https://www.annualreviews.org/content/journals/10.1146/annurev-ento-011118-111856, doi:10.1146/annurev-ento-011118-111856.

Christin, S., Hervet, E., Lecomte, N., 2019. Applications for deep learning in ecology. Methods in Ecology and Evolution 10, 1632–1644. URL: https://besjournals.onlinelibrary.wiley.com/doi/abs/10.1111/2041-210X.13256, doi:10.1111/2041-210X.13256.

De Cesaro Júnior, T., Rieder, R., 2020. Automatic identification of insects from digital images: A survey. Computers and Electronics in Agriculture 178, 105784. URL: https://www.sciencedirect.com/science/article/pii/S0168169920311224, doi:10.1016/j.compag.2020.105784.

Draelos, R.L., Carin, L., 2021. Use HiResCAM instead of Grad-CAM for faithful explanations of convolutional neural networks. URL: https://arxiv.org/abs/2011.08891, arXiv:2011.08891.

Eggleton, P., 2020. The state of the world's insects. Annual Review of Environment and Resources 45, 61–82. URL: https://www.annualreviews.org/content/journals/10.1146/annurev-environ-012420-050035, doi:10.1146/annurev-environ-012420-050035.

Fenoglio, M.S., Calviño, A., González, E., Salvo, A., Videla, M., 2021. Urbanisation drivers and underlying mechanisms of terrestrial insect diversity loss in cities. Ecological Entomology 46, 757–771. URL: https://resjournals.onlinelibrary.wiley.com/doi/abs/10.1111/een.13041, doi:10.1111/een.13041.

Forbes, A.A., Bagley, R.K., Beer, M.A., Hippee, A.C., Widmayer, H.A., 2018. Quantifying the unquantifiable: why Hymenoptera, not Coleoptera, is the most speciose animal order. BMC Ecology 18, 21. URL: https://doi.org/10.1186/s12898-018-0176-x, doi:10.1186/s12898-018-0176-x.

Gabriel, D., Tscharntke, T., 2007. Insect pollinated plants benefit from organic farming. Agriculture, Ecosystems & Environment 118, 43–48. URL: https://www.sciencedirect.com/science/article/pii/S0167880906001484, doi:10.1016/j.agee.2006.04.005.

Gao, Y., Xue, X., Qin, G., Li, K., Liu, J., Zhang, Y., Li, X., 2024.
Application of machine learning in automatic image identification of insects - a review. Ecological Informatics 80, 102539. URL: https://www.sciencedirect.com/science/article/pii/S1574954124000815, doi:10.1016/j.ecoinf.2024.102539.

Gauld, I., Bolton, B., 1988. The Hymenoptera. British Museum (Natural History).

Goodfellow, I., Bengio, Y., Courville, A., 2016. Deep Learning. Adaptive Computation and Machine Learning series, MIT Press.

Hanson, P.E., Gauld, I.D., 1995. The Hymenoptera of Costa Rica: The Natural History Museum, London. Oxford University Press. URL: https://doi.org/10.1093/oso/9780198549055.001.0001, doi:10.1093/oso/9780198549055.001.0001.

Herrera Pinheiro, J.M., do Nascimento Herrera, G., Bueno dos Reis Fernandes, L., Doria dos Santos, A., Vilela de Godoy, R., Andrade Botelho de Almeida, E., Carolina Onody, H., Andrade da Costa Vieira, M., Maria Penteado-Dias, A., Becker, M., 2026. Dataset of parasitoid wasps and associated Hymenoptera (DAPWH). URL: https://doi.org/10.5281/zenodo.18501018, doi:10.5281/zenodo.18501018.

Høye, T.T., Ärje, J., Bjerge, K., Hansen, O.L.P., Iosifidis, A., Leese, F., Mann, H.M.R., Meissner, K., Melvad, C., Raitoharju, J., 2021. Deep learning and computer vision will transform entomology. Proceedings of the National Academy of Sciences 118, e2002545117. URL: https://www.pnas.org/doi/abs/10.1073/pnas.2002545117, doi:10.1073/pnas.2002545117.

Indolia, S., Goswami, A.K., Mishra, S., Asopa, P., 2018. Conceptual understanding of convolutional neural network - a deep learning approach. Procedia Computer Science 132, 679–688. URL: https://www.sciencedirect.com/science/article/pii/S1877050918308019, doi:10.1016/j.procs.2018.05.069. International Conference on Computational Intelligence and Data Science.

Jankielsohn, A., 2018. The importance of insects in agricultural ecosystems.
Advances in Entomology 06, 62–73. doi:10.4236/ae.2018.62006.

Klopfstein, S., Santos, B.F., Shaw, M.R., Alvarado, M., Bennett, A.M., Dal Pos, D., Giannotta, M., Herrera Florez, A.F., Karlsson, D., Khalaim, A.I., et al., 2019. Darwin wasps: a new name heralds renewed efforts to unravel the evolutionary history of Ichneumonidae. Entomological Communications 1, ec01006. URL: https://www.entomologicalcommunications.org/index.php/entcom/article/view/ec01006, doi:10.37486/2675-1305.ec01006.

LeCun, Y., Bengio, Y., Hinton, G., 2015. Deep learning. Nature 521, 436–444. URL: https://doi.org/10.1038/nature14539, doi:10.1038/nature14539.

Li, W., Zheng, T., Yang, Z., Li, M., Sun, C., Yang, X., 2021. Classification and detection of insects from field images using deep learning for smart pest management: A systematic review. Ecological Informatics 66, 101460. URL: https://www.sciencedirect.com/science/article/pii/S157495412100251X, doi:10.1016/j.ecoinf.2021.101460.

Lin, T.Y., Maire, M., Belongie, S., Bourdev, L., Girshick, R., Hays, J., Perona, P., Ramanan, D., Zitnick, C.L., Dollár, P., 2015. Microsoft COCO: Common objects in context. arXiv:1405.0312.

Magni, P.A., Harvey, A.D., Guareschi, E.E., 2023. Insects associated with ancient human remains: How archaeoentomology can provide additional information in archaeological studies. Heritage 6, 435–465. URL: https://www.mdpi.com/2571-9408/6/1/23, doi:10.3390/heritage6010023.

Matthews, R., 1974. Biology of Braconidae. Annual Review of Entomology 19, 15–32. doi:10.1146/annurev.en.19.010174.000311.

May, R.M., 1986. Biological diversity: How many species are there? Nature 324, 514–515. URL: https://doi.org/10.1038/324514a0, doi:10.1038/324514a0.

Michener, C., 2007. The Bees of the World. Johns Hopkins University Press.

Mora, C., Tittensor, D.P., Adl, S., Simpson, A.G.B., Worm, B., 2011. How many species are there on Earth and in the ocean?
PLOS Biology 9, 1–8. URL: https://doi.org/10.1371/journal.pbio.1001127, doi:10.1371/journal.pbio.1001127.

O'Mahony, N., Campbell, S., Carvalho, A., Harapanahalli, S., Hernandez, G.V., Krpalkova, L., Riordan, D., Walsh, J., 2020. Deep learning vs. traditional computer vision, in: Arai, K., Kapoor, S. (Eds.), Advances in Computer Vision, Springer International Publishing, Cham, pp. 128–144.

Ong, X.R., Tan, B., Chang, C.H., Puniamoorthy, N., Slade, E.M., 2025. Identifying the knowledge and capacity gaps in Southeast Asian insect conservation. Ecology Letters 28, e70038. URL: https://onlinelibrary.wiley.com/doi/abs/10.1111/ele.70038, doi:10.1111/ele.70038.

Pardo, A., Borges, P.A., 2020. Worldwide importance of insect pollination in apple orchards: A review. Agriculture, Ecosystems & Environment 293, 106839. URL: https://www.sciencedirect.com/science/article/pii/S0167880920300244, doi:10.1016/j.agee.2020.106839.

Passias, A., Tsakalos, K.A., Rigogiannis, N., Voglitsis, D., Papanikolaou, N., Michalopoulou, M., Broufas, G., Sirakoulis, G.C., 2024. Insect pest trap development and DL-based pest detection: A comprehensive review. IEEE Transactions on AgriFood Electronics 2, 323–334. doi:10.1109/TAFE.2024.3436470.

Penteado-Dias, A.M., Fernandes, L.B.d.R., 2025. DCBU - Coleção taxonômica do Departamento de Ecologia e Biologia Evolutiva da UFSCar. URL: https://doi.org/10.15468/xzkz3y, doi:10.15468/xzkz3y.

Pinheiro, J.M.H., Becker, M., 2026. Deep learning-based computer vision techniques for automated identification of Ichneumonoidea and other Hymenoptera insects.
Master's thesis. Universidade de São Paulo. doi:10.11606/D.18.2026.tde-09022026-143242.

Pinheiro, J.M.H., Herrera, G.D.N., Fernandes, L.B.D.R., Santos, A.D.D., Godoy, R.V., Almeida, E.A.B., Onody, H.C., Vieira, M.A.D.C., Penteado-Dias, A.M., Becker, M., 2026. Descriptor: Dataset of parasitoid wasps and associated Hymenoptera (DAPWH). URL: https://arxiv.org/abs/2602.20028, arXiv:2602.20028.

Quicke, D., 2015. The Braconid and Ichneumonid Parasitoid Wasps: Biology, Systematics, Evolution and Ecology. Wiley.

Raschka, S., 2020. Model evaluation, model selection, and algorithm selection in machine learning. arXiv:1811.12808.

Resh, V., Cardé, R., 2009. Encyclopedia of Insects. Academic Press.

Sapkota, R., Cheppally, R.H., Sharda, A., Karkee, M., 2026. YOLO26: Key architectural enhancements and performance benchmarking for real-time object detection. arXiv:2509.25164.

Sarker, I.H., 2021. Deep learning: A comprehensive overview on techniques, taxonomy, applications and research directions. SN Computer Science 2, 420. URL: https://doi.org/10.1007/s42979-021-00815-1, doi:10.1007/s42979-021-00815-1.

Schowalter, T., Noriega, J., Tscharntke, T., 2018. Insect effects on ecosystem services - introduction. Basic and Applied Ecology 26, 1–7. URL: https://www.sciencedirect.com/science/article/pii/S1439179117302207, doi:10.1016/j.baae.2017.09.011.

Sharifani, K., Amini, M., 2023. Machine learning and deep learning: A review of methods and applications. World Information Technology and Engineering Journal 10, 3897–3904. URL: https://ssrn.com/abstract=4458723.

Shaw, M., Huddleston, T., 1991. Classification and Biology of Braconid Wasps (Hymenoptera: Braconidae). Handbooks for the Identification of British Insects, Royal Entomological Society of London.
Shinde, P.P., Shah, S., 2018. A review of machine learning and deep learning applications, in: 2018 Fourth International Conference on Computing Communication Control and Automation (ICCUBEA), pp. 1–6. doi:10.1109/ICCUBEA.2018.8697857.

Shirali, H., Hübner, J., Both, R., Raupach, M., Reischl, M., Schmidt, S., Pylatiuk, C., 2024. Image-based recognition of parasitoid wasps using advanced neural networks. Invertebrate Systematics 38. URL: https://doi.org/10.1071/IS24011, doi:10.1071/IS24011.

Slade, E.M., Ong, X.R., 2023. The future of tropical insect diversity: strategies to fill data and knowledge gaps. Current Opinion in Insect Science 58, 101063. URL: https://www.sciencedirect.com/science/article/pii/S2214574523000603, doi:10.1016/j.cois.2023.101063.

Stork, N.E., 2018. How many species of insects and other terrestrial arthropods are there on Earth? Annual Review of Entomology 63, 31–45. URL: https://www.annualreviews.org/content/journals/10.1146/annurev-ento-020117-043348, doi:10.1146/annurev-ento-020117-043348.

Teixeira, A.C., Ribeiro, J., Morais, R., Sousa, J.J., Cunha, A., 2023. A systematic review on automatic insect detection using deep learning. Agriculture 13. URL: https://www.mdpi.com/2077-0472/13/3/713, doi:10.3390/agriculture13030713.

Tian, Y., Ye, Q., Doermann, D., 2025. YOLOv12: Attention-centric real-time object detectors. arXiv preprint arXiv:2502.12524.

Wagner, D.L., Grames, E.M., Forister, M.L., Berenbaum, M.R., Stopak, D., 2021. Insect decline in the Anthropocene: Death by a thousand cuts. Proceedings of the National Academy of Sciences 118, e2023989118. URL: https://www.pnas.org/doi/abs/10.1073/pnas.2023989118, doi:10.1073/pnas.2023989118.

Weinstein, B.G., 2018. A computer vision for animal ecology. Journal of Animal Ecology 87, 533–545.
URL: https://besjournals.onlinelibrary.wiley.com/doi/abs/10.1111/1365-2656.12780, doi:10.1111/1365-2656.12780.

Wipfler, B., Pohl, H., Yavorskaya, M.I., Beutel, R.G., 2016. A review of methods for analysing insect structures - the role of morphology in the age of phylogenomics. Current Opinion in Insect Science 18, 60–68. URL: https://www.sciencedirect.com/science/article/pii/S2214574516301432, doi:10.1016/j.cois.2016.09.004.

Wu, X., Zhan, C., Lai, Y.K., Cheng, M.M., Yang, J., 2019. IP102: A large-scale benchmark dataset for insect pest recognition, in: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 8779–8788. doi:10.1109/CVPR.2019.00899.

Wührl, L., Pylatiuk, C., Giersch, M., Lapp, F., von Rintelen, T., Balke, M., Schmidt, S., Cerretti, P., Meier, R., 2022. DiversityScanner: Robotic handling of small invertebrates with machine learning methods. Molecular Ecology Resources 22, 1626–1638. URL: https://onlinelibrary.wiley.com/doi/abs/10.1111/1755-0998.13567, doi:10.1111/1755-0998.13567.

Wührl, L., Rettenberger, L., Meier, R., Hartop, E., Graf, J., Pylatiuk, C., 2024. Entomoscope: An open-source photomicroscope for biodiversity discovery. IEEE Access 12, 11785–11794. doi:10.1109/ACCESS.2024.3355272.

Yu, D., van Achterberg, C., Horstmann, K., 2012. World Ichneumonoidea 2011: Taxonomy, Biology, Morphology and Distribution. Taxapad.

Yu, D.S., van Achterberg, C., Horstmann, K., 2016. World Ichneumonoidea 2015: Taxonomy, biology, morphology and distribution. Taxapad 2016.

Zhao, X., Wang, L., Zhang, Y., Han, X., Deveci, M., Parmar, M., 2024. A review of convolutional neural networks in computer vision.
Artificial Intelligence Review 57, 99. URL: https://doi.org/10.1007/s10462-024-10721-6, doi:10.1007/s10462-024-10721-6.

Ärje, J., Raitoharju, J., Iosifidis, A., Tirronen, V., Meissner, K., Gabbouj, M., Kiranyaz, S., Kärkkäinen, S., 2020. Human experts vs. machines in taxa recognition. Signal Processing: Image Communication 87, 115917. URL: https://www.sciencedirect.com/science/article/pii/S0923596520301132, doi:10.1016/j.image.2020.115917.