
Enabling early identification of nutritional deficiencies in hazelnut orchards through a data-driven robotic framework

Fuoti F.; Lippi M.; Miele A.; Bonucci N.; Gasparri A.
2026-01-01

Abstract

Identifying nutritional deficiencies at an early stage is crucial for maximizing yield and ensuring healthy plants. Conventional methods generally rely on time-consuming analyses conducted by agronomic experts. To address this challenge, this study presents a data-driven approach for the early identification of nutritional deficiencies in hazelnut orchards. Several custom datasets, composed of images acquired both in a real hazelnut orchard and in a controlled laboratory environment, are collected, and the performance of five state-of-the-art machine learning models at the early detection of nutritional deficiencies is compared. In particular, ResNet, DenseNet, MobileNet, EfficientNet, and ConvNext models, along with a baseline based on support vector machines, are considered. Data augmentation techniques are introduced to synthetically enlarge the datasets, and their effectiveness is extensively evaluated. Additionally, a pipeline is designed to carry out the early identification of nutritional deficiencies onboard an agricultural robot. Experimental results show that ConvNext achieves the highest performance: 81.79% accuracy and a 0.8168 F1 score on a real-world dataset with four classes, and 75.54% accuracy with a 0.7552 F1 score in the more challenging six-class scenario. Furthermore, the effectiveness of the integrated system is validated in preliminary laboratory experiments using a Turtlebot2 mobile base and a Franka Research 3 arm, both equipped with RGB-D cameras.
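As a side note on the reported figures, accuracy and a macro-averaged F1 score can be computed per class from true and predicted labels. A minimal plain-Python sketch follows; the class labels are placeholders and the exact averaging convention used in the paper is not specified here:

```python
def accuracy_macro_f1(y_true, y_pred, classes):
    """Accuracy and macro-averaged F1 over a fixed class set."""
    f1_per_class = []
    for c in classes:
        # One-vs-rest counts for class c.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1_per_class.append(2 * precision * recall / (precision + recall)
                            if precision + recall else 0.0)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return accuracy, sum(f1_per_class) / len(f1_per_class)

# Illustrative four-class example (the labels are hypothetical, not the
# actual deficiency classes of the paper's dataset).
acc, f1 = accuracy_macro_f1([0, 0, 1, 1, 2, 2, 3, 3],
                            [0, 1, 1, 1, 2, 2, 3, 0],
                            classes=[0, 1, 2, 3])
print(acc, f1)  # → 0.75, ≈ 0.742
```

Macro averaging weights every class equally, which is the usual choice when deficiency classes are imbalanced; a micro-averaged F1 would instead coincide with accuracy in the single-label multiclass setting.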
2026
Fuoti, F., Lippi, M., Rabbai, A., Miele, A., Bonucci, N., Cristofori, V., et al. (2026). Enabling early identification of nutritional deficiencies in hazelnut orchards through a data-driven robotic framework. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 246 [10.1016/j.compag.2026.111560].
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11590/536896
Citations
  • Scopus: 0