A ROS-based Architecture for Object Detection and Relative Localization for a Mobile Robot with an Application to a Precision Farming Scenario

Lippi M.; Gasparri A.
2023-01-01

Abstract

Several factors may compromise the effectiveness of algorithms for the relative localization of specific objects in outdoor unstructured environments using robotic platforms, such as the complexity of the environment and changes in lighting conditions. Consequently, methods that rely solely on instantaneous detection may not be reliable in such application scenarios. In this work, we propose an architecture that utilizes an RGB-D camera mounted on a mobile robot and combines a state-of-the-art detection system with a purposely designed tracking algorithm. Specifically, we employ the latest You Only Look Once (YOLO) version to detect and segment the target in the image. By exploiting the depth map, we extract the relevant relative information of the robot with respect to the object, i.e., its relative position and orientation. Finally, we design an Extended Kalman Filter to track this relative information while taking into account the robot kinematic model. We implement this architecture in the ROS middleware and validate it within a precision agriculture setting for trap monitoring in a pest detection system.
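The abstract outlines a pipeline of detection, depth-based relative pose extraction, and Extended Kalman Filter tracking driven by the robot kinematic model. As a purely illustrative aid (not the authors' implementation), the minimal Python sketch below shows an EKF of the kind described: it tracks the planar relative position of a static target in the robot body frame under a unicycle kinematic model and fuses a range/bearing measurement such as could be derived from the segmented depth pixels. The 2-D state, variable names, and noise parameters are assumptions introduced for the example.

```python
# Illustrative sketch only (not the paper's code): EKF tracking the relative
# position of a static target in the robot body frame, assuming a unicycle
# kinematic model and a range/bearing measurement from the RGB-D detection.
import numpy as np

def rot(a):
    """2-D rotation matrix."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s], [s, c]])

class RelativeEKF:
    def __init__(self, p0, P0, Q, R):
        self.x = np.asarray(p0, dtype=float)   # [px, py]: target in body frame
        self.P = np.asarray(P0, dtype=float)   # state covariance
        self.Q = np.asarray(Q, dtype=float)    # process noise covariance
        self.R = np.asarray(R, dtype=float)    # measurement noise covariance

    def predict(self, v, omega, dt):
        # Robot moves with (v, omega) for dt; a static target expressed in the
        # new body frame is shifted back along the body x-axis by the traveled
        # distance and rotated by -omega*dt.
        F = rot(-omega * dt)
        self.x = F @ (self.x - np.array([v * dt, 0.0]))
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        # z = [range, bearing], e.g. extracted from the segmented depth pixels.
        px, py = self.x
        r = np.hypot(px, py)
        h = np.array([r, np.arctan2(py, px)])
        H = np.array([[px / r, py / r],
                      [-py / r**2, px / r**2]])
        y = z - h
        y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap bearing residual
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ H) @ self.P

if __name__ == "__main__":
    ekf = RelativeEKF(p0=[3.0, 1.0], P0=np.eye(2),
                      Q=0.01 * np.eye(2), R=np.diag([0.05, 0.02]))
    ekf.predict(v=0.5, omega=0.1, dt=0.1)               # odometry-driven step
    ekf.update(np.array([2.9, np.arctan2(1.0, 2.95)]))  # detection-driven step
    print(ekf.x)
```

In an architecture like the one described, the prediction step would plausibly be driven by the odometry published in ROS and the update step by each YOLO detection, with the relative orientation additionally included in the filter state.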
Year: 2023
ISBN: 979-8-3503-1543-1
Arlotta, A., Lippi, M., Gasparri, A. (2023). A ROS-based Architecture for Object Detection and Relative Localization for a Mobile Robot with an Application to a Precision Farming Scenario. In 2023 31st Mediterranean Conference on Control and Automation, MED 2023 (pp. 131-136). Institute of Electrical and Electronics Engineers Inc. [10.1109/MED59994.2023.10185904].
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11590/448068
Citations
  • PMC: not available
  • Scopus: 1
  • Web of Science (ISI): 1