
Robust Phenotyping Robot with Tactile Feedback
  • Ziling Chen,
  • Xuan Li,
  • Colin Young,
  • Tianzhang Zhao,
  • Clayton Ness,
  • Nathanael Jahn Elkin,
  • Santiago Canova,
  • Maximus Timothy Shurr,
  • Raghava Sai Uppuluri,
  • Yu She,
  • Jian Jin
Corresponding author: Ziling Chen (archer.tavish.c@gmail.com)

Abstract

Modern agriculture increasingly relies on non-invasive techniques for plant health monitoring. Touch-based proximal hyperspectral imaging (HSI), where sensors directly clamp onto leaves, captures fine-grained plant detail but faces distinct challenges. Current methods lack real-time awareness of the pressure and alignment between the sensor and the leaf surface. This blind interaction risks damaging the leaf or disturbing internal cellular structures, either of which can alter the hyperspectral readings. In this research, we introduce the integration of a tactile sensor with a robotic system specialized for touch-based proximal HSI of corn. This fusion of HSI with the GelSight sensor provides high-resolution tactile feedback, crucial for gentle plant navigation and precise positioning. This tactile guidance ensures the robotic system applies consistent and safe pressure on the corn, reducing the risk of unintended damage and improving the consistency of the hyperspectral data. Experiments demonstrated the efficacy of this integration. Results highlighted a significant improvement in imaging data quality, scanning success rate, and consistency, with the tactile sensor reducing inadvertent plant disturbances compared to non-tactile methods. In conclusion, the seamless merger of HSI with tactile feedback represents a significant advancement in precision agriculture. This integrated approach promises enhanced plant health monitoring, fostering sustainable farming and increased yield potential.
30 Oct 2023: Submitted to NAPPN 2024
30 Oct 2023: Published in NAPPN 2024
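The tactile guidance described in the abstract amounts to a closed-loop contact controller: advance the sensor toward the leaf until the measured pressure enters a safe band, and back off if it overshoots. The sketch below illustrates that idea only; the pressure thresholds, step size, and the `read_pressure`/`move` interfaces are illustrative assumptions, not values or APIs from the paper or the GelSight SDK.

```python
# Hypothetical sketch of a tactile-guided contact loop (illustrative values).
SAFE_MIN = 0.05   # minimum pressure for reliable leaf-sensor coupling (a.u.)
SAFE_MAX = 0.30   # maximum pressure before risking leaf damage (a.u.)
STEP_MM = 0.2     # actuator displacement per control cycle

def regulate_contact(read_pressure, move, max_cycles=200):
    """Step the sensor until tactile pressure falls in the safe band.

    read_pressure: callable returning a scalar pressure estimate
                   (e.g. derived from tactile-image marker displacement).
    move: callable taking a signed displacement in mm.
    Returns True once a stable, safe contact is reached.
    """
    for _ in range(max_cycles):
        p = read_pressure()
        if p < SAFE_MIN:
            move(+STEP_MM)      # contact too light: close in on the leaf
        elif p > SAFE_MAX:
            move(-STEP_MM)      # contact too firm: back off to avoid damage
        else:
            return True         # pressure in safe band: begin the HSI scan
    return False

# Simulated leaf: pressure rises linearly once past a 2 mm standoff.
position = [0.0]
def fake_move(d): position[0] += d
def fake_pressure(): return max(0.0, (position[0] - 2.0) * 0.1)

ok = regulate_contact(fake_pressure, fake_move)
```

In this simulation the loop advances until the simulated pressure enters the safe band, mirroring how tactile feedback lets the robot apply consistent, non-damaging contact before each scan.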