Estimating corn plant nitrogen content from hyperspectral images acquired using UAV and LeafSpec in the field
  • Jose C. Tovar,
  • Tannor Mulford,
  • S. Vahid Mirnezami,
  • Dongdong Ma,
  • Sourav Bhadra,
  • Nima Hamidi Ghalehjegh,
  • Maede Zolanvari,
  • Jeffrey Berry,
  • Allen Mansholt,
  • Ashten Kimble,
  • Jay Phillips,
  • Josh Kinser,
  • Ann Keller,
  • Matthew Maimaitiyiming,
  • Xiaobo Zhou
Corresponding Author: josectovar@gmail.com

Abstract

Nitrogen fertilizer is one of the top expenses for corn farmers in North America and the single highest input cost over a growing season. Developing better strategies for nitrogen-use efficiency, such as improved varieties and biologics, depends on an efficient way to measure in planta nitrogen content. We are testing two approaches for estimating corn nitrogen from hyperspectral images: a reflectance-based UAV platform and a transmittance-based handheld imager called LeafSpec. The UAV approach is high throughput but has relatively low spatial resolution and is highly susceptible to environmental factors. LeafSpec is low throughput but offers high spatial resolution and is not significantly affected by environmental factors. In addition, because LeafSpec measures transmittance, it can capture nitrogen inside the corn leaves, whereas the UAV's reflectance measurements capture only surface effects. We are evaluating both well-established machine learning models and state-of-the-art deep learning models for the two approaches. This presentation will share what we have learned from testing UAV and LeafSpec hyperspectral imaging for estimating corn plant nitrogen, including potential use cases for each approach with respect to precision, ease of use, throughput, and potential for further development.
27 Oct 2023: Submitted to NAPPN 2024
30 Oct 2023: Published in NAPPN 2024