
Low-Cost Photogrammetry Rig for 3D Crop Modelling and Plant Phenomics
  • Joe Hrzich, University of Winnipeg
  • Christopher P. Bidinosti, University of Winnipeg
  • Michael A. Beck, University of Winnipeg
  • Christopher J. Henry, University of Manitoba
  • Kalhari Manawasinghe, University of Saskatchewan
  • Karen Tanino, University of Saskatchewan

Corresponding Author: hrzich-j@webmail.uwinnipeg.ca

Abstract

Photogrammetry is the science of obtaining reliable 3D measurements of a physical object from photographs, allowing the object's complex structure to be captured, studied, and analysed. A low-cost Structure from Motion (SfM) technique can be used to create 3D models from multiple 2D images taken from different viewpoints. A point cloud is a widely used 3D data format that can also be produced by depth sensors such as LiDAR and RGB-D cameras; however, the cost of such scanners can be prohibitive, putting 3D plant imaging out of reach for many researchers and practitioners in the agriculture industry. We are developing a low-cost, close-range photogrammetry rig that could be a beneficial tool for agronomists, plant scientists, and breeders. Our imaging system uses a Raspberry Pi to capture images with multiple cameras and a commercial rotary table to obtain images from different viewpoints. We discuss the development of methods for extracting quantitative trait indices in wheat in order to automatically characterize planophile versus erectophile canopy architectures. Moving forward, we plan to use our photogrammetry rig for a variety of applications, such as growth monitoring and the extraction of plant traits including number of leaves, stem height, leaf length, leaf width, leaf area, and canopy volume. We also plan to develop bespoke, plug-and-play systems that are tailored to the specific needs of a researcher and can be operated with minimal expertise.
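To illustrate the kind of trait extraction described above, the following is a minimal sketch (not the authors' pipeline) of deriving plant height and canopy volume from an SfM point cloud. It assumes the reconstruction has already been scaled to real-world units, that the ground plane lies near the minimum z value, and that the cloud is exported as a hypothetical plain-text XYZ file named wheat_plant.xyz.

```python
# Minimal sketch: two simple canopy traits from an SfM point cloud.
# Assumptions: points are in real-world units; z is the vertical axis.
import numpy as np
from scipy.spatial import ConvexHull

def canopy_traits(xyz: np.ndarray) -> dict:
    """Estimate plant height and convex-hull canopy volume from an Nx3 point cloud."""
    height = xyz[:, 2].max() - xyz[:, 2].min()  # vertical extent of the plant
    hull = ConvexHull(xyz)                      # convex hull of all 3D points
    return {"height": height, "canopy_volume": hull.volume}

# Example usage with a hypothetical reconstruction exported as x, y, z columns:
points = np.loadtxt("wheat_plant.xyz")
print(canopy_traits(points))
```

A convex hull is only a coarse proxy for canopy volume; traits such as leaf count, leaf length, or leaf width would require segmentation of the point cloud into individual organs.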
06 Oct 2023: Submitted to NAPPN 2024
14 Oct 2023: Published in NAPPN 2024