Abstract:
Collecting and analyzing hyperspectral imagery (HSI) of plant roots over time can enhance our understanding of their function, responses to environmental factors, turnover, and relationship with the rhizosphere. Current belowground red-green-blue (RGB) root imaging studies infer such functions from physical properties like root length, volume, and surface area. HSI provides a more complete spectral perspective of plants by capturing a high-resolution spectral signature of plant parts, which has extended studies beyond physical properties to include physiological properties, chemical composition, and phytopathology. Understanding crop plants’ physical, physiological, and chemical properties enables researchers to identify high-yielding, drought-resilient genotypes that can withstand climate change and sustain future population needs. However, most HSI plant studies use cameras positioned above ground, and thus similar belowground advances are urgently needed. One reason for the scarcity of belowground HSI studies is that root features often have limited distinguishing reflectance intensities compared to the surrounding soil, potentially rendering conventional image analysis methods ineffective. In the field of machine learning (ML), there are currently no publicly available datasets that exhibit the heavy correlation, highly textured backgrounds, and thin features characteristic of belowground root systems. Here we present HyperPRI, a novel dataset containing RGB and HSI data for in situ, non-destructive, underground plant root analysis using ML tools. HyperPRI contains images of plant roots grown in rhizoboxes for two annual crop species: peanut (Arachis hypogaea) and sweet corn (Zea mays). Drought conditions are simulated once, and the boxes are imaged and weighed on select days across two months. Along with the images, we provide hand-labeled semantic masks and imaging environment metadata. Additionally, we present baselines for root segmentation on this dataset and compare methods that focus on spatial, spectral, and spatial-spectral features to predict the pixel-wise labels. Results demonstrate that combining HyperPRI’s hyperspectral and spatial information improves semantic segmentation of the target objects.
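To make the pixel-wise prediction task concrete, the minimal sketch below illustrates a spectral-only baseline: each pixel's reflectance spectrum is classified independently of its neighbors, the kind of approach the abstract contrasts with spatial and spatial-spectral models. The file paths, the (H, W, bands) .npy layout, and the choice of logistic regression are illustrative assumptions, not the dataset's actual storage format or the paper's baseline architectures.

# Minimal sketch of a spectral-only, per-pixel root/soil classifier.
# File names, array shapes, and the .npy layout are assumptions for
# illustration; consult the HyperPRI documentation for the actual format.
import numpy as np
from sklearn.linear_model import LogisticRegression

def load_sample(cube_path, mask_path):
    """Load one hypothetical sample: an HSI cube (H, W, B) and a binary root mask (H, W)."""
    cube = np.load(cube_path)    # reflectance cube
    mask = np.load(mask_path)    # 1 = root pixel, 0 = soil/background
    return cube.astype(np.float32), mask.astype(np.int64)

def fit_spectral_baseline(cube, mask):
    """Treat every pixel's spectrum as an independent feature vector (no spatial context)."""
    H, W, B = cube.shape
    X = cube.reshape(-1, B)      # (H*W, B) spectra
    y = mask.reshape(-1)         # (H*W,) labels
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y)
    return clf

def predict_mask(clf, cube):
    """Predict a binary root mask by thresholding per-pixel root probabilities."""
    H, W, B = cube.shape
    probs = clf.predict_proba(cube.reshape(-1, B))[:, 1]
    return (probs > 0.5).reshape(H, W)

if __name__ == "__main__":
    # Hypothetical paths; adjust to wherever the dataset is stored.
    cube, mask = load_sample("hyperpri/peanut_box01_day10_hsi.npy",
                             "hyperpri/peanut_box01_day10_mask.npy")
    clf = fit_spectral_baseline(cube, mask)
    pred = predict_mask(clf, cube)
    iou = np.logical_and(pred, mask).sum() / max(np.logical_or(pred, mask).sum(), 1)
    print(f"Training-set IoU of the spectral-only baseline: {iou:.3f}")

Spatial-spectral methods would additionally draw on neighborhood context (for example, convolutional features around each pixel), which the abstract reports improves segmentation of thin root features against the textured soil background.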
Links:
Citation:
S. J. Chang, R. Chowdhry, Y. Song, T. Mejia, A. Hampton, S. Kucharski, T. M. Sazzad, Y. Zhang, S. J. Koppal, C. H. Wilson, S. Gerber, B. Tillman, M. F. R. Resende, W. M. Hammond, and A. Zare, "HyperPRI: A Dataset of Hyperspectral Images for Underground Plant Root Study," bioRxiv, 2023, doi: https://doi.org/10.1101/2023.09.29.559614.
@article{Chang2023.09.29.559614,
  author = {Spencer J. Chang and Ritesh Chowdhry and Yangyang Song and Tomas Mejia and Anna Hampton and Shelby Kucharski and TM Sazzad and Yuxuan Zhang and Sanjeev J. Koppal and Chris H. Wilson and Stefan Gerber and Barry Tillman and Marcio F. R. Resende and William M. Hammond and Alina Zare},
  title = {HyperPRI: A Dataset of Hyperspectral Images for Underground Plant Root Study},
  elocation-id = {2023.09.29.559614},
  year = {2023},
  doi = {10.1101/2023.09.29.559614},
  publisher = {Cold Spring Harbor Laboratory},
  URL = {https://www.biorxiv.org/content/early/2023/09/30/2023.09.29.559614},
  eprint = {https://www.biorxiv.org/content/early/2023/09/30/2023.09.29.559614.full.pdf},
  journal = {bioRxiv}
}