
WheatPheno

WheatPheno is a well-labeled, diverse, and large-volume dataset comprising 164,411 images of 160 wheat varieties at five phenophases under various environmental conditions, constructed and publicly released with this work.

Authors

Ruinan Zhang, Shichao Jin, Yuanhao Zhang, Jingrong Zang, Yu Wang, Qing Li, Zhuangzhuang Sun, Xiao Wang, Qin Zhou, Jian Cai, Shan Xu, Yanjun Su, Jin Wu, Dong Jiang

Abstract

The real-time monitoring of wheat phenology variations among different varieties and their adaptive responses to environmental conditions is essential for advancing breeding efforts and improving cultivation management. Many remote sensing efforts have been made to address the challenges of key phenophase detection. However, existing solutions are not accurate enough to discriminate adjacent phenophases with subtle organ changes, nor are they real-time; for example, vegetation index curve-based methods rely on data from the entire growth stage, which is only available after the experiment has finished. Furthermore, improving the efficiency, scalability, and availability of phenological studies remains a key challenge. This study proposes a two-stage deep learning framework called PhenoNet for the accurate, efficient, and real-time classification of key wheat phenophases. PhenoNet comprises a lightweight encoder module (PhenoViT) and a long short-term memory (LSTM) module. The performance of PhenoNet was assessed using a well-labeled, multi-variety, and large-volume dataset (WheatPheno). The results show that PhenoNet achieved an overall accuracy (OA) of 0.945, a kappa coefficient (Kappa) of 0.928, and an F1-score (F1) of 0.941. Additionally, the network parameters (Params), number of operations measured by multiply-adds (MAdds), and graphics processing unit memory required for classification (Memory) were 0.889 million (M), 0.093 Giga (G), and 8.0 Megabytes (MB), respectively. PhenoNet outperformed eleven state-of-the-art deep learning networks, achieving an average improvement of 3.7% in OA, 5.1% in Kappa, and 4.1% in F1, while reducing average Params, MAdds, and Memory by 78.4%, 85.0%, and 75.1%, respectively. Feature visualization and ablation analysis showed that PhenoNet's gains mainly come from its use of time-series information and lightweight modules. Furthermore, PhenoNet can be effectively transferred across years, achieving a high OA of 0.981 using a two-stage transfer learning strategy.
Furthermore, an extensible web platform (https://phenonet.org/) has been developed that integrates WheatPheno and PhenoNet, making the work in this study accessible, interoperable, and reusable.
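The abstract reports three classification metrics: overall accuracy (OA), Cohen's kappa (Kappa), and F1. For reference, these can be computed from predicted and true phenophase labels as follows. This is a minimal pure-Python sketch, not the authors' evaluation code; macro-averaged F1 is assumed here, and the paper's exact averaging scheme may differ.

```python
from collections import Counter

def evaluate(y_true, y_pred):
    """Compute OA, Cohen's kappa, and macro F1 for two label lists."""
    assert len(y_true) == len(y_pred) and y_true
    n = len(y_true)
    labels = sorted(set(y_true) | set(y_pred))

    # Overall accuracy (OA): fraction of correctly classified samples.
    oa = sum(t == p for t, p in zip(y_true, y_pred)) / n

    # Cohen's kappa: observed agreement corrected for chance agreement p_e,
    # where p_e comes from the marginal label frequencies.
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    p_e = sum(true_counts[c] * pred_counts[c] for c in labels) / (n * n)
    kappa = (oa - p_e) / (1 - p_e) if p_e < 1 else 1.0

    # Macro F1: unweighted mean of per-class F1 scores.
    f1_scores = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1_scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    f1 = sum(f1_scores) / len(f1_scores)

    return oa, kappa, f1
```

With the real dataset, `y_true` would hold the labeled phenophases of the test images and `y_pred` the classes PhenoNet assigns to them.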

Paper Link

https://doi.org/10.1016/j.isprsjprs.2024.01.006

  • Dataset link: First part (temporary)

    * The complete WheatPheno dataset is currently being uploaded. For now, a temporary partial version can be downloaded for testing purposes.