
Dynamic UAV data fusion and deep learning for improved maize phenological-stage tracking

Document type: Foreign-language journal article

Authors: Feng, Ziheng (1); Zhao, Jiliang (1); Suo, Liunan (1); Sun, Heguang (2); Long, Huiling (2); Yang, Hao (2); Song, Xiaoyu (2); Feng, Haikuan (2); Xu, Bo (2); Yang, Guijun (2); Zhao, Chunjiang (2)

Author affiliations: 1. Henan Agr Univ, Agron Coll, State Key Lab Wheat & Maize Crop Sci, Zhengzhou 450046, Henan, Peoples R China

2. Beijing Acad Agr & Forestry Sci, Informat Technol Res Ctr, Key Lab Quantitat Remote Sensing Agr, Minist Agr & Rural Affairs, Beijing 100097, Peoples R China

Keywords: Near real-time; Maize phenology; Deep learning; UAV; Multi-source data fusion

Journal: CROP JOURNAL (Impact factor: 5.6; 5-year impact factor: 6.0)

ISSN: 2095-5421

Year/Volume/Issue: 2025, Vol. 13, Issue 3

Pages:

Indexed in: SCI

Abstract: Near real-time maize phenology monitoring is crucial for field management, cropping system adjustment, and yield estimation. Most phenological monitoring methods are post-seasonal and rely heavily on high-frequency time-series data; they are not applicable to the unmanned aerial vehicle (UAV) platform because acquiring time-series UAV images is costly and UAV-based phenological monitoring methods are scarce. To address these challenges, we employed the Synthetic Minority Oversampling Technique (SMOTE) for sample augmentation to resolve the small-sample modelling problem, and used enhanced "separation" and "compactness" feature selection methods to identify input features from multiple data sources. In this process, we incorporated dynamic multi-source data fusion strategies involving vegetation indices (VI), color indices (CI), and texture features (TF). A two-stage neural network combining a convolutional neural network (CNN) and a long short-term memory (LSTM) network is proposed to identify maize phenological stages (sowing, seedling, jointing, trumpet, tasseling, maturity, and harvesting) on UAV platforms. The results indicate that the dataset generated by SMOTE closely resembles the measured dataset. Among the dynamic data fusion strategies, the VI-TF combination proved most effective, followed by the CI-TF and VI-CI combinations. Notably, as more data sources were integrated, the number of input features the model required declined significantly. In particular, the CNN-LSTM model based on the fusion of all three data sources was highly reliable across the three validation datasets. For Dataset 1 (Beijing Xiaotangshan, 2023: data from 12 UAV flight missions), the model achieved an overall accuracy (OA) of 86.53%, with precision (Pre), recall (Rec), F1 score (F1), false acceptance rate (FAR), and false rejection rate (FRR) of 0.89, 0.89, 0.87, 0.11, and 0.11, respectively. The model also generalized well to Dataset 2 (Beijing Xiaotangshan, 2023: data from 6 UAV flight missions) and Dataset 3 (Beijing Xiaotangshan, 2022: data from 4 UAV flight missions), with OAs of 89.4% and 85%, respectively. Meanwhile, the model requires few input features, using only 54.55% of all features (99). The findings of this study not only offer novel insights into near real-time crop phenology monitoring but also provide technical support for agricultural field management and cropping system adaptation. (c) 2025 Crop Science Society of China and Institute of Crop Science, CAAS. Production and hosting by Elsevier B.V. on behalf of KeAi Communications Co., Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
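As an illustrative aid only (not the authors' released code), the Python sketch below strings together the three components of the abstract that lend themselves to code: SMOTE-based sample augmentation, a compact two-stage CNN-LSTM classifier over the seven phenological stages, and the reported evaluation metrics (OA, Pre, Rec, F1, FAR, FRR). It assumes imbalanced-learn, scikit-learn, and TensorFlow/Keras; the layer sizes, the use of the 99 selected features as input dimensionality, the synthetic placeholder data, and the one-vs-rest FAR/FRR definitions are assumptions, not details taken from the paper.

```python
# Hedged sketch of the abstract's pipeline; all shapes and data are placeholders.
import numpy as np
from imblearn.over_sampling import SMOTE              # sample augmentation
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, confusion_matrix)
from tensorflow.keras import layers, models

N_FEATURES, N_STAGES = 99, 7                          # 99 selected features, 7 stages

# Placeholder fused-feature table (VI + CI + TF): one row per plot observation.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, N_FEATURES))
y = rng.integers(0, N_STAGES, size=300)

# 1) SMOTE: synthesize minority-stage samples to ease small-sample modelling.
X_res, y_res = SMOTE(random_state=0, k_neighbors=3).fit_resample(X, y)
X_tr, X_te, y_tr, y_te = train_test_split(
    X_res, y_res, test_size=0.3, random_state=0, stratify=y_res)

# 2) Assumed CNN-LSTM: treat the fused feature vector as a 1-D sequence; a
#    convolutional stage extracts local patterns and an LSTM stage aggregates
#    them before the softmax over the seven phenological stages.
model = models.Sequential([
    layers.Input(shape=(N_FEATURES, 1)),
    layers.Conv1D(32, 3, activation="relu"),
    layers.MaxPooling1D(2),
    layers.LSTM(64),
    layers.Dense(N_STAGES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_tr[..., None], y_tr, epochs=10, batch_size=16, verbose=0)

# 3) Metrics named in the abstract (macro-averaged over the 7 stages).
y_pred = model.predict(X_te[..., None], verbose=0).argmax(axis=1)
cm = confusion_matrix(y_te, y_pred)
tp = np.diag(cm)
fp = cm.sum(axis=0) - tp
fn = cm.sum(axis=1) - tp
tn = cm.sum() - (tp + fp + fn)
print("OA :", accuracy_score(y_te, y_pred))
print("Pre:", precision_score(y_te, y_pred, average="macro", zero_division=0))
print("Rec:", recall_score(y_te, y_pred, average="macro", zero_division=0))
print("F1 :", f1_score(y_te, y_pred, average="macro", zero_division=0))
print("FAR:", (fp / (fp + tn)).mean())  # false acceptance rate, one-vs-rest
print("FRR:", (fn / (fn + tp)).mean())  # false rejection rate, one-vs-rest
```

With real UAV-derived features in place of the synthetic table, the same script reproduces the structure of the reported evaluation; the actual feature selection ("separation" and "compactness") and the dynamic fusion of VI, CI, and TF would replace the placeholder feature matrix.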
