GACDNet: Mapping winter wheat by generative adversarial cross-domain networks with transformer integration for zero-sample extraction

Document type: Foreign journal article

First author: Wang, Chunyang

Authors: Wang, Chunyang; Gu, Yanan; Xu, Zhaozhao; Wang, Chunyang; Li, Kai; Zhao, Zongze; Yang, Wei; Wang, Xinbing; Wang, Jian

Author affiliations:

Keywords: Domain generalization; Contrastive learning; Cross-domain; Image classification; Winter wheat

Journal: COMPUTERS AND ELECTRONICS IN AGRICULTURE (Impact factor: 8.3; 5-year impact factor: 8.3)

ISSN: 0168-1699

Year / Volume: 2024, Vol. 221

Pages:

Indexed in: SCI

Abstract: Accurate extraction of the winter wheat area is essential for wheat yield estimation. Remotely sensed images have limited coverage and are acquired at different times, from different angles, and over different geographical areas, so the spectral signature of the same land-cover feature varies from image to image. Machine learning methods are widely used to extract winter wheat planting areas, but because spectral distributions differ between images (i.e., between source-domain and target-domain data), the results obtained by applying these methods directly to other areas are unsatisfactory. To achieve cross-regional extraction of the winter wheat area, we propose generative adversarial cross-domain networks for image classification within the same image type. The proposed cross-domain network comprises a generative network and a feature extractor. The generative network creates diverse yet plausible samples from the initial input data, constrained by a contrastive loss. The feature extractor converts both the initial input data and the generated data into a high-level representation. Two time-series datasets were constructed, and a series of experiments on these datasets yielded high-precision, optimal classification results. Training was performed using 70%, 50%, 30%, and 10% of the samples on both datasets. The results demonstrate the high performance and cross-domain capability of our network. Furthermore, the winter wheat cultivation area of the entire city of Zhoukou was extracted with an accuracy of 94.23%. Extensive experiments and applications on these datasets demonstrate that our method is feasible and reliable.
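The abstract states that the generated samples are constrained by a contrastive loss pairing each original input with its generated counterpart, but this record does not give the exact formulation. As a hedged illustration only, a generic InfoNCE-style contrastive loss (a common choice; the paper's actual loss may differ, and the function name and temperature parameter here are assumptions) could be sketched as:

```python
import numpy as np

def info_nce_loss(z_orig, z_gen, temperature=0.5):
    """Generic InfoNCE-style contrastive loss between embeddings of
    original samples (z_orig) and generated samples (z_gen), both (N, D).
    Row i of z_orig is the positive pair of row i of z_gen; all other
    rows in the batch serve as negatives. This is an illustrative
    stand-in, not the loss defined in the paper."""
    # L2-normalize so the dot product becomes cosine similarity
    z_orig = z_orig / np.linalg.norm(z_orig, axis=1, keepdims=True)
    z_gen = z_gen / np.linalg.norm(z_gen, axis=1, keepdims=True)
    # Pairwise similarity matrix, scaled by temperature: shape (N, N)
    sim = z_orig @ z_gen.T / temperature
    # Log-softmax over each row, with the diagonal as the target class
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Minimizing such a loss pulls each generated sample's embedding toward its source sample while pushing it away from the other samples in the batch, which matches the abstract's description of producing diverse yet plausible samples.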

Classification number:
