
Maize crop row recognition algorithm based on improved UNet network

Document type: Foreign journal article

Authors: Diao, Zhihua (1); Guo, Peiliang (1); Zhang, Baohua (2); Zhang, Dongyan (3); Yan, Jiaonan (1); He, Zhendong (1); Zhao, Suna (1); Zhao, Chunjiang (4)

Author affiliations: 1.Zhengzhou Univ Light Ind, Sch Elect Informat Engn, Zhengzhou 450002, Peoples R China

2.Nanjing Agr Univ, Coll Artificial Intelligence, Nanjing 211800, Peoples R China

3.Anhui Univ, Natl Engn Res Ctr Agroecol Big Data Anal & Applica, Hefei 230601, Peoples R China

4.Beijing Acad Agr & Forestry Sci, Informat Technol Res Ctr, Beijing 100097, Peoples R China

Keywords: Maize crop row detection; Improved UNet network; Improved vertical projection method; Least squares method

Journal: COMPUTERS AND ELECTRONICS IN AGRICULTURE (Impact Factor: 8.3; 5-Year Impact Factor: 8.3)

ISSN: 0168-1699

Year/Volume: 2023, Vol. 210

Pages:

Indexed in: SCI

Abstract: Maize crop row centerlines are difficult to identify in complex farmland environments, such as heavy weeds, broken rows, and leaf adhesion, across different growth stages. To address this problem, this study proposes a centerline detection algorithm based on an improved UNet network. The UNet network, a traditional semantic segmentation network, was enhanced into the Atrous Spatial Pyramid Pooling UNet (ASPP-UNet) network to segment maize crop rows from the background; an improved vertical projection method was then used to extract the crop row feature points, and the least squares method was used to fit the centerlines. Experimental results show that the Mean Intersection over Union, Mean Pixel Accuracy, Mean Precision, and Mean Recall of the ASPP-UNet network were 83.23%, 90.18%, 91.79%, and 90.18%, respectively, which are 10.03%, 11.86%, 9.43%, and 11.24% higher than those of the Fully Convolutional Network (FCN), and 7.80%, 5.52%, 2.71%, and 5.52% higher than those of UNet. Furthermore, the improved vertical projection method combined with the least squares method reduced the average fitting time and angle error to 66 ms and 4.37 degrees, compared with 80 ms and 6.12 degrees for the traditional vertical projection method and 86 ms and 5.67 degrees for the left-and-right-edge centerline method. Likewise, the accuracy of the proposed method increased to 92.59%, compared with 87.21% and 90.16% for the two aforementioned methods, respectively. The proposed method therefore meets the accuracy and real-time requirements of agricultural robot vision navigation and functions effectively under varying environmental conditions.
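The abstract names the building blocks of ASPP-UNet but not its published configuration, so the sketch below is only a minimal, hedged illustration of how an Atrous Spatial Pyramid Pooling module can be inserted at the bottleneck of a UNet-style encoder-decoder; the dilation rates, channel widths, and two-class output are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative ASPP block for a UNet-style segmentation network (PyTorch).
# All hyper-parameters (dilation rates, channels) are assumptions for the sketch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ASPP(nn.Module):
    """Atrous Spatial Pyramid Pooling: parallel dilated 3x3 convolutions fused by a 1x1 convolution."""
    def __init__(self, in_ch, out_ch, rates=(1, 6, 12, 18)):
        super().__init__()
        # One branch per dilation rate; padding == dilation keeps the spatial size unchanged.
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r, bias=False)
            for r in rates
        ])
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)

    def forward(self, x):
        feats = [F.relu(branch(x)) for branch in self.branches]
        return F.relu(self.fuse(torch.cat(feats, dim=1)))

# In a UNet-like model, the ASPP block would typically replace or follow the
# encoder bottleneck, e.g. features = ASPP(512, 512)(encoder_output), before the
# decoder upsamples back to a 2-class (crop row vs. background) prediction map.
```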
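Likewise, the abstract does not detail the "improved" vertical projection step or the least-squares fit; the following sketch shows the classical strip-wise vertical projection followed by a least-squares line fit on the resulting feature points, with an assumed strip count and column threshold. It is a sketch of the general technique, not the paper's exact method.

```python
# Sketch: feature-point extraction from a binary crop-row mask via strip-wise
# vertical projection, then a least-squares centerline fit. Strip count and
# thresholds are illustrative assumptions.
import numpy as np

def row_feature_points(mask, n_strips=10, min_width=5):
    """Return (x, y) feature points, one per detected crop-row segment in each horizontal strip."""
    h, w = mask.shape
    strip_h = h // n_strips
    points = []
    for i in range(n_strips):
        strip = mask[i * strip_h:(i + 1) * strip_h]
        proj = strip.sum(axis=0)                     # vertical (column-wise) projection
        active = np.flatnonzero(proj > 0.5 * strip_h)  # columns dominated by crop pixels
        if active.size == 0:
            continue
        # Group contiguous active columns into candidate crop-row segments.
        segments = np.split(active, np.where(np.diff(active) > 1)[0] + 1)
        y_center = i * strip_h + strip_h // 2
        for seg in segments:
            if seg.size >= min_width:
                points.append((float(seg.mean()), float(y_center)))
    return points

def fit_centerline(points):
    """Least-squares fit of x = a*y + b through one crop row's feature points."""
    xs = np.array([p[0] for p in points])
    ys = np.array([p[1] for p in points])
    a, b = np.polyfit(ys, xs, 1)
    return a, b
```

In practice the feature points would first be grouped by crop row (for example by clustering on the x coordinate) before each group is passed to the least-squares fit.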
