DoubleNet: A Method for Generating Navigation Lines of Unstructured Soil Roads in a Vineyard Based on CNN and Transformer

Document type: Foreign-language journal article

First author: Cui, Xuezhi

Authors: Cui, Xuezhi; Zhu, Licheng; Zhao, Bo; Wang, Ruixue; Han, Zhenhao; Lu, Kunlei; Feng, Xuguang; Ni, Jipeng; Cui, Xiaoyi

Author affiliations:

Keywords: orchard navigation; unstructured road; convolution; Fused-MHSA

Journal: AGRONOMY-BASEL (impact factor: 3.4; 5-year impact factor: 3.8)

ISSN:

Year/Volume/Issue: 2025, Vol. 15, Issue 3

Pages:

Indexed in: SCI

Abstract: Navigating unstructured vineyard roads where satellite signals are weak poses significant challenges for robotic systems. This research introduces DoubleNet, a deep-learning model designed to generate navigation lines under such conditions. To improve feature extraction from images, DoubleNet incorporates several key components: a custom multi-head self-attention mechanism (Fused-MHSA), a modified activation function (SA-GELU), and a specialized operation block (DNBLK). Building on these, DoubleNet is structured as an encoder-decoder network with two parallel subnetworks, one processing 2D feature maps and the other 1D tensors. The subnetworks interact through two feature fusion networks, operating in both the encoder and decoder stages, to yield a more integrated feature extraction process. Additionally, we used a specially annotated dataset of images fusing RGB and mask channels, each marked with five navigation points to improve point-localization accuracy. With these innovations, DoubleNet achieves a 95.75% percentage of correct keypoints (PCK) at 71.16 FPS on our dataset, a combined performance that outperforms several well-known keypoint detection algorithms. DoubleNet thus shows strong potential as a competitive solution for generating navigation routes for robots operating on unstructured vineyard roads.
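The abstract gives only a high-level view of the two parallel subnetworks; the paper's actual DNBLK, Fused-MHSA, and SA-GELU designs are not specified in this record. As a rough illustration of the general dual-branch idea, the PyTorch sketch below shows one encoder stage in which a 2D convolutional branch and a 1D token branch exchange information through a simple fusion step. Every module, name, and dimension here is a hypothetical stand-in, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DualBranchStage(nn.Module):
    """Illustrative encoder stage with a 2D (feature-map) branch and a
    1D (token) branch exchanging information, in the spirit of
    DoubleNet's parallel subnetworks. Generic stand-ins only."""

    def __init__(self, channels: int, dim: int, heads: int = 4):
        super().__init__()
        # 2D branch: plain convolutional block (stand-in for DNBLK)
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.GELU(),  # placeholder; the paper uses SA-GELU, not defined here
        )
        # 1D branch: standard multi-head attention (stand-in for Fused-MHSA)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        # Fusion: map 2D features into token space and a token summary back
        self.to_tokens = nn.Linear(channels, dim)
        self.to_maps = nn.Linear(dim, channels)

    def forward(self, fmap: torch.Tensor, tokens: torch.Tensor):
        # fmap: (B, C, H, W) 2D feature maps; tokens: (B, N, D) 1D tensors
        fmap = self.conv(fmap)
        # 2D -> 1D: flatten spatial positions into extra attention context
        extra = self.to_tokens(fmap.flatten(2).transpose(1, 2))  # (B, H*W, D)
        mixed = torch.cat([tokens, extra], dim=1)
        tokens = self.norm(tokens + self.attn(tokens, mixed, mixed)[0])
        # 1D -> 2D: broadcast the pooled token summary back onto the maps
        summary = self.to_maps(tokens.mean(dim=1))                # (B, C)
        fmap = fmap + summary[:, :, None, None]
        return fmap, tokens

# Smoke test with hypothetical sizes
stage = DualBranchStage(channels=64, dim=128)
f, t = stage(torch.randn(2, 64, 32, 32), torch.randn(2, 16, 128))
print(f.shape, t.shape)  # (2, 64, 32, 32) and (2, 16, 128)
```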
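The reported 95.75% refers to the standard percentage-of-correct-keypoints (PCK) metric: a predicted point counts as correct when its distance to the ground truth falls below a threshold. The record does not state the paper's normalization or threshold, so both are assumptions in the minimal sketch below.

```python
import numpy as np

def pck(pred, gt, ref_size, alpha=0.05):
    """Percentage of Correct Keypoints (PCK).

    pred, gt : arrays of shape (N, K, 2) -- predicted and ground-truth
               (x, y) positions for K keypoints in each of N images
               (here K = 5 navigation points per image).
    ref_size : reference length normalizing the error, e.g. an image
               dimension (assumed; the paper's normalization is not given).
    alpha    : fraction of ref_size below which a point counts as correct.
    """
    dists = np.linalg.norm(pred - gt, axis=-1)        # (N, K) pixel errors
    return float((dists <= alpha * ref_size).mean())  # fraction correct

# Example with random data: 5 points per image, threshold 5% of 640 px
pred = np.random.rand(8, 5, 2) * 640
gt = pred + np.random.randn(8, 5, 2) * 10
print(f"PCK@0.05: {pck(pred, gt, ref_size=640):.4f}")
```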

Classification code:
