
A BERT-Based Hybrid Short Text Classification Model Incorporating CNN and Attention-Based BiGRU

Document Type: Foreign-language Journal Article

Authors: Bao, Tong 1; Ren, Ni 1; Luo, Rui 1; Wang, Baojia 1; Shen, Gengyu 1; Guo, Ting 1

Author Affiliations: 1.Jiangsu Acad Agr Sci, Informat Ctr, Nanjing, Peoples R China

2.Jiangsu Univ, Inst Sci & Technol Informat, Zhenjiang, Peoples R China

Keywords: Deep Learning; Fusion Framework; Natural Language Processing; Short Text Classification

Journal: JOURNAL OF ORGANIZATIONAL AND END USER COMPUTING (Impact Factor: 7.4; 5-Year Impact Factor: 6.541)

ISSN: 1546-2234

Year/Volume/Issue: 2021, Vol. 33, No. 6

Pages:

Indexed In: SCI

Abstract: Short text classification is a research focus in natural language processing (NLP) and is widely used in news classification, sentiment analysis, mail filtering, and other fields. In recent years, deep learning techniques have been applied to text classification and have made notable progress. Unlike ordinary text, short text suffers from a limited vocabulary and feature sparsity, which places higher demands on semantic feature representation. To address this issue, this paper proposes a feature fusion framework based on bidirectional encoder representations from transformers (BERT). In this hybrid method, BERT is used to produce word vector representations. A convolutional neural network (CNN) captures static local features, and, as a complement, a bidirectional gated recurrent unit (BiGRU) network captures contextual features. Furthermore, an attention mechanism is introduced to assign higher weights to salient words. The experimental results confirm that the proposed model significantly outperforms state-of-the-art baseline methods.
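The fusion described in the abstract (contextual embeddings feeding a parallel CNN branch and an attention-weighted BiGRU branch, concatenated for classification) can be sketched in PyTorch as below. This is a minimal illustrative sketch, not the authors' implementation: a plain `nn.Embedding` stands in for BERT's contextual word vectors, and all layer sizes, kernel widths, and the additive attention form are assumptions rather than the paper's hyperparameters.

```python
import torch
import torch.nn as nn

class HybridShortTextClassifier(nn.Module):
    """Sketch of a BERT-CNN-BiGRU-attention fusion classifier.

    A simple Embedding layer stands in for BERT output; the CNN branch
    captures static local (n-gram) features, the BiGRU branch captures
    contextual features, attention re-weights salient positions, and
    the two branch vectors are concatenated for the final classifier.
    Hyperparameters here are illustrative assumptions.
    """

    def __init__(self, vocab_size=1000, embed_dim=128, num_classes=4,
                 num_filters=64, kernel_size=3, gru_hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # stand-in for BERT
        # CNN branch: local features via 1-D convolution over time.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=1)
        # BiGRU branch: bidirectional contextual features.
        self.bigru = nn.GRU(embed_dim, gru_hidden, batch_first=True,
                            bidirectional=True)
        # Additive attention scores over BiGRU time steps.
        self.attn = nn.Linear(2 * gru_hidden, 1)
        self.fc = nn.Linear(num_filters + 2 * gru_hidden, num_classes)

    def forward(self, token_ids):                       # (B, T)
        x = self.embed(token_ids)                       # (B, T, E)
        # Conv1d expects (B, E, T); global max-pool over time.
        c = torch.relu(self.conv(x.transpose(1, 2)))    # (B, F, T)
        c = c.max(dim=2).values                         # (B, F)
        h, _ = self.bigru(x)                            # (B, T, 2H)
        w = torch.softmax(self.attn(h), dim=1)          # (B, T, 1)
        g = (w * h).sum(dim=1)                          # attention-pooled (B, 2H)
        return self.fc(torch.cat([c, g], dim=1))        # (B, num_classes)

model = HybridShortTextClassifier()
logits = model(torch.randint(0, 1000, (2, 20)))  # batch of 2, length 20
print(logits.shape)  # torch.Size([2, 4])
```

In a faithful reproduction, `self.embed` would be replaced by a pretrained BERT encoder (e.g. via the `transformers` library) whose last-layer hidden states feed both branches.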
