Abstract
Traditional named entity recognition (NER) methods rely heavily on large amounts of hand-crafted features, domain knowledge, and the quality of word segmentation, and they fail to make full use of word-order information. To address these problems, a named entity recognition model based on a bidirectional gated recurrent unit (BGRU) network is proposed. The model exploits external data: a word-embedding dictionary is pre-trained on a large automatically segmented corpus, and the resulting latent word information is integrated into a character-based BGRU-CRF. This makes full use of latent words, extracts comprehensive contextual information, and more effectively avoids entity ambiguity. In addition, an attention mechanism is used to assign weights to specific information within the BGRU network, selecting the most relevant characters and words from a sentence; this effectively captures long-distance dependencies of particular words in the text, recognizes the category of the expressed information, and identifies named entities. The model explicitly exploits the sequence information between words and is not affected by word segmentation errors. Experimental results show that, compared with traditional sequence-labeling models and neural network models, the proposed model improves the overall F1 score of entity recognition by 3.08% on the MSRA dataset and by 0.16% on the OntoNotes dataset.
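The abstract describes an attention mechanism that assigns a weight to each character's BGRU hidden state so that the most relevant characters dominate the resulting representation. As a minimal, framework-free sketch of that weighting step (the hidden vectors and query below are toy stand-ins for real BGRU outputs and learned parameters, not the paper's actual model):

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, query):
    """Dot-product attention over per-character hidden states.

    Each hidden state is scored against the query vector, the scores are
    normalized with softmax into attention weights, and the weighted sum
    of hidden states is returned as the context vector.
    """
    scores = [sum(h_d * q_d for h_d, q_d in zip(h, query)) for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    context = [sum(w * h[d] for w, h in zip(weights, hidden_states))
               for d in range(dim)]
    return context, weights
```

In the full model these weights would be produced for every position of a character-based BGRU-CRF; the sketch only shows how attention concentrates mass on the hidden state most aligned with the query.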
Authors
SHI Chun-dan, QIN Lin (School of Computer Science and Technology, Nanjing Tech University, Nanjing 211816, China)
Source
Computer Science (《计算机科学》), CSCD-indexed, Peking University Core Journal (北大核心), 2019, Issue 9, pp. 237-242 (6 pages)
Keywords
Named entity recognition
Bidirectional gated recurrent unit
Attention mechanism