
Image Caption Algorithm Based on ViLBERT and BiLSTM

Cited by: 1
Abstract: Traditional image captioning algorithms suffer from under-utilization of extracted image features, a lack of context-information learning, and an excessive number of training parameters. This study proposes an image captioning algorithm that combines Vision-and-Language BERT (ViLBERT) with a Bidirectional Long Short-Term Memory network (BiLSTM). The ViLBERT model serves as the encoder: it fuses image features and descriptive text through a co-attention mechanism and outputs a joint image-text feature vector. The decoder is a BiLSTM combined with an attention mechanism, which generates the image caption. The algorithm is trained and tested on the MSCOCO 2014 dataset, reaching 36.9 on BLEU-4 and 125.2 on CIDEr, outperforming image captioning algorithms based on traditional image feature extraction combined with attention. A comparison of the generated captions shows that the proposed algorithm describes image content in finer detail.
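The decoder side of the architecture can be illustrated with a minimal sketch of the attention step the abstract describes: at each decoding step, the BiLSTM state attends over the encoder's joint image-text features to form a context vector that conditions the next word. The sketch below uses plain dot-product attention over toy vectors; the function and variable names are illustrative assumptions, not the paper's implementation.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention_context(features, hidden):
    """Dot-product attention: weight each joint feature vector by its
    similarity to the decoder's hidden state, then return the weighted
    sum (the context vector) and the attention weights."""
    scores = [dot(f, hidden) for f in features]
    weights = softmax(scores)
    d = len(features[0])
    context = [sum(w * f[i] for w, f in zip(weights, features))
               for i in range(d)]
    return context, weights

# Toy example: three 2-dimensional "joint features" and one decoder state.
features = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
hidden = [1.0, 0.0]
context, weights = attention_context(features, hidden)
```

In the full model the features would come from ViLBERT's co-attention layers and `hidden` from the BiLSTM decoder state, with learned projection matrices in the score function rather than a raw dot product.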
Authors: XU Hao; ZHANG Kai; TIAN Ying-Jie; CHONG Fa-Guang; WANG Zi-Chao (College of Computer Science and Technology, Shanghai University of Electric Power, Shanghai 200090, China; Shanghai Electrical Research Institute, State Grid Corporation of China, Shanghai 200437, China)
Source: Computer Systems & Applications (《计算机系统应用》), 2021, Issue 11, pp. 195-202 (8 pages)
Funding: National Natural Science Foundation of China (61872230, 61802248, 61802249); Shanghai Young University Teacher Training Funding Program (ZZsdl18006)
Keywords: image caption; Vision-and-Language BERT (ViLBERT); Bidirectional Long Short-Term Memory (BiLSTM); attention mechanism