Journal Articles
2 articles found
1. Research on Constructing a Microblog Sentiment Lexicon with a Good-Turing-Smoothed SO-PMI Algorithm (Cited: 5)
Authors: 姜伶伶, 何中市, 张航. 《现代计算机》 (Modern Computer), 2018, No. 7, pp. 15-20 (6 pages).
Abstract: The construction of a microblog sentiment lexicon is of significant research and practical value for microblog sentiment analysis. To address the low coverage of microblog sentiment words in existing public sentiment lexicons, this work takes HowNet and the Dalian University of Technology sentiment ontology as the base microblog sentiment lexicon and proposes an SO-PMI algorithm with Good-Turing smoothing to judge the sentiment polarity of the many network sentiment words that have emerged in recent years; these words are then merged with the base lexicon to build a microblog-domain sentiment lexicon. Finally, a rule-based method judges the sentiment polarity of microblog texts. Experimental results confirm the effectiveness and accuracy of the lexicon built this way for microblog sentiment classification.
Keywords: Chinese microblog, sentiment lexicon, sentiment classification, Good-Turing, SO-PMI, smoothing
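As a rough illustration of the technique the abstract describes, the Python sketch below (hypothetical helper names and toy data, not the paper's implementation) applies the Good-Turing re-estimate r* = (r + 1) * N_{r+1} / N_r to joint co-occurrence counts before computing SO-PMI, where a word's polarity score is the sum of its PMI with positive seed words minus the sum with negative seeds.

import math
from collections import Counter

def good_turing(counts):
    """Re-estimate each raw count r as r* = (r + 1) * N_{r+1} / N_r,
    where N_r is the number of events observed exactly r times."""
    freq_of_freq = Counter(counts.values())
    smoothed = {}
    for event, r in counts.items():
        n_r1 = freq_of_freq.get(r + 1, 0)
        # Fall back to the raw count when N_{r+1} = 0 (sparse high counts);
        # the paper's exact handling of this case is not reproduced here.
        smoothed[event] = (r + 1) * n_r1 / freq_of_freq[r] if n_r1 else r
    return smoothed

def so_pmi(word, pos_seeds, neg_seeds, cooc, occur, total_windows):
    """SO-PMI(w) = sum over positive seeds p of PMI(w, p)
                 - sum over negative seeds n of PMI(w, n).
    A positive score marks w as positive, a negative score as negative."""
    def pmi(w1, w2):
        joint = cooc.get((w1, w2), 0)
        if not joint or not occur.get(w1) or not occur.get(w2):
            return 0.0
        return math.log2(joint * total_windows / (occur[w1] * occur[w2]))
    return (sum(pmi(word, p) for p in pos_seeds)
            - sum(pmi(word, n) for n in neg_seeds))

# Toy counts; in practice these come from windowed co-occurrence statistics
# over a microblog corpus.
raw_cooc = {("好赞", "开心"): 3, ("好赞", "高兴"): 1,
            ("好赞", "难过"): 1, ("好评", "开心"): 2}
cooc = good_turing(raw_cooc)          # smooth joint counts before PMI
occur = {"好赞": 5, "好评": 4, "开心": 8, "高兴": 3, "难过": 6}
print(so_pmi("好赞", ["开心", "高兴"], ["难过"], cooc, occur, total_windows=100))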
2. An Empirical Study of Good-Turing Smoothing for Language Models on Different Size Corpora of Chinese (Cited: 5)
Authors: Feng-Long Huang, Ming-Shing Yu, Chien-Yo Hwang. Journal of Computer and Communications, 2013, No. 5, pp. 14-19 (6 pages).
Abstract: Data sparseness is an inherent issue of statistical language models, and smoothing methods are usually used to resolve zero-count problems. In this paper, we empirically study and analyze the well-known Good-Turing and advanced Good-Turing smoothing methods for language models on large Chinese corpora. Ten models are generated sequentially on corpora of various sizes, from 30 M to 300 M Chinese words of the CGW corpus. In our experiments, the Good-Turing and advanced Good-Turing smoothing methods are evaluated on inside testing and outside testing. Based on the experimental results, we further analyze the perplexity trends of the smoothing methods, which is useful for choosing effective smoothing methods to alleviate data sparseness at various language-model sizes. Finally, some helpful observations are described in detail.
Keywords: Good-Turing methods, smoothing, language models, perplexity
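The Python sketch below (toy data standing in for the CGW corpus, assumed details) shows the basic Good-Turing re-estimation this paper evaluates and the perplexity metric used to compare models; the paper's "advanced" Good-Turing variant and its inside/outside testing protocol are not reproduced.

import math
from collections import Counter

def gt_unigram(tokens):
    """Good-Turing unigram probabilities plus the mass reserved for
    unseen words (N_1 / N). A sketch: it skips renormalization and the
    smoothing of the N_r sequence a production implementation would need."""
    counts, n = Counter(tokens), len(tokens)
    freq_of_freq = Counter(counts.values())
    p_unseen = freq_of_freq.get(1, 0) / n
    probs = {}
    for w, r in counts.items():
        n_r1 = freq_of_freq.get(r + 1, 0)
        r_star = (r + 1) * n_r1 / freq_of_freq[r] if n_r1 else r
        probs[w] = r_star / n
    return probs, p_unseen

def perplexity(test_tokens, probs, p_unseen):
    """PP = 2 ** (-(1/N) * sum_i log2 p(w_i)); unseen words receive the
    reserved Good-Turing mass, floored to avoid log(0)."""
    log_sum = sum(math.log2(max(probs.get(w, p_unseen), 1e-12))
                  for w in test_tokens)
    return 2 ** (-log_sum / len(test_tokens))

# Toy corpora stand in for the 30 M-300 M word CGW training sets.
train = "the cat sat on the mat the dog sat".split()
test = "the cat ran on the mat".split()
probs, p0 = gt_unigram(train)
print(round(perplexity(test, probs, p0), 2))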