
The Comparison between Arbitrary Information Sources and Nonhomogeneous Markov Information Sources and the Small Deviations Theorems
Cited by: 22
Abstract: Let {X_n, n ≥ 0} be a sequence of measurable functions taking values in the alphabet S = {1, 2, …, N}, and let P, Q be two probability measures on the measurable space such that {X_n, n ≥ 0} is Markovian under Q. Let

h(P‖Q) = limsup_{n→∞} (1/n) log [ P(X_0, …, X_n) / Q(X_0, …, X_n) ]

be the sample divergence-rate distance of P relative to Q. Using this concept, a class of small deviations theorems for the averages of functions of two variables of arbitrary information sources is obtained; as a corollary, a small deviations theorem for the entropy densities of arbitrary information sources follows. Finally, the Shannon-McMillan theorem is extended to the case of nonhomogeneous Markov information sources.
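For orientation, the following is a minimal LaTeX sketch of the two quantities the abstract works with. The definition of h(P‖Q) restates the one given in the abstract; the entropy-density formula and the limiting statement in the final comment are standard-usage assumptions, not quotations from the paper.

% Minimal sketch; the entropy-density definition and the limiting statement
% below are standard-usage assumptions, not taken from the paper itself.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Sample divergence-rate distance of P relative to Q (as defined in the abstract):
\[
  h(P \,\|\, Q) \;=\; \limsup_{n \to \infty} \frac{1}{n}
    \log \frac{P(X_0, \ldots, X_n)}{Q(X_0, \ldots, X_n)} .
\]

% Entropy density of the source under P (assumed standard form):
\[
  f_n(\omega) \;=\; -\frac{1}{n} \log P(X_0, X_1, \ldots, X_n) .
\]

% Shannon--McMillan-type statement (sketch): for a stationary ergodic source
% with entropy rate H, f_n converges to H almost everywhere; the paper extends
% results of this type to nonhomogeneous Markov information sources.
\end{document}

Read informally (this gloss is an interpretation, not the paper's wording), h(P‖Q) measures how far the sample paths under P drift from the Markov measure Q, and the small deviations theorems control the averages in terms of this distance.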
Authors: Liu Wen, Yang Weiguo
Source: Acta Mathematica Sinica (Chinese Series), 1997, No. 1, pp. 22-36 (15 pages); indexed in SCIE, CSCD, and the Peking University Core Journal list
Fund: Natural Science Foundation of Hebei Province
Keywords: small deviations theorem, probability measure, information source, Markov information source, entropy, entropy density, sample divergence-rate distance, Shannon-McMillan theorem

