Abstract
Let {X_n, n ≥ 0} be a sequence of measurable functions taking values in S = {1, 2, …, N}, and let P and Q be two probability measures on the measurable space, where Q is a Markov measure with respect to {X_n, n ≥ 0}. This paper introduces the notion of the sample divergence-rate distance of P relative to Q and, using this notion, obtains a class of small deviations theorems for the averages of functions of two variables for arbitrary information sources; as a corollary, a small deviations theorem for the entropy densities of arbitrary information sources is obtained. Finally, the Shannon-McMillan theorem is extended to the case of nonhomogeneous Markov information sources.
Let {X_n, n ≥ 0} be a sequence of measurable functions taking their values in the alphabet S = {1, 2, …, N}. Let P, Q be two probability measures on the measurable space, such that {X_n, n ≥ 0} is Markovian under Q. Let h(P | Q) = limsup_{n→∞} (1/n) log[P(X_0, …, X_n)/Q(X_0, …, X_n)] be the sample divergence-rate distance of P relative to Q. In this paper, a class of small deviations theorems for the averages of functions of two variables of arbitrary information sources is derived using the concept h(P | Q), and, as a corollary, a small deviations theorem for the entropy densities of arbitrary information sources is obtained. Finally, an extension of the Shannon-McMillan theorem to the case of nonhomogeneous Markov information sources is given.
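The sample divergence-rate distance h(P | Q) defined above can be approximated numerically for a concrete pair of measures. The sketch below is purely illustrative and is not from the paper: it assumes a hypothetical two-state example in which P and Q are both first-order Markov measures given by explicit initial laws and transition matrices, simulates a long path X_0, …, X_n under P, and evaluates (1/n) log[P(X_0, …, X_n)/Q(X_0, …, X_n)]. For a path drawn from P, this sample quantity settles near the relative entropy rate of P with respect to Q.

```python
import math
import random

# Hypothetical two-state Markov measures (states 0 and 1), chosen only
# to illustrate the definition; these matrices are not from the paper.
P_init = [0.5, 0.5]
P_trans = [[0.9, 0.1], [0.2, 0.8]]
Q_init = [0.5, 0.5]
Q_trans = [[0.5, 0.5], [0.5, 0.5]]  # under Q the symbols are i.i.d. fair coin flips

def sample_path(init, trans, n, rng):
    """Draw X_0, ..., X_n from the Markov measure (init, trans)."""
    x = [0 if rng.random() < init[0] else 1]
    for _ in range(n):
        x.append(0 if rng.random() < trans[x[-1]][0] else 1)
    return x

def log_likelihood(path, init, trans):
    """log of the measure of the cylinder set [X_0, ..., X_n]."""
    ll = math.log(init[path[0]])
    for a, b in zip(path, path[1:]):
        ll += math.log(trans[a][b])
    return ll

rng = random.Random(0)
n = 200_000
path = sample_path(P_init, P_trans, n, rng)

# Sample divergence rate: (1/n) log [P(X_0,...,X_n) / Q(X_0,...,X_n)]
h = (log_likelihood(path, P_init, P_trans)
     - log_likelihood(path, Q_init, Q_trans)) / n
print(round(h, 3))
```

With these particular matrices the limit is the relative entropy rate of P with respect to Q, roughly 0.31 nats per symbol, so the printed sample value should be close to that for large n.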
Source
《数学学报(中文版)》
SCIE
CSCD
PKU Core Journals
1997, No. 1, pp. 22-36 (15 pages)
Acta Mathematica Sinica:Chinese Series
Funding
Natural Science Foundation of Hebei Province
Keywords
Small-deviations theorem
Entropy
Probability measure
Information source
Markov information source
Small-deviations theorem, Entropy, Entropy density, Sample divergence-rate distance, Shannon-McMillan theorem