Journal Articles: 1 article found
Information Divergence and the Generalized Normal Distribution: A Study on Symmetricity
Authors: Thomas L. Toulias, Christos P. Kitsos. Communications in Mathematics and Statistics (SCIE), 2021, Issue 4, pp. 439-465 (27 pages).
Abstract: This paper investigates and discusses the use of information divergence, through the widely used Kullback–Leibler (KL) divergence, under the multivariate (generalized) γ-order normal distribution (γ-GND). The behavior of the KL divergence, as far as its symmetricity is concerned, is studied by calculating the divergence of the γ-GND over the Student's multivariate t-distribution and vice versa. Certain special cases are also given and discussed. Furthermore, three symmetrized forms of the KL divergence, i.e., the Jeffreys distance, the geometric-KL as well as the harmonic-KL distances, are computed between two members of the γ-GND family, while the corresponding differences between those information distances are also discussed.
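The abstract contrasts the asymmetric directed KL divergence with its symmetrized forms. As a minimal sketch of that idea, computed for univariate normal distributions rather than the paper's γ-GND family (whose closed-form expressions are derived in the article itself), the code below evaluates both directed KL divergences and combines them into Jeffreys-type, geometric-mean, and harmonic-mean symmetrizations; the exact definitions used in the paper should be taken from the article, so these combinations are labeled as assumptions.

```python
import math

def kl_normal(mu1, s1, mu2, s2):
    """Directed KL divergence D(p||q) between univariate normals
    p = N(mu1, s1^2) and q = N(mu2, s2^2), via the standard closed form."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def jeffreys(mu1, s1, mu2, s2):
    """Jeffreys distance: sum of the two directed KL divergences (symmetric)."""
    return kl_normal(mu1, s1, mu2, s2) + kl_normal(mu2, s2, mu1, s1)

def geometric_kl(mu1, s1, mu2, s2):
    """Assumed form: geometric mean of the two directed divergences."""
    return math.sqrt(kl_normal(mu1, s1, mu2, s2) * kl_normal(mu2, s2, mu1, s1))

def harmonic_kl(mu1, s1, mu2, s2):
    """Assumed form: harmonic mean of the two directed divergences
    (closely related to the resistor-average distance)."""
    d1 = kl_normal(mu1, s1, mu2, s2)
    d2 = kl_normal(mu2, s2, mu1, s1)
    return 2 * d1 * d2 / (d1 + d2)

# The directed divergence is asymmetric: the two directions disagree.
d_pq = kl_normal(0.0, 1.0, 1.0, 2.0)
d_qp = kl_normal(1.0, 2.0, 0.0, 1.0)
print(d_pq, d_qp)

# Each symmetrized form gives the same value with the arguments swapped.
print(jeffreys(0.0, 1.0, 1.0, 2.0), jeffreys(1.0, 2.0, 0.0, 1.0))
```

All three symmetrizations vanish exactly when the two distributions coincide, since both directed divergences are then zero.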
Keywords: Kullback–Leibler divergence; Jeffreys distance; resistor-average distance; multivariate γ-order normal distribution; multivariate Student's t-distribution; multivariate Laplace distribution