Abstract: This paper investigates and discusses the use of information divergence, through the widely used Kullback–Leibler (KL) divergence, under the multivariate (generalized) γ-order normal distribution (γ-GND). The behavior of the KL divergence, as far as its symmetry is concerned, is studied by calculating the divergence of the γ-GND over the multivariate Student's t-distribution and vice versa. Certain special cases are also given and discussed. Furthermore, three symmetrized forms of the KL divergence, i.e., the Jeffreys distance, the geometric-KL distance, and the harmonic-KL distance, are computed between two members of the γ-GND family, while the corresponding differences between these information distances are also discussed.
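As a rough numerical illustration of the quantities named above (not the paper's closed-form results), the sketch below uses Monte Carlo sampling to estimate the two directed KL divergences between two univariate generalized normal (exponential-power) densities, standing in for one-dimensional members of the γ-GND family, and then forms the Jeffreys, geometric-KL, and harmonic-KL distances as the sum, geometric mean, and harmonic mean of the two directed divergences. The shape values 1.5 and 3.0 and these particular symmetrization conventions are illustrative assumptions and may differ from the paper's exact definitions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Two univariate generalized normal (exponential-power) densities as
# one-dimensional stand-ins for gamma-GND members; shape values are assumed.
p = stats.gennorm(beta=1.5)   # "P"
q = stats.gennorm(beta=3.0)   # "Q"

def kl_mc(a, b, n=200_000):
    """Monte Carlo estimate of KL(a || b) = E_a[log a(X) - log b(X)]."""
    x = a.rvs(size=n, random_state=rng)
    return float(np.mean(a.logpdf(x) - b.logpdf(x)))

kl_pq = kl_mc(p, q)   # KL(P || Q)
kl_qp = kl_mc(q, p)   # KL(Q || P); generally differs, showing the asymmetry

# Symmetrized forms built from the two directed divergences (assumed conventions):
jeffreys = kl_pq + kl_qp                            # Jeffreys distance (sum)
geom_kl  = np.sqrt(kl_pq * kl_qp)                   # geometric-KL (geometric mean)
harm_kl  = 2.0 * kl_pq * kl_qp / (kl_pq + kl_qp)    # harmonic-KL (harmonic mean)

print(f"KL(P||Q) = {kl_pq:.4f}, KL(Q||P) = {kl_qp:.4f}")
print(f"Jeffreys = {jeffreys:.4f}, geometric-KL = {geom_kl:.4f}, harmonic-KL = {harm_kl:.4f}")
```

The printed directed divergences are generally unequal, which is the asymmetry the abstract refers to; the three symmetrized quantities coincide only when the two directed divergences are equal.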