Wednesday, 9 June 2021

Joint-Conditional Entropy and Mutual Information Estimation Involving Three Random Variables and Asymptotic Normality | Chapter 2 | Theory and Practice of Mathematics and Computer Science Vol. 11

This paper describes a method for estimating the joint probability mass function of a triplet of discrete random variables. This estimator is then used to compute joint conditional entropies and mutual information involving the three random variables. From there, almost sure convergence rates and asymptotic normality are established. Simulations are used to verify the theoretical results.
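The plug-in approach described above can be illustrated with a short sketch: estimate the joint pmf of a triplet by empirical frequencies, then compute entropies from it. This is only a minimal illustration of the general plug-in idea, not the authors' estimator; the function names and the choice of natural logarithms are assumptions for the example.

```python
from collections import Counter
import math

def joint_pmf(samples):
    # Empirical (plug-in) joint pmf of observed (x, y, z) triplets.
    n = len(samples)
    counts = Counter(samples)
    return {t: c / n for t, c in counts.items()}

def entropy(pmf):
    # Shannon entropy in nats: -sum p * log(p).
    return -sum(p * math.log(p) for p in pmf.values() if p > 0)

def marginal(pmf, axes):
    # Marginal pmf over the coordinates listed in `axes`.
    out = {}
    for t, p in pmf.items():
        key = tuple(t[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

def conditional_entropy(pmf):
    # H(X | Y, Z) = H(X, Y, Z) - H(Y, Z).
    return entropy(pmf) - entropy(marginal(pmf, (1, 2)))

def mutual_information(pmf):
    # I(X; (Y, Z)) = H(X) + H(Y, Z) - H(X, Y, Z).
    return (entropy(marginal(pmf, (0,)))
            + entropy(marginal(pmf, (1, 2)))
            - entropy(pmf))
```

For instance, a sample containing each triplet in {0, 1}^3 exactly once yields a uniform empirical pmf, so the estimated mutual information is 0 and H(X | Y, Z) equals log 2.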

Author(s) Details

Amadou Diadie Ba
LERSTAD, Gaston Berger University, Saint-Louis, Sénégal.

Gane Samb Lo
LERSTAD, Gaston Berger University, Saint-Louis, Sénégal and LSTA, Pierre et Marie Curie University, Paris VI, France and AUST-African University of Sciences and Technology, Abuja, Nigeria and Elected Member of the International Statistical Institute (ISI), Netherlands.

View Book: https://stm.bookpi.org/TPMCS-V11/article/view/1307
