We introduce a new formalism for computing expectations of functionals of arbitrary random vectors, by using generalised integration by parts formulae. In doing so we extend recent representation formulae for the score function introduced in \cite{nourdin2013entropy} and also provide a new proof of a central identity first discovered in \cite{guo2005mutual}. We derive a representation for the standardised Fisher information of sums of i.i.d. random vectors, which we use to provide rates of convergence in information-theoretic central limit theorems (both in Fisher information distance and in relative entropy) and a Stein bound for Fisher information distance.
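For orientation, the following display recalls, in generic notation of our own choosing (background material, not the paper's own statements), the standard objects the abstract refers to: the Gaussian integration by parts (Stein) identity that the generalised formulae extend, the score and standardised Fisher information, and the mutual information/MMSE identity of \cite{guo2005mutual}.

% Gaussian integration by parts (Stein's identity): for $N \sim \mathcal{N}(0, \mathrm{Id}_d)$
% and sufficiently smooth $f : \mathbb{R}^d \to \mathbb{R}$,
\[
\mathbb{E}\left[ N f(N) \right] = \mathbb{E}\left[ \nabla f(N) \right].
\]
% Score and standardised (relative) Fisher information of a centred random
% vector $X$ with identity covariance and differentiable density $p$:
\[
\rho_X(x) = \nabla \log p(x), \qquad
J_{\mathrm{st}}(X) = \mathbb{E}\left[ \left\| \rho_X(X) + X \right\|^2 \right],
\]
% so that $J_{\mathrm{st}}(X) \ge 0$, with equality if and only if $X$ is
% standard Gaussian.
% The central identity of Guo, Shamai and Verdu (2005), for the Gaussian
% channel $Y = \sqrt{\mathrm{snr}}\, X + N$ with $N$ independent of $X$:
\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}} \, I\big(X;\, \sqrt{\mathrm{snr}}\, X + N\big)
= \frac{1}{2}\, \mathbb{E}\left[ \big\| X - \mathbb{E}[X \mid Y] \big\|^2 \right].
\]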
Disciplines :
Mathematics
Author, co-author :
Nourdin, Ivan
Peccati, Giovanni
Swan, Yvik ; Université de Liège > Département de mathématique > Probabilités et statistique mathématique
Language :
English
Title :
Integration by parts and representation of information functionals
Publication date :
2014
Event name :
IEEE International Symposium on Information Theory
Event place :
United States - Hawaii
Event date :
July 2014
Audience :
International
Journal title :
IEEE International Symposium on Information Theory Proceedings
Bibliography
H. Airault, P. Malliavin, and F. Viens. Stokes formula on the Wiener space and n-dimensional Nourdin-Peccati analysis. Journal of Functional Analysis, 258(5):1763-1783, 2010.
K. Ball, F. Barthe, and A. Naor. Entropy jumps in the presence of a spectral gap. Duke Math. J., 119(1):41-63, 2003.
K. Ball and V. Nguyen. Entropy jumps for random vectors with log-concave density and spectral gap. arXiv preprint arXiv:1206.5098v3, 2012.
A. D. Barbour, O. Johnson, I. Kontoyiannis, and M. Madiman. Compound Poisson approximation via information functionals. Electron. J. Probab., 15(42):1344-1368, 2010.
T. Cacoullos and V. Papathanasiou. Characterizations of distributions by variance bounds. Statist. Probab. Letters, 7:351-356, 1989.
L. H. Y. Chen, L. Goldstein, and Q.-M. Shao. Normal approximation by Stein's method. Probability and its Applications (New York). Springer, Heidelberg, 2011.
D. Guo, S. Shamai, and S. Verdu. Mutual information and minimum mean-square error in Gaussian channels. IEEE Transactions on Information Theory, 51(4):1261-1282, 2005.
O. Johnson. Information theory and the central limit theorem. Imperial College Press, London, 2004.
O. Johnson and A. Barron. Fisher information inequalities and the central limit theorem. Probab. Theory Related Fields, 129(3):391-409, 2004.
O. Johnson and Y. Suhov. Entropy and random vectors. J. Statist. Phys., 104(1-2):145-192, 2001.
A. Kagan. A multivariate analog of the Cramér theorem on components of the Gaussian distributions. In Stability problems for stochastic models, pages 68-77. Springer, 1989.
M. Ledoux, I. Nourdin and G. Peccati. Stein's method, logarithmic Sobolev and transport inequalities. arXiv preprint arXiv:1403.5855, 2014.
E. L. Lehmann and G. Casella. Theory of point estimation. Springer Texts in Statistics. Springer-Verlag, New York, second edition, 1998.
C. Ley and Y. Swan. Local Pinsker inequalities via Stein's discrete density approach. IEEE Trans. Info. Theory, 59(9):5584-5591, 2013.
C. Ley and Y. Swan. Stein's density approach and information inequalities. Electron. Comm. Probab., 18(7):1-14, 2013.
M. Madiman and A. Barron. Generalized entropy power inequalities and monotonicity properties of information. IEEE Transactions on Information Theory, 53(7):2317-2329, 2007.
I. Nourdin and G. Peccati. Normal approximations with Malliavin calculus: from Stein's method to universality. Cambridge Tracts in Mathematics. Cambridge University Press, 2012.
I. Nourdin, G. Peccati, and A. Réveillac. Multivariate normal approximation using Stein's method and Malliavin calculus. Ann. Inst. Henri Poincaré Probab. Stat., 46(1):45-58, 2010.
I. Nourdin, G. Peccati, and Y. Swan. Entropy and the fourth moment phenomenon. Journal of Functional Analysis, 266:3170-3207, 2013.
I. Sason. On the entropy of sums of Bernoulli random variables via the Chen-Stein method. In Proc. IEEE Information Theory Workshop (ITW), pages 542-546, 2012.