Abstract:
We introduce a new formalism for computing expectations of functionals of arbitrary random vectors, using generalised integration by parts formulae. In doing so we extend recent representation formulae for the score function introduced in \cite{nourdin2013entropy} and also provide a new proof of a central identity first discovered in \cite{guo2005mutual}. We derive a representation for the standardised Fisher information of sums of i.i.d. random vectors, which we use to provide rates of convergence in information theoretic central limit theorems (both in Fisher information distance and in relative entropy) and a Stein bound for Fisher information distance.
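As background for the quantities named above (the notation here is a standard choice and may differ from the paper's): for a centred random vector $X$ in $\mathbb{R}^d$ with identity covariance and density $f$, and $Z \sim N(0, I_d)$ with density $\varphi$, the standardised Fisher information and relative entropy are commonly taken to be
\[
J_{\mathrm{st}}(X) = \mathbb{E}\,\big\|\nabla \log f(X) + X\big\|^2,
\qquad
D(X \,\|\, Z) = \int_{\mathbb{R}^d} f \log \frac{f}{\varphi},
\]
and the Gaussian logarithmic Sobolev inequality $D(X \,\|\, Z) \le \tfrac{1}{2} J_{\mathrm{st}}(X)$ is the usual route by which bounds in Fisher information distance transfer to relative entropy. The central identity of \cite{guo2005mutual} referred to above is the I-MMSE relation: for $Y_{\mathrm{snr}} = \sqrt{\mathrm{snr}}\, X + N$ with $N \sim N(0,1)$ independent of $X$,
\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\big(X; Y_{\mathrm{snr}}\big)
= \tfrac{1}{2}\, \mathbb{E}\big[(X - \mathbb{E}[X \mid Y_{\mathrm{snr}}])^2\big].
\]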