Research Report, 2020

An upper bound on the error induced by saddlepoint approximations - Applications to information theory

Abstract

This report introduces an upper bound on the absolute difference between: $(a)$ the cumulative distribution function (CDF) of the sum of a finite number of independent and identically distributed random variables; and $(b)$ a saddlepoint approximation of such a CDF. This upper bound, which is particularly precise in the regime of large deviations, is used to study the dependence testing (DT) bound and the meta-converse (MC) bound on the decoding error probability (DEP) in point-to-point memoryless channels. Often, these bounds cannot be calculated analytically, and thus lower and upper bounds on them become particularly useful. Within this context, the main results include new upper and lower bounds on the DT and MC bounds. A numerical analysis of these bounds is presented for the binary symmetric channel, the additive white Gaussian noise channel, and the additive symmetric $\alpha$-stable noise channel, in which the new bounds are observed to be tight.
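For intuition only, the sketch below compares a standard Lugannani-Rice saddlepoint approximation of the CDF of a sum of i.i.d. exponential random variables with the exact Gamma CDF. This is a textbook approximation used as an illustration of the kind of quantity the report bounds; it does not implement the report's upper bound, and all function names, parameter choices, and the exponential example are assumptions made for this sketch.

    # Illustration only: Lugannani-Rice saddlepoint approximation of the CDF of
    # S_n = X_1 + ... + X_n with X_i i.i.d. Exp(rate), compared to the exact
    # Gamma(n, scale=1/rate) CDF. Not the report's bound; names and parameters
    # are chosen for this sketch.
    import numpy as np
    from scipy.stats import norm, gamma

    def saddlepoint_cdf_exp_sum(a, n, rate):
        """Approximate P(S_n <= a) for S_n a sum of n i.i.d. Exp(rate) variables."""
        mean = n / rate
        if np.isclose(a, mean):           # the formula is singular at the mean
            return 0.5                    # crude fallback at that single point
        # CGF of Exp(rate): K(t) = -log(1 - t/rate),
        # K'(t) = 1/(rate - t), K''(t) = 1/(rate - t)^2
        t_hat = rate - n / a              # saddlepoint: solves n * K'(t_hat) = a
        K = -np.log(1.0 - t_hat / rate)
        K2 = 1.0 / (rate - t_hat) ** 2
        w = np.sign(t_hat) * np.sqrt(2.0 * (t_hat * a - n * K))
        u = t_hat * np.sqrt(n * K2)
        return norm.cdf(w) + norm.pdf(w) * (1.0 / w - 1.0 / u)

    n, rate, a = 10, 1.0, 14.0
    approx = saddlepoint_cdf_exp_sum(a, n, rate)
    exact = gamma.cdf(a, n, scale=1.0 / rate)
    print(f"saddlepoint: {approx:.6f}  exact: {exact:.6f}  "
          f"abs. error: {abs(approx - exact):.2e}")

In this toy setting the absolute error can be checked directly against the exact Gamma CDF; the report's contribution is an analytical upper bound on such errors for general sums of i.i.d. random variables, which is then applied to the DT and MC bounds.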
Main file: submitted_version.pdf (2.05 MB). Origin: files produced by the author(s).

Dates and versions

hal-02557887, version 1 (29-04-2020)
hal-02557887, version 2 (29-04-2020)
hal-02557887, version 3 (12-05-2020)

Identifiers

  • HAL Id: hal-02557887, version 3

Cite

Dadja Anade, Jean-Marie Gorce, Philippe Mary, Samir Perlaza. An upper bound on the error induced by saddlepoint approximations - Applications to information theory. [Research Report] RR-9329, INRIA Grenoble - Rhône-Alpes. 2020, pp.1-55. ⟨hal-02557887v3⟩