On the Non-asymptotic and Sharp Lower Tail Bounds of Random Variables

21 Oct 2018 · Anru R. Zhang, Yuchen Zhou

Non-asymptotic tail bounds of random variables play a crucial role in probability, statistics, and machine learning. Despite much success in developing upper bounds on tail probabilities in the literature, lower bounds on tail probabilities remain relatively scarce. In this paper, we introduce systematic and user-friendly schemes for developing non-asymptotic lower bounds on tail probabilities. In addition, we develop sharp lower tail bounds for the sum of independent sub-Gaussian and sub-exponential random variables, which match the classic Hoeffding-type and Bernstein-type concentration inequalities, respectively. We also provide non-asymptotic matching upper and lower tail bounds for a suite of distributions, including the gamma, beta, (regular, weighted, and noncentral) chi-square, binomial, Poisson, and Irwin-Hall distributions. We apply these results to establish matching upper and lower bounds on the expected extreme value of the sum of independent sub-Gaussian and sub-exponential random variables. Finally, we consider a statistical application to signal identification from sparse heterogeneous mixtures.
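To make the setting concrete, the sketch below is a minimal Monte Carlo check of the Hoeffding-type upper tail bound that the paper's lower bounds are shown to match. It is an illustration under assumed parameters, not code from the paper: it simulates sums of i.i.d. Rademacher variables (sub-Gaussian with variance proxy 1) and compares the empirical tail probability P(S_n >= t) against the classical bound exp(-t^2 / (2n)).

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 100_000   # assumed sample size and number of replications
t = 30.0                   # assumed tail threshold

# Sums of n i.i.d. Rademacher (+/-1) variables: sub-Gaussian, variance proxy 1.
S = rng.choice([-1.0, 1.0], size=(trials, n)).sum(axis=1)

empirical = (S >= t).mean()          # Monte Carlo estimate of P(S_n >= t)
hoeffding = np.exp(-t**2 / (2 * n))  # Hoeffding-type upper bound

# The paper establishes a matching lower bound of the same exponential
# order, so the empirical tail should sit between the two.
print(f"empirical tail: {empirical:.4f}, Hoeffding bound: {hoeffding:.4f}")
```

The empirical tail lies below the Hoeffding bound, and the paper's results imply it cannot be smaller than a lower bound of the same exponential order in t²/n.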
