(This post represents research in progress. I may think about these concepts
entirely differently a few months from now, but for my own benefit I'm trying to
exposit on them in their current form.)
For collections of independent random variables, the Chernoff bound and related
bounds give us very sharp concentration inequalities --- if $X_1,\ldots,X_n$ are
independent, then their sum has a distribution that is tightly concentrated around its mean, with tails that decay exponentially.
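For concreteness, one standard bound of this type is Hoeffding's inequality (a close relative of the Chernoff bound, stated here purely for illustration): if the $X_i$ additionally take values in $[0,1]$, then for any $t > 0$,

$$\Pr\left[\,\left|\sum_{i=1}^n X_i - \mathbb{E}\left[\sum_{i=1}^n X_i\right]\right| \geq t\,\right] \leq 2\exp\!\left(-\frac{2t^2}{n}\right).$$

Deviations much larger than $\sqrt{n}$ from the mean therefore have exponentially small probability, which is the sense in which the sum is sharply concentrated.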
I spent the last several hours trying to come up with an efficient algorithm for the following problem:
Problem: Suppose that we have a sequence of $l$ pairs of non-negative numbers $(a_1, b_1), \ldots, (a_l, b_l)$.
While grading homeworks today, I came across the following bound:
Theorem 1: If $A$ and $B$ are symmetric $n \times n$ matrices with eigenvalues $\lambda_1 \geq \lambda_2 \geq \ldots \geq \lambda_n$
The KL divergence is an important tool for studying the distance between two
probability distributions. Formally, given two distributions $p$ and $q$, the KL
divergence is defined as
$KL(p || q) := \int p(x) \log \frac{p(x)}{q(x)} \, dx$.
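As a standard worked example (the Gaussians here are chosen purely for illustration), the KL divergence between two Gaussians with the same variance $\sigma^2$ and means $\mu_1, \mu_2$ works out to

$$KL\big(\mathcal{N}(\mu_1, \sigma^2) \,||\, \mathcal{N}(\mu_2, \sigma^2)\big) = \frac{(\mu_1 - \mu_2)^2}{2\sigma^2},$$

so the divergence grows quadratically in the separation between the means, measured in units of the standard deviation.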