For collections of independent random variables, the Chernoff bound and related
bounds give us very sharp concentration inequalities: if $X_1,\ldots,X_n$ are
independent, then their sum concentrates tightly around its mean, with tail
probabilities that decay exponentially in the size of the deviation.
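As a quick illustration, here is a minimal sketch (the Bernoulli(1/2) example and the Hoeffding form of the bound are illustrative choices, not taken from the statement above) comparing the empirical tail of a sum of independent coin flips with the bound $P(S - n/2 \geq t) \leq e^{-2t^2/n}$:

```python
# Minimal sketch: compare the empirical upper tail of S = X_1 + ... + X_n,
# X_i i.i.d. Bernoulli(1/2), with the Hoeffding bound
# P(S - n/2 >= t) <= exp(-2 t^2 / n), a standard Chernoff-style inequality.
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 1000, 200_000, 50

sums = rng.binomial(n, 0.5, size=trials)    # each draw is one realization of S
empirical = np.mean(sums - n / 2 >= t)      # empirical tail probability
bound = np.exp(-2 * t**2 / n)               # Hoeffding's exponential bound

print(f"empirical tail:  {empirical:.5f}")  # roughly 8e-4
print(f"Hoeffding bound: {bound:.5f}")      # exp(-5) ~ 6.7e-3, so the bound holds
```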
I spent the last several hours trying to come up with an efficient algorithm for
the following problem:
Problem: Suppose that we have a sequence of $l$ pairs of non-negative numbers
$(a_1, b_1), \ldots, (a_l, b_l)$.
While grading homeworks today, I came across the following bound:
Theorem 1: If A and B are symmetric $n\times n$ matrices with eigenvalues
$\lambda_1 \geq \lambda_2 \geq \ldots \geq \lambda_n$ and
$\mu_1 \geq \mu_2 \geq \ldots \geq \mu_n$ respectively, then
$\mathrm{Tr}(AB) \leq \sum_{i=1}^{n} \lambda_i \mu_i$.
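A minimal numerical sanity check of the bound (the random Gaussian test matrices are an illustrative choice, not part of the theorem):

```python
# Minimal sketch: verify Tr(AB) <= sum_i lambda_i * mu_i on random symmetric
# matrices, with both eigenvalue lists sorted in decreasing order.
import numpy as np

rng = np.random.default_rng(0)
n = 5
for _ in range(1000):
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2  # random symmetric matrix
    B = rng.standard_normal((n, n)); B = (B + B.T) / 2
    lam = np.linalg.eigvalsh(A)[::-1]  # eigvalsh gives ascending order; reverse
    mu = np.linalg.eigvalsh(B)[::-1]
    assert np.trace(A @ B) <= lam @ mu + 1e-9  # holds up to round-off
print("Tr(AB) <= sum(lambda_i * mu_i) held in all trials")
```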
The KL divergence is an important tool for studying the difference between two
probability distributions. Formally, given two distributions $p$ and $q$, the KL
divergence is defined as
$KL(p || q) := \int p(x) \log \frac{p(x)}{q(x)} \, dx$.
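For intuition, here is a minimal sketch of the discrete analogue, $KL(p || q) = \sum_x p(x) \log \frac{p(x)}{q(x)}$, evaluated on two small distributions (the particular numbers are an illustrative choice):

```python
# Minimal sketch: discrete KL divergence between two small distributions.
import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

kl_pq = np.sum(p * np.log(p / q))  # KL(p || q)
kl_qp = np.sum(q * np.log(q / p))  # KL(q || p)

print(f"KL(p||q) = {kl_pq:.4f}")
print(f"KL(q||p) = {kl_qp:.4f}")   # the two differ: KL is not symmetric
```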
Today Arun asked me the following question:
"Under what conditions will a set $\{p_1,\ldots,p_n\}$ of polynomials be
quadratically independent, in the sense that $\{p_1^2, p_1p_2, \ldots, p_{n-1}p_n, p_n^2\}$ (the set of all pairwise products $p_ip_j$ with $i \leq j$) is linearly independent?"
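One way to make the question concrete (this formalization and the examples are my own, offered as a sketch): represent each $p_i$ by its coefficient vector, build all pairwise products by convolution, and check whether the products' coefficient matrix has full row rank:

```python
# Minimal sketch: test quadratic independence of univariate polynomials
# given as coefficient arrays (lowest-degree coefficient first).
import numpy as np
from itertools import combinations_with_replacement

def quadratically_independent(polys):
    # All pairwise products p_i * p_j, i <= j; multiplying polynomials
    # corresponds to convolving their coefficient vectors.
    products = [np.convolve(pi, pj)
                for pi, pj in combinations_with_replacement(polys, 2)]
    width = max(len(c) for c in products)
    M = np.array([np.pad(c, (0, width - len(c))) for c in products])
    # Quadratically independent iff all n(n+1)/2 product rows are independent.
    return np.linalg.matrix_rank(M) == len(products)

one, x, x2 = np.array([1.0]), np.array([0.0, 1.0]), np.array([0.0, 0.0, 1.0])
print(quadratically_independent([one, x]))      # True: 1, x, x^2 are independent
print(quadratically_independent([one, x, x2]))  # False: 1*x^2 coincides with x*x
```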