While grading homeworks today, I came across the following bound:
Theorem 1: If A and B are symmetric $n\times n$ matrices with eigenvalues
$\lambda_1 \geq \lambda_2 \geq \ldots \geq \lambda_n$ and $\mu_1 \geq \mu_2 \geq \ldots \geq \mu_n$ respectively, then $\operatorname{tr}(AB) \leq \sum_{i=1}^{n} \lambda_i \mu_i$.
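This is easy to sanity-check numerically. Here is a quick sketch (my own, not from the post; the random-matrix setup and the helper name `trace_bound_check` are illustrative):

```python
import numpy as np

def trace_bound_check(n=5, seed=0):
    """Verify tr(AB) <= sum_i lambda_i * mu_i for random symmetric A, B."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    A = (A + A.T) / 2  # symmetrize
    B = (B + B.T) / 2
    lam = np.linalg.eigvalsh(A)[::-1]  # eigvalsh sorts ascending; reverse to descending
    mu = np.linalg.eigvalsh(B)[::-1]
    return float(np.trace(A @ B)), float(np.sum(lam * mu))

lhs, rhs = trace_bound_check()
print(f"tr(AB) = {lhs:.4f}  <=  {rhs:.4f} = sum_i lambda_i mu_i")
assert lhs <= rhs + 1e-9
```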
The KL divergence is an important tool for studying the distance between two
probability distributions. Formally, given two distributions $p$ and $q$, the KL
divergence is defined as
$KL(p \,||\, q) := \int p(x) \log \frac{p(x)}{q(x)} \, dx$.
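As a concrete illustration (my own sketch, not from the post), the discrete case replaces the integral with a sum; the helper name `kl_divergence` is hypothetical:

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as probability vectors."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.0253
print(kl_divergence(q, p))  # ~0.0258: note that KL is not symmetric
```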
Today Arun asked me the following question:
"Under what conditions will a set $\{p_1,\ldots,p_n\}$ of polynomials be
quadratically independent, in the sense that $\{p_1^2, p_1p_
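A small example (mine, not from the post) may help fix the definition: $p_1 = 1$, $p_2 = x$, $p_3 = x^2$ are linearly independent but not quadratically independent, since $p_1 p_3 = p_2^2 = x^2$. The check can be mechanized by looking at the rank of the coefficient matrix of the pairwise products; here is a sketch, with the helper name `quadratically_independent` my own:

```python
import sympy as sp

x = sp.symbols('x')

def quadratically_independent(polys):
    """Check whether the pairwise products p_i * p_j (i <= j) are linearly independent."""
    prods = [sp.expand(polys[i] * polys[j])
             for i in range(len(polys)) for j in range(i, len(polys))]
    max_deg = max(sp.degree(q, x) for q in prods)
    # Coefficient matrix of the products in the monomial basis 1, x, ..., x^max_deg.
    M = sp.Matrix([[q.coeff(x, k) for k in range(max_deg + 1)] for q in prods])
    return M.rank() == len(prods)

print(quadratically_independent([x, x**2]))                 # True: x^2, x^3, x^4
print(quadratically_independent([sp.Integer(1), x, x**2]))  # False: 1*x^2 == x*x
```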
In my last post [http://jsteinhardt.wordpress.com/2012/12/06/log-linear-models/]
I discussed log-linear models. In this post I'd like to take another perspective
on log-linear models, by thinking of...
I've decided to start recording algebra tricks as I end up using them. Today I
actually have two tricks, but they end up being used together a lot. I don't...