Local KL Divergence

The KL divergence is an important tool for studying the distance between two probability distributions. Formally, given two distributions $p$ and $q$, the KL divergence is defined as $KL(p \,||\, q) := \int p(x) \log \frac{p(x)}{q(x)} \, dx$.
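As a quick concrete illustration (not from the original post), here is a minimal sketch of the discrete analogue of this definition, $KL(p \,||\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)}$, with a hypothetical helper name `kl_divergence`:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence: sum over x of p(x) * log(p(x) / q(x))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Terms with p(x) == 0 contribute nothing, by the convention 0 * log 0 = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(kl_divergence(p, q))  # positive, since p != q
print(kl_divergence(p, p))  # 0.0, since KL(p || p) = 0
```

Note the asymmetry: in general $KL(p \,||\, q) \neq KL(q \,||\, p)$, which is why KL is a divergence rather than a metric.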

Exponential Families

In my last post [http://jsteinhardt.wordpress.com/2012/12/06/log-linear-models/] I discussed log-linear models. In this post I'd like to take another perspective on log-linear models, by thinking of them as exponential families.
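As a reminder of the connection being drawn (a standard fact, not taken from the post itself): an exponential family has the canonical form below, and a log-linear model is the special case with base measure $h \equiv 1$ and sufficient statistics given by the feature vector $\phi$:

```latex
% Canonical exponential family:
%   p(x \mid \theta) = h(x) \exp\!\left( \theta^\top \phi(x) - A(\theta) \right),
% where A(\theta) = \log \int h(x) \exp(\theta^\top \phi(x)) \, dx is the
% log-partition function ensuring normalization.
%
% A log-linear model p(x; \theta) \propto \exp(\theta^\top \phi(x)) is
% recovered by taking h(x) = 1, so the two views coincide.
```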

Log-Linear Models

I've spent most of my research career trying to build big, complex nonparametric models [http://jmlr.csail.mit.edu/proceedings/papers/v22/steinhardt12/steinhardt12.pdf]; however, I've more recently …