KL Divergence
Published August 7, 2017 | 25 min
    Show notes
Kullback-Leibler divergence, or KL divergence, is a measure of the information lost when you approximate one distribution with another. It comes to us originally from information theory, but today it underpins more machine-learning-focused algorithms like t-SNE. And boy oh boy can it be tough to explain. But we're trying our hardest in this episode!
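For the curious, here's a minimal sketch of the discrete KL divergence in Python; the distributions p and q below are made-up examples for illustration, not anything from the episode:

```python
import numpy as np

# Two example discrete distributions over the same three outcomes.
p = np.array([0.5, 0.3, 0.2])   # the "true" distribution
q = np.array([0.4, 0.4, 0.2])   # the approximating distribution

# KL(p || q) = sum_i p_i * log(p_i / q_i), in nats (use np.log2 for bits).
kl = np.sum(p * np.log(p / q))
print(kl)  # ~0.025: small, since q is a close approximation of p
```

Note that KL divergence is not symmetric: KL(p || q) generally differs from KL(q || p), which is one reason it's called a "divergence" rather than a distance.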