% re = RelativeEntropy(p1, p2)
% Returns the relative entropy of two vectors (a variant of the
% Kullback–Leibler divergence; note the standard KL divergence
% omits the absolute value used below).
% p1 : vector, length M
% p2 : vector, length M
% The relative entropy is calculated as
%   re = \sum_{i=1}^M p1_i |log(p1_i / p2_i)|
function re = RelativeEntropy (p1, p2)
  % Accumulate p1_i * |log(p1_i / p2_i)| over all elements;
  % (:) reshapes both inputs to columns so row and column
  % vectors are handled alike.
  re = sum(p1(:) .* abs(log(p1(:) ./ p2(:))));
end
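% Example usage (the two distributions below are illustrative
% values chosen by the editor, each summing to 1):
%   p = [0.5 0.5];
%   q = [0.25 0.75];
%   RelativeEntropy(p, q)
%   % = 0.5*|log(2)| + 0.5*|log(2/3)| ≈ 0.5493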