How to estimate the mutual information numerically?
Suppose I have a sample $\{z_i\}_{i\in[0,N]} = \{(x_i, y_i)\}_{i\in[0,N]}$ which comes from a probability distribution $p_z(z)$. How can I use it to estimate the mutual information between $X$ and $Y$?
$MI(X,Y) = \int_Y \int_X p_z(x,y) \log{ \left(\frac{p_z(x,y)}{p_x(x)\,p_y(y)} \right) } \,dx\,dy$
where $p_x$ and $p_y$ are the marginal distributions of $X$ and $Y$:
$p_x(x) = \int_Y p_z(x,y)\,dy$
$p_y(y) = \int_X p_z(x,y)\,dx$.
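For context, here is a minimal sketch of the simplest approach I am aware of: replace the integrals with a 2-D histogram (plug-in/binned estimator). The function name, bin count, and test data below are illustrative assumptions, not part of the question; NumPy is assumed available.

```python
import numpy as np

def mutual_information_binned(x, y, bins=20):
    """Plug-in estimate of MI(X, Y) in nats from paired samples via a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()                 # empirical joint p_z(x, y)
    p_x = p_xy.sum(axis=1, keepdims=True)      # marginal p_x(x), shape (bins, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)      # marginal p_y(y), shape (1, bins)
    nonzero = p_xy > 0                         # skip empty bins to avoid log(0)
    return float(np.sum(p_xy[nonzero] * np.log(p_xy[nonzero] / (p_x @ p_y)[nonzero])))

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + rng.normal(size=10_000)  # correlated pair, so the estimate should be well above zero
print(mutual_information_binned(x, y))
```

Note that this estimator is biased upward for finite samples (empty-bin effects), and the result depends on the bin count; kernel-density or nearest-neighbour (Kraskov-type) estimators are common alternatives when that matters.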
Topic: estimators, distribution, mutual-information, information-theory, numerical
Category: Data Science