Last updated: 2022/12/24
The mutual information I(X;Y) between two random variables X and Y is what remains of their joint entropy H(X,Y) after the two conditional entropies are subtracted: I(X;Y) = H(X,Y) − H(X|Y) − H(Y|X). In terms of the distributions it can be written as

I(X;Y) = −∑ₓ ∑_y p_{X,Y}(x,y) log_b [ p_{X,Y}(x,y) / ( p_{X|Y}(x|y) · p_{Y|X}(y|x) ) ],

where b is the logarithm base. Since p_{X|Y}(x|y) · p_{Y|X}(y|x) = p_{X,Y}(x,y)² / ( p_X(x) · p_Y(y) ), the argument of the logarithm equals p_X(x) p_Y(y) / p_{X,Y}(x,y), and the expression reduces to the more familiar form I(X;Y) = ∑ₓ ∑_y p_{X,Y}(x,y) log_b [ p_{X,Y}(x,y) / ( p_X(x) · p_Y(y) ) ].
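As a sketch, the definition above can be computed directly from a finite joint distribution. The joint table below is a made-up example for illustration; the function name and its dictionary-based representation are assumptions, not part of the original text.

```python
import math

def mutual_information(p_xy, base=2):
    """I(X;Y) = sum_{x,y} p(x,y) * log_b( p(x,y) / (p_X(x) * p_Y(y)) ).

    p_xy maps (x, y) pairs to joint probabilities.
    """
    # Marginals p_X and p_Y obtained by summing out the other variable.
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Terms with p(x,y) = 0 contribute nothing (0 * log 0 := 0).
    return sum(
        p * math.log(p / (p_x[x] * p_y[y]), base)
        for (x, y), p in p_xy.items()
        if p > 0
    )

# Hypothetical joint distribution of two correlated binary variables.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}
mi = mutual_information(p_xy)  # in bits, since base=2
```

For an independent pair (e.g. a uniform joint table where p(x,y) = p_X(x) p_Y(y)), the same function returns 0, matching the fact that mutual information vanishes exactly when X and Y are independent.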