Subject: calculating "conditional mutual information"
Category: Science > Math
Asked by: qhan-ga
List Price: $5.00
Posted: 13 Jun 2006 23:47 PDT
Expires: 13 Jul 2006 23:47 PDT
Question ID: 737997
Please tell me how I can calculate "conditional mutual information". Please provide simple examples (with values) to illustrate your explanation, and any links that explain this with other examples. (I have already visited a number of sites that tell me the formula, but they do not provide any examples I can follow.)
There is no answer at this time. |
Subject: Re: calculating "conditional mutual information"
From: cooljnm-ga on 27 Jul 2006 14:14 PDT
MI(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)

This is the formula for conditional mutual information. Here I am calculating the mutual information between X and Y when Z is given, written MI(X;Y|Z); H denotes (joint) entropy. Conditional mutual information is the mutual information between two variables when the value of a third variable is known.

Example: Let Z = X XOR Y, where X and Y are independent fair bits.

MI(X;Y) = 0. Since X and Y are independent, their mutual information is 0: knowing X provides no information about Y (and vice versa).

MI(X;Y|Z) > 0, with the exact value depending on the distribution of X and Y; for independent fair bits it is exactly 1 bit. The reason: once we already know Z, learning the value of X provides complete information about the value of Y, because Y = X XOR Z.

The concept of conditional mutual information is easier to follow if you first know the concept of conditional entropy; visit Wikipedia or any other site for that background. If you need, I can provide the equations for the particular problem you are interested in, with a proper explanation. Hope this helps :)
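The XOR example above can be checked numerically. Below is a minimal Python sketch (not part of the original answer; the function names `entropy`, `mutual_info`, and `conditional_mi` are my own) that enumerates the four equally likely (X, Y) outcomes and computes the entropies directly from the empirical distribution:

```python
import math
from collections import Counter
from itertools import product

def entropy(samples):
    # Shannon entropy (in bits) of the empirical distribution of `samples`.
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_info(xs, ys):
    # MI(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def conditional_mi(xs, ys, zs):
    # MI(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
    return (entropy(list(zip(xs, zs))) + entropy(list(zip(ys, zs)))
            - entropy(list(zip(xs, ys, zs))) - entropy(zs))

# All four equally likely outcomes of two independent fair bits:
pairs = list(product([0, 1], repeat=2))
xs = [x for x, _ in pairs]
ys = [y for _, y in pairs]
zs = [x ^ y for x, y in pairs]  # Z = X XOR Y

print(mutual_info(xs, ys))         # 0.0 -> X and Y are independent
print(conditional_mi(xs, ys, zs))  # 1.0 -> given Z, X fully determines Y
```

Enumerating the full joint distribution (rather than sampling) makes the entropies exact, so the two printed values match the claims in the answer: 0 bits unconditionally, 1 bit once Z is known.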