Saturday, March 25, 2017

Importance of a Random Variable Using Entropy or Other Methods

I have a two-dimensional random vector $\mathbf{x} = [x_1, x_2]^T$ with a known joint probability density function (pdf). The pdf is non-Gaussian and the two entries of the random vector are statistically dependent. I need to show that, for example, $x_1$ is more important than $x_2$ in terms of the amount of information it carries. Is there a classical solution to this problem? Can I show that, say, n% of the total information carried by $\mathbf{x}$ is in $x_1$ and the remaining (100-n)% is carried by $x_2$?
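To make the question precise, by "information" I am thinking of differential entropy; the chain rule and the mutual information between the two entries (assuming the relevant conditional densities exist) give

$$
h(x_1, x_2) = h(x_1) + h(x_2 \mid x_1) = h(x_2) + h(x_1 \mid x_2),
\qquad
I(x_1; x_2) = h(x_1) + h(x_2) - h(x_1, x_2) \ge 0.
$$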

I assume that the standard way of measuring the amount of information is by computing the entropy. Any clues?
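As a numerical sanity check, here is a minimal Monte Carlo sketch in Python. The bivariate Gaussian-mixture pdf, the sample size, and all names below are placeholders I chose for illustration, standing in for the actual known pdf; the estimator itself is just $h = -\mathbb{E}[\log p]$ approximated by sample averages.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(0)

# Placeholder for the known joint pdf: an equal-weight mixture of two
# correlated Gaussians (non-Gaussian overall, entries dependent).
weights = np.array([0.5, 0.5])
means = [np.array([-1.0, -1.0]), np.array([1.5, 1.0])]
covs = [np.array([[1.0, 0.6], [0.6, 1.0]]),
        np.array([[0.5, -0.3], [-0.3, 1.2]])]

def joint_pdf(x):
    """Joint density p(x1, x2) of the mixture, evaluated row-wise."""
    return sum(w * multivariate_normal.pdf(x, mean=m, cov=c)
               for w, m, c in zip(weights, means, covs))

def marginal_pdf(x, dim):
    """Marginal density of entry `dim`; the marginal of a Gaussian mixture is itself a 1-D mixture."""
    return sum(w * norm.pdf(x, loc=m[dim], scale=np.sqrt(c[dim, dim]))
               for w, m, c in zip(weights, means, covs))

# Draw Monte Carlo samples from the joint pdf.
n = 200_000
comp = rng.choice(len(weights), size=n, p=weights)
samples = np.empty((n, 2))
for k in range(len(weights)):
    idx = comp == k
    samples[idx] = rng.multivariate_normal(means[k], covs[k], size=idx.sum())

# Differential entropies as h = -E[log p], approximated by sample averages (in nats).
h_joint = -np.mean(np.log(joint_pdf(samples)))
h1 = -np.mean(np.log(marginal_pdf(samples[:, 0], 0)))
h2 = -np.mean(np.log(marginal_pdf(samples[:, 1], 1)))
mi = h1 + h2 - h_joint  # mutual information I(x1; x2)

print(f"h(x1)      = {h1:.3f} nats")
print(f"h(x2)      = {h2:.3f} nats")
print(f"h(x1, x2)  = {h_joint:.3f} nats")
print(f"I(x1; x2)  = {mi:.3f} nats")
print(f"h(x1 | x2) = {h_joint - h2:.3f} nats")
print(f"h(x2 | x1) = {h_joint - h1:.3f} nats")
```

Note that differential entropies can be negative, so a literal n% / (100-n)% split of $h(x_1, x_2)$ between the two entries is not automatically well defined; comparing the conditional entropies $h(x_1 \mid x_2)$ and $h(x_2 \mid x_1)$, or the mutual information each entry shares with the other, is one possible (but not the only) way to rank them.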



