Conditional entropy

(information theory) The portion of a random variable's own Shannon entropy that is independent of another, given random variable.

The conditional entropy of a random variable Y given (i.e., conditioned by) another random variable X, denoted H(Y|X), is equal to H(Y) − I(Y; X), where I(Y; X) is the mutual information between Y and X.

noun
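
A minimal numerical sketch of this identity, assuming a small example joint distribution p_xy (the matrix values are illustrative, not from the source): it computes H(Y|X) directly from the conditional distribution and checks that it equals H(Y) − I(Y; X), with the mutual information obtained as H(X) + H(Y) − H(X, Y).

```python
import numpy as np

# Hypothetical joint distribution p(x, y): rows index X, columns index Y;
# entries are joint probabilities and sum to 1.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Direct definition: H(Y|X) = -sum over x, y of p(x, y) * log2 p(y|x)
p_y_given_x = p_xy / p_x[:, None]
mask = p_xy > 0
h_y_given_x = -np.sum(p_xy[mask] * np.log2(p_y_given_x[mask]))

# Identity from the definition above: H(Y|X) = H(Y) - I(Y; X),
# where I(Y; X) = H(X) + H(Y) - H(X, Y).
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())
print(h_y_given_x)            # ~0.861 bits
print(entropy(p_y) - mi)      # same value, confirming the identity
```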