Conditional-entropy Definition

noun
(information theory) The portion of a random variable's own Shannon entropy which is independent of another, given random variable.
The conditional entropy of random variable Y given (i.e., conditioned by) X, denoted as H(Y|X), is equal to H(Y) − I(Y;X), where I(Y;X) is the mutual information between Y and X.
Wiktionary
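
A minimal sketch illustrating the identity above, assuming a small hypothetical joint distribution over two binary variables; it computes H(Y|X) via the chain rule H(Y|X) = H(X,Y) − H(X) and checks that it matches H(Y) − I(Y;X):

```python
import math

# Hypothetical joint distribution p(x, y), chosen only for illustration.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_xy = entropy(joint)            # joint entropy H(X, Y)
h_x = entropy(px)                # H(X)
h_y = entropy(py)                # H(Y)

h_y_given_x = h_xy - h_x         # chain rule: H(Y|X) = H(X, Y) - H(X)
mutual_info = h_y - h_y_given_x  # I(Y; X) = H(Y) - H(Y|X)

print(f"H(Y|X)          = {h_y_given_x:.4f} bits")
print(f"H(Y) - I(Y;X)   = {h_y - mutual_info:.4f} bits")  # equals H(Y|X)
```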

Other Word Forms of Conditional-entropy

Noun

Singular: conditional-entropy
Plural: conditional entropies
