Joint-entropy Definition

noun

(information theory) The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.

If random variables X and Y are mutually independent, then their joint entropy H(X, Y) is simply the sum of the component entropies: H(X, Y) = H(X) + H(Y). If they are not mutually independent, then H(X, Y) = H(X) + H(Y) - I(X; Y), where I(X; Y) is the mutual information of X and Y.
Wiktionary
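
As a rough illustration of the relation above, the following Python sketch computes the joint entropy of a small made-up joint distribution and checks that it equals H(X) + H(Y) - I(X; Y). The probability values and variable names are hypothetical, chosen only to make the arithmetic concrete.

    import math

    # A small hypothetical joint probability table p(x, y) for two
    # binary random variables X and Y (values chosen for illustration only).
    joint = {
        (0, 0): 0.30,
        (0, 1): 0.20,
        (1, 0): 0.10,
        (1, 1): 0.40,
    }

    def entropy(dist):
        """Shannon entropy (in bits) of a distribution given as a dict of probabilities."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # Marginal distributions p(x) and p(y).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p

    h_joint = entropy(joint)             # H(X, Y)
    h_x, h_y = entropy(px), entropy(py)  # H(X), H(Y)
    mi = h_x + h_y - h_joint             # I(X; Y), the mutual information

    print(f"H(X,Y)                = {h_joint:.4f} bits")
    print(f"H(X) + H(Y) - I(X;Y)  = {h_x + h_y - mi:.4f} bits")  # matches H(X, Y)

When X and Y happen to be independent, the mutual information term I(X; Y) computed this way is zero and the joint entropy reduces to H(X) + H(Y), as stated in the definition.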

Other Word Forms of Joint-entropy

Noun

Singular: joint-entropy
Plural: joint entropies
