Shannon-entropy Definition

noun
Information entropy.
Shannon entropy H is given by the formula H = −Σᵢ pᵢ log₂ pᵢ, where pᵢ is the probability of character number i appearing in a stream of characters of the given "script".
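A minimal sketch of this definition in Python, assuming the probabilities are supplied as a plain list; the function name is illustrative, not part of any standard library.

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i)."""
    # Terms with p_i = 0 contribute nothing, so they are skipped.
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin (two equally likely symbols) carries exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```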
Consider a simple digital circuit with a two-bit input (X, Y) and a two-bit output (X AND Y, X OR Y). Assuming the two input bits X and Y each have an independent 50% chance of being HIGH, the input combinations (0,0), (0,1), (1,0), and (1,1) each occur with probability 1/4, so the circuit's Shannon entropy on the input side is H = 4 × (1/4) log₂ 4 = 2 bits. The possible output combinations are (0,0), (0,1), and (1,1), occurring with probabilities 1/4, 1/2, and 1/4 respectively, so the circuit's Shannon entropy on the output side is H = (1/4)(2) + (1/2)(1) + (1/4)(2) = 1.5 bits. The circuit therefore reduces ("orders") the information passing through it by half a bit of Shannon entropy, a consequence of its logical irreversibility.
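A short sketch checking the circuit example above: it enumerates the four equally likely inputs, maps them through (X AND Y, X OR Y), and compares the input and output entropies. The helper name and structure are illustrative assumptions.

```python
from collections import Counter
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# The four input combinations (X, Y), each with probability 1/4.
inputs = [(x, y) for x in (0, 1) for y in (0, 1)]

# Map each input through the circuit and count the resulting outputs:
# (0,0) once, (0,1) twice, (1,1) once.
outputs = Counter((x & y, x | y) for x, y in inputs)

h_in = shannon_entropy([1 / len(inputs)] * len(inputs))
h_out = shannon_entropy([n / len(inputs) for n in outputs.values()])

print(h_in, h_out, h_in - h_out)  # 2.0, 1.5, 0.5 bits lost to irreversibility
```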
Wiktionary

Other Word Forms of Shannon-entropy

Noun

Singular:
shannon-entropy
Plural:
shannon-entropies

Origin of Shannon-entropy

  • Named after Claude Shannon, the "father of information theory".

    From Wiktionary
