# Shannon entropy


**MLA Style**

"Shannon entropy." YourDictionary, n.d. Web. 17 July 2019. <https://www.yourdictionary.com/shannon-entropy>.

**APA Style**

Shannon entropy. (n.d.). Retrieved July 17, 2019, from https://www.yourdictionary.com/shannon-entropy

(*countable and uncountable*, *plural* Shannon entropies)

- information entropy
- Shannon entropy *H* is given by the formula *H* = −∑_{i} *p*_{i} log₂ *p*_{i}, where *p*_{i} is the probability of character number *i* showing up in a stream of characters of the given "script".
- Consider a simple digital circuit which has a two-bit input (*X*, *Y*) and a two-bit output (*X* and *Y*, *X* or *Y*). Assuming that the two input bits *X* and *Y* have mutually independent 50% chances of being HIGH, the input combinations (0,0), (0,1), (1,0), and (1,1) each have a 1/4 chance of occurring, so the circuit's Shannon entropy on the input side is *H* = 4 × (1/4) log₂ 4 = 2 bits. The possible output combinations are (0,0), (0,1), and (1,1) with respective chances of 1/4, 1/2, and 1/4 of occurring, so the circuit's Shannon entropy on the output side is *H* = (1/4) log₂ 4 + (1/2) log₂ 2 + (1/4) log₂ 4 = 1.5 bits. The circuit thus reduces ("orders") the information going through it by half a bit of Shannon entropy due to its logical irreversibility.
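The entropy formula and the circuit example above can be checked with a short sketch in Python (the function name `shannon_entropy` is an illustrative choice, not from the source):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)) in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Input side: (0,0), (0,1), (1,0), (1,1) each with probability 1/4
h_in = shannon_entropy([0.25, 0.25, 0.25, 0.25])

# Output side: (X and Y, X or Y) gives (0,0), (0,1), (1,1)
# with probabilities 1/4, 1/2, 1/4
h_out = shannon_entropy([0.25, 0.5, 0.25])

print(h_in, h_out, h_in - h_out)  # 2.0 1.5 0.5
```

The half-bit difference between input and output entropy is exactly the information discarded by the circuit's logically irreversible AND/OR gates.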


Named after Claude Shannon, the "father of information theory".
