A statement defining the theoretical maximum rate at which error-free digits can be transmitted over a bandwidth-limited channel in the presence of Gaussian noise. Shannon's Law is mathematically expressed as C = W log2(1 + S/N), where *C* is the channel capacity in bits per second (bps), *W* is the bandwidth in Hertz, and *S/N* is the signal-to-noise ratio (SNR). Shannon's Law also is known as the *Shannon-Hartley theorem*, as Shannon's result built on earlier work by Ralph Hartley, a researcher at Bell Labs. See also bandwidth, bps, channel, Gaussian noise, Hertz, law, SNR, and theory.
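The formula can be sketched in Python as follows; the 3 kHz bandwidth and 30 dB SNR are illustrative values, roughly those of a voice-grade telephone line, not figures from the definition above:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Maximum error-free data rate C = W * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative values: ~3 kHz of bandwidth and a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)              # 30 dB expressed as a linear power ratio (1000)
capacity = shannon_capacity(3000, snr)
print(round(capacity))             # about 29902 bps
```

Note that the SNR in the formula is a linear power ratio; an SNR quoted in decibels must first be converted, as shown.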

# Shannon's Law - Computer Definition

Used by arrangement with John Wiley & Sons, Inc.

**MLA Style**

"Shannon's Law." YourDictionary, n.d. Web. 14 July 2019. <https://www.yourdictionary.com/shannon-s-law>.

**APA Style**

Shannon's Law. (n.d.). Retrieved July 14, 2019, from https://www.yourdictionary.com/shannon-s-law

A formula in the information theory of Claude Shannon (1916-2001) for determining the maximum error-free data rate of a digital communications channel, based on the channel's bandwidth and signal-to-noise ratio. See information theory and laws.
**Shannon's Law**
C = W log2(1 + S/N)
where C = maximum data rate of the channel (bps), W = bandwidth of the channel (Hz), and S/N = signal-to-noise ratio
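A consequence of the formula worth noting: capacity grows linearly with bandwidth but only logarithmically with SNR. A short sketch with hypothetical values (3 kHz baseline, SNR of 1000) makes the contrast concrete:

```python
import math

def capacity(w_hz: float, snr: float) -> float:
    # Shannon's Law: C = W * log2(1 + S/N)
    return w_hz * math.log2(1 + snr)

base = capacity(3000, 1000)
print(capacity(6000, 1000) / base)   # doubling bandwidth doubles capacity
print(capacity(3000, 2000) / base)   # doubling SNR gains only ~10% here
```

This is why, in practice, adding bandwidth is a far more effective way to raise channel capacity than boosting transmit power.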
