- Entropy is defined as a state of disorder or decline into disorder.
- An example of entropy is a stock market in chaos, moving unpredictably and without discernible order.

## entropy

noun

- a thermodynamic measure of the amount of energy unavailable for useful work in a system undergoing change
- a measure of the degree of disorder in a substance or a system: entropy always increases and available energy diminishes in a closed system, such as the universe
- in information theory, a measure of the information content of a message evaluated as to its uncertainty
- a process of degeneration marked variously by increasing degrees of uncertainty, disorder, fragmentation, chaos, etc.; specif., such a process regarded as the inevitable, terminal stage in the life of a social system or structure

Origin of entropy

German *Entropie*, arbitrary use (by R. J. E. Clausius, 1822-88, German physicist) of Classical Greek *entropē*, a turning toward, as if from German *en(ergie)*, energy + Classical Greek *tropē*, a turning: see trope

*Related Forms:*

- entropic *adjective*

## entropy

noun

*pl.* **en·tro·pies**

*Symbol* **S**

- For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
- A measure of the disorder or randomness in a closed system.
- A measure of the loss of information in a transmitted message.
- The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
- Inevitable and steady deterioration of a system or society.

Origin of entropy

German *Entropie*: Greek *en-*, *in*; see **en–**^{2} + Greek *tropē*, *transformation*; see *trep-* in Indo-European roots.

*Related Forms:*

**en·tro′pic** *adjective*

**en·tro′pi·cal·ly** *adverb*

## entropy

(*countable and uncountable*, *plural* entropies)

- (thermodynamics, countable)
- strictly thermodynamic entropy. A measure of the amount of energy in a physical system that cannot be used to do work.
*The thermodynamic free energy is the amount of work that a thermodynamic system can perform; it is the internal energy of a system minus the amount of energy that cannot be used to perform work. That unusable energy is given by the entropy of a system multiplied by the temperature of the system.*^{[1]}*(Note that, for both Gibbs and Helmholtz free energies, temperature is assumed to be fixed, so entropy is effectively directly proportional to useless energy.)*

- A measure of the disorder present in a system.
*Ludwig Boltzmann defined entropy as being directly proportional to the natural logarithm of the number of microstates yielding an equivalent thermodynamic macrostate (with the eponymous constant of proportionality). Assuming (by the fundamental postulate of statistical mechanics) that all microstates are equally probable, this means, on the one hand, that macrostates with higher entropy are more probable, and on the other hand, that for such macrostates, the quantity of information required to describe a particular one of its microstates will be higher. That is, the Shannon entropy of a macrostate would be directly proportional to the logarithm of the number of equivalent microstates (making it up). In other words, thermodynamic and informational entropies are rather compatible, which shouldn't be surprising since Claude Shannon derived the notation 'H' for information entropy from Boltzmann's H-theorem.* *(The standard formulas are sketched after this list.)*

- The capacity factor for thermal energy that is hidden with respect to temperature [2].
- The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature. [3]

- (statistics, information theory, countable) A measure of the amount of information and noise present in a signal. Originally a tongue-in-cheek coinage, the term has fallen into disuse to avoid confusion with thermodynamic entropy.
- (uncountable) The tendency of a system that is left to itself to descend into chaos.
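
The explanations quoted above lean on a few standard relationships. As a brief sketch in textbook notation (not part of the original entries; here U is internal energy, H is enthalpy, T is absolute temperature, S is entropy, k_B is Boltzmann's constant, W is the number of microstates, and p_i is the probability of symbol i):

```latex
% Helmholtz and Gibbs free energies: the work a system can perform is its
% internal energy (or enthalpy) minus the energy made unavailable by entropy.
F = U - TS \qquad\qquad G = H - TS

% Boltzmann's statistical definition: entropy is proportional to the
% logarithm of the number of microstates consistent with a macrostate.
S = k_B \ln W

% Shannon's information entropy, whose form mirrors Boltzmann's expression.
H = -\sum_i p_i \log_2 p_i
```

At fixed temperature the unusable term TS grows in direct proportion to S, which is the sense in which the first quotation calls entropy "effectively directly proportional to useless energy."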

First attested in 1868. From German *Entropie*, coined in 1865 by Rudolph Clausius, from Ancient Greek *ἐντροπία* (entropia, “a turning towards”), from *ἐν* (en, “in”) + *τροπή* (tropē, “a turning”).

## entropy - Computer Definition

- In physics, and particularly in the area of thermodynamics, a measure of the amount of energy unavailable to do work in a closed system.
- The degradation of the matter and energy in the universe to the point of inert uniformity.
- The dispersal of energy.
- In information theory, a measure of the content of a message evaluated with respect to its probability of occurrence, or uncertainty of occurrence, depending on your perspective.
- In communications, a measure of the randomness of signal noise occurring in transmission.

Disorder or randomness. In data compression, it is a measure of the amount of non-redundant, non-compressible data in an object (the amount that is not similar to other data in the object). In encryption, it is the amount of disorder or randomness that is added. In software, it is the disorder and jumble of the program's logic that accumulates after it has been modified over and over. See encryption algorithm.
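
As a rough illustration of the data-compression sense above, the following sketch (plain Python, standard library only; the function name and sample inputs are illustrative) estimates the Shannon entropy of a byte string. Repetitive data scores near 0 bits per byte, while random data approaches the maximum of 8.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Estimate the entropy of `data` in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    total = len(data)
    counts = Counter(data)  # frequency of each byte value 0-255
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

print(shannon_entropy(b"aaaaaaaaaaaaaaaa"))                              # 0.0: fully redundant
print(shannon_entropy(b"the quick brown fox jumps over the lazy dog"))   # roughly 4 bits per byte
print(shannon_entropy(os.urandom(4096)))                                 # ~7.9: essentially incompressible
```

Under this simple byte-frequency model, the measured figure is a lower bound on how far a lossless compressor can shrink the data, which is why low entropy signals redundancy and high entropy (as in encrypted or already-compressed data) signals incompressibility.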