information, selective information, entropy (noun)
(communication theory) a numerical measure of the uncertainty of an outcome
"the signal contained thousands of bits of information"
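The information-theoretic sense above can be sketched numerically. The following is a minimal illustration (the function name is my own, not from the source) of Shannon's formula H = -Σ p·log₂(p), which measures uncertainty in bits:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete distribution: H = -sum(p * log2(p)).
    Terms with p == 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries one bit of uncertainty per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

A uniform distribution maximizes this measure, which matches the gloss "a numerical measure of the uncertainty of an outcome."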
randomness, entropy, S (noun)
(thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work
"entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
A measure of the amount of information and noise present in a signal. Originally a tongue-in-cheek coinage, it has fallen into disuse to avoid confusion with thermodynamic entropy.
The tendency of a system that is left to itself to descend into chaos.
Origin: First attested in 1868. From German Entropie, coined in 1865 by Rudolf Clausius, from Ancient Greek ἐντροπία ("a turning towards"), from ἐν ("in") + τροπή ("a turning").
a certain property of a body, expressed as a measurable quantity, such that when there is no communication of heat the quantity remains constant, but when heat enters or leaves the body the quantity increases or diminishes. If a small amount, h, of heat enters the body when its temperature is t in the thermodynamic scale the entropy of the body is increased by h / t. The entropy is regarded as measured from some standard temperature and pressure. Sometimes called the thermodynamic function
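The increment rule in the definition above (heat h entering at temperature t raises the entropy by h / t) can be sketched directly. This is a minimal illustration with names of my own choosing, not from the source:

```python
def entropy_increase(h, t):
    """Entropy gained when a small amount of heat h (in joules) enters a
    body held at absolute temperature t (in kelvin): dS = h / t."""
    return h / t

# 100 J of heat entering a body at 300 K raises its entropy by 1/3 J/K.
print(entropy_increase(100.0, 300.0))
```

The same transfer at a lower temperature produces a larger entropy increase, which is why heat flowing from hot to cold raises the total entropy of the pair.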
Origin: [Gr. ἐντροπία a turning in; ἐν in + τροπή a turn, fr. τρέπειν to turn.]
Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, the state of maximum entropy. Entropy is a thermodynamic quantity that helps to account for the flow of energy through a thermodynamic process.

Entropy was originally defined for a thermodynamically reversible process as ΔS = ∫ dQ_rev / T, where the entropy change is found from an incremental reversible transfer of heat into a closed system, divided by the uniform thermodynamic temperature of that system. This definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic picture of the contents of the system. In thermodynamics, entropy has been found to be more generally useful, and it has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state. Entropy is an extensive property, but it is often given as the intensive property of specific entropy: entropy per unit mass or entropy per mole.

In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. The role of thermodynamic entropy in various thermodynamic processes can thus be understood by examining how and why that information changes as the system evolves from its initial condition. It is often said that entropy is an expression of the disorder or randomness of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics via the modern definition of entropy.
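The Clausius integral ΔS = ∫ dQ_rev / T discussed above has a simple closed form for a body heated reversibly at (assumed) constant specific heat, since dQ = m·c·dT gives ΔS = m·c·ln(T₂/T₁). The sketch below uses names and example figures of my own choosing, not from the source:

```python
import math

def heating_entropy_change(mass_kg, specific_heat, t1, t2):
    """Entropy change of a body heated reversibly from t1 to t2 (kelvin),
    assuming constant specific heat c in J/(kg*K):
    integral of dQ_rev / T with dQ = m * c * dT  ->  m * c * ln(t2 / t1)."""
    return mass_kg * specific_heat * math.log(t2 / t1)

# 1 kg of water (c ~ 4186 J/(kg*K)) heated from 293 K to 373 K:
# the entropy increases, as it must for heat flowing into the body.
print(heating_entropy_change(1.0, 4186.0, 293.0, 373.0))
```

The result is positive for heating and negative for cooling, and it depends only on the end states, illustrating the remark that entropy "behaves as a function of state."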
Chambers 20th Century Dictionary
en′trop-i, n. a term in physics signifying 'the available energy.'
U.S. National Library of Medicine
The measure of that part of the heat or energy of a system which is not available to perform work. Entropy increases in all natural (spontaneous and irreversible) processes. (From Dorland, 28th ed)
The Standard Electrical Dictionary
Non-available energy. As energy may in some way or other be generally reduced to heat, it will be found that the equalizing of temperature, actual and potential, in a system, while it leaves the total energy unchanged, makes it all unavailable, because all work represents a fall in degree of energy or a fall in temperature. But in a system such as described no such fall could occur, therefore no work could be done. The universe is obviously tending in that direction. On the earth the exhaustion of coal is in the direction of degradation of its high potential energy, so that the entropy of the universe tends to zero. (See Energy, Degradation of.) [Transcriber's note: Entropy (disorder) INCREASES, while AVAILABLE ENERGY tends to zero.]
Translations for Entropy
From our Multilingual Translation Dictionary
- Arabic: غير قادر علي
- Persian: آنتروپی, واحد اندازه‌گیری ترمودینامیک
- Latin: entropia, entropy