Definitions for Entropy /ˈɛn trə pi/

This page provides all possible meanings and translations of the word Entropy.

Random House Webster's College Dictionary

en•tro•py /ˈɛn trə pi/ (n.)

  1. a function of thermodynamic variables, as temperature or pressure, that is a measure of the energy that is not available for work in a thermodynamic process.

    Category: Thermodynamics

    Ref: Symbol: S

  2. (in data transmission and information theory) a measure of the loss of information in a transmitted signal.

    Category: Thermodynamics, Computers

  3. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature.

    Category: Thermodynamics

  4. a state of disorder, as in a social system, or a hypothetical tendency toward such a state.

    Category: Sociology

Origin of entropy:

< G Entropie (1865); see en-2, -tropy

en•tro′pi•cal•ly (adv.)

Princeton's WordNet

  1. information, selective information, entropy (noun)

    (communication theory) a numerical measure of the uncertainty of an outcome (see the sketch after these entries)

    "the signal contained thousands of bits of information"

  2. randomness, entropy, S (noun)

    (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work

    "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"

Wiktionary

  1. entropy (Noun)

    A measure of the amount of information and noise present in a signal. Originally a tongue-in-cheek coinage, this sense has fallen into disuse to avoid confusion with thermodynamic entropy.

  2. entropy (Noun)

    The tendency of a system that is left to itself to descend into chaos.

  3. Origin: First attested in 1868. From German Entropie, coined in 1865 by Rudolf Clausius, from Ancient Greek ἐντροπία ("a turning toward"), from ἐν ("in") + τροπή ("turning").

Webster Dictionary

  1. Entropy (noun)

    a certain property of a body, expressed as a measurable quantity, such that when there is no communication of heat the quantity remains constant, but when heat enters or leaves the body the quantity increases or diminishes. If a small amount, h, of heat enters the body when its temperature is t on the thermodynamic scale, the entropy of the body is increased by h / t. The entropy is regarded as measured from some standard temperature and pressure. Sometimes called the thermodynamic function.
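
A hedged numeric illustration of the h / t rule above (the quantities are invented for the example; a real calculation would use the absolute temperature in kelvins):

    def entropy_increase(heat_joules: float, temp_kelvin: float) -> float:
        """Webster's rule: a small heat input h entering a body at absolute
        temperature t increases the body's entropy by h / t (joules per kelvin)."""
        return heat_joules / temp_kelvin

    # 300 J of heat entering a body held at 300 K raises its entropy by 1 J/K.
    print(entropy_increase(300.0, 300.0))  # 1.0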

Freebase

  1. Entropy

    Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, which is the state of maximum entropy. Entropy is a thermodynamic quantity that helps to account for the flow of energy through a thermodynamic process.

    Entropy was originally defined for a thermodynamically reversible process as dS = δQ_rev / T: the change in entropy is found from an incremental reversible transfer of heat into a closed system (δQ_rev) divided by the uniform thermodynamic temperature (T) of that system. This definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic picture of the contents of a system. In thermodynamics, entropy has been found to be more generally useful, and it has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state. Entropy is an extensive property, but it is often given as the intensive property of specific entropy, as entropy per unit mass or entropy per mole.

    In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. The role of thermodynamic entropy in various thermodynamic processes can thus be understood by understanding how and why that information changes as the system evolves from its initial condition. It is often said that entropy is an expression of the disorder, or randomness, of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics via the modern definition of entropy.
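
The microscopic interpretation mentioned above can also be sketched in a few lines. This is an illustrative assumption, not part of the Freebase entry: it uses the Gibbs formula S = -k_B * sum(p * ln p), which reduces to Boltzmann's S = k_B * ln(Omega) when all Omega microstates are equally likely.

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(probabilities) -> float:
        """Statistical-mechanics entropy S = -k_B * sum(p * ln p)."""
        return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

    # With Omega equally likely microstates this recovers S = k_B * ln(Omega).
    omega = 4
    print(gibbs_entropy([1.0 / omega] * omega))  # ~1.914e-23 J/K
    print(K_B * math.log(omega))                 # same value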

U.S. National Library of Medicine

  1. Entropy

    The measure of that part of the heat or energy of a system which is not available to perform work. Entropy increases in all natural (spontaneous and irreversible) processes. (From Dorland, 28th ed)

The Standard Electrical Dictionary

  1. Entropy

    Non-available energy. As energy may in some way or other be generally reduced to heat, it will be found that the equalizing of temperature, actual and potential, in a system, while it leaves the total energy unchanged, makes it all unavailable, because all work represents a fall in degree of energy or a fall in temperature. But in a system such as described no such fall could occur, therefore no work could be done. The universe is obviously tending in that direction. On the earth the exhaustion of coal is in the direction of degradation of its high potential energy, so that the entropy of the universe tends to zero. (See Energy, Degradation of.) [Transcriber's note: Entropy (disorder) INCREASES, while AVAILABLE ENERGY tends to zero.]
