# Entropy vs. Enthalpy

**Main Difference**

Thermodynamics is the branch of physics that deals with heat and related phenomena, including the relation of heat to other forms of energy such as electrical, mechanical, or chemical energy. Entropy and enthalpy are two well-known thermodynamic quantities. Entropy is the measurement of the disorder, or randomness, in a system during a chemical process, whereas enthalpy measures the heat change, or internal-energy change, of a system during a chemical reaction at constant pressure. Enthalpy reflects the total energy of the system, although in practice only the change in enthalpy at constant pressure is reported, since the absolute enthalpy of a system cannot be measured. Specific enthalpy is measured in joules per kilogram (J/kg). Entropy, by contrast, is a measure of random activity, usually the amount of disorder in the system; the SI unit for entropy (S) is the joule per kelvin (J/K).

**What is Entropy?**

Entropy is the measurement of the disorder, or randomness, in a system during a chemical process. It is a measure of random activity, usually the amount of disorder in the system. The energy in a body describes its capability to do work, and that energy can be of any type: mechanical, chemical, thermal, nuclear, or another form. Entropy measures how the disorder of that energy changes during the chemical process. Entropy is denoted by a capital 'S', and in an equation the change in entropy during the process is written as 'ΔS'. The SI unit for entropy (S) is the joule per kelvin (J/K). The temperature in the entropy equation is measured on the absolute, or Kelvin, temperature scale.
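As a minimal sketch of the relation behind 'ΔS' (the formula S = q/T that appears later in the comparison chart), the following Python snippet computes the entropy change for heat absorbed reversibly at a fixed Kelvin temperature. The function name and the numbers are illustrative assumptions, not values from the article:

```python
# Illustrative sketch: entropy change for a reversible, isothermal heat
# transfer, using dS = q_rev / T with T on the absolute Kelvin scale.

def entropy_change(q_joules: float, temp_kelvin: float) -> float:
    """Return dS in J/K for heat q (J) absorbed reversibly at temperature T (K)."""
    if temp_kelvin <= 0:
        raise ValueError("Temperature must be positive on the Kelvin scale")
    return q_joules / temp_kelvin

# Melting 1 mol of ice at 273.15 K absorbs roughly 6010 J (textbook figure):
delta_s = entropy_change(6010.0, 273.15)
print(f"dS = {delta_s:.1f} J/K")  # ~22.0 J/K
```

Note that dividing by the Kelvin temperature is why the unit comes out as joules per kelvin, matching the SI unit given above.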

## Difference Between Entropy and Enthalpy

#### Entropy (noun)

Strictly, thermodynamic entropy: a measure of the amount of energy in a physical system that cannot be used to do work.

*The thermodynamic free energy is the amount of work that a thermodynamic system can perform; it is the internal energy of a system minus the amount of energy that cannot be used to perform work. That unusable energy is given by the entropy of a system multiplied by the temperature of the system.*

^{[http://en.wikipedia.org/wiki/Thermodynamic_free_energy]}

(Note that, for both Gibbs and Helmholtz free energies, temperature is assumed to be fixed, so entropy is effectively directly proportional to useless energy.)

#### Enthalpy (noun)

In thermodynamics, a measure of the heat content of a chemical or physical system.

*$H = U + pV$, where H is enthalpy, U is internal energy, p is pressure, and V is volume.*

#### Entropy (noun)

A measure of the disorder present in a system.

*Ludwig Boltzmann defined entropy as being directly proportional to the natural logarithm of the number of microstates yielding an equivalent thermodynamic macrostate (with the eponymous constant of proportionality). Assuming (by the fundamental postulate of statistical mechanics), that all microstates are equally probable, this means, on the one hand, that macrostates with higher entropy are more probable, and on the other hand, that for such macrostates, the quantity of information required to describe a particular one of its microstates will be higher. That is, the Shannon entropy of a macrostate would be directly proportional to the logarithm of the number of equivalent microstates (making it up). In other words, thermodynamic and informational entropies are rather compatible, which shouldn't be surprising since Claude Shannon derived the notation 'H' for information entropy from Boltzmann's H-theorem.*
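The proportionality described above can be sketched in a few lines of Python. This is an illustrative demonstration (the function names are my own, not from the article): Boltzmann's S = k·ln W alongside the Shannon entropy of W equally probable microstates, which comes out as log₂ W bits.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact value in the 2019 SI)

def boltzmann_entropy(num_microstates: int) -> float:
    """S = k_B * ln(W) for W equally probable microstates, in J/K."""
    return k_B * math.log(num_microstates)

def shannon_entropy(probs) -> float:
    """H = -sum(p_i * log2(p_i)), in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# W equally probable microstates: the Shannon entropy is log2(W) bits,
# directly proportional to the Boltzmann entropy, as the text describes.
W = 1024
print(boltzmann_entropy(W))        # k_B * ln(1024), in J/K
print(shannon_entropy([1/W] * W))  # 10.0 bits, since 2**10 == 1024
```

Both quantities grow with the logarithm of the number of microstates; only the constant of proportionality (and hence the unit) differs.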

#### Enthalpy (noun)

(thermodynamics) a thermodynamic quantity equal to the internal energy of a system plus the product of its volume and pressure;

*enthalpy is the amount of energy in a system capable of doing mechanical work*

#### Entropy (noun)

The capacity factor for thermal energy that is hidden with respect to temperature [http://arxiv.org/pdf/physics/0004055].

#### Entropy (noun)

The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature. [https://web.archive.org/web/20060702234316/http://www.entropysite.com/students_approach.html]

#### Entropy (noun)

A measure of the amount of information and noise present in a signal.

#### Entropy (noun)

(uncountable) The tendency of a system that is left to itself to descend into chaos.

#### Entropy (noun)

(communication theory) a numerical measure of the uncertainty of an outcome;

*the signal contained thousands of bits of information*

#### Entropy (noun)

(thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work;

*entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity*

## Comparison Chart

| | Entropy | Enthalpy |
| --- | --- | --- |
| Definition | The measurement of the disorder or the randomness in the system during the chemical process. | Measures the heat change or internal-energy change of a system during the chemical reaction under constant pressure. |
| SI unit | joule per kelvin (J/K) | joule (J); specific enthalpy is measured in joules per kilogram (J/kg) |
| Formula | S = q/T | H = U + PV |

**What is Enthalpy?**

Enthalpy measures the heat change, or internal-energy change, of a system during a chemical reaction at constant pressure. Enthalpy reflects the total energy of the system, although only the change in enthalpy at constant pressure is reported, since the absolute enthalpy of a system cannot be measured. The change in enthalpy is denoted 'ΔH' and gives the difference between the enthalpy of the products and the enthalpy of the reactants. It is usually measured in units of J mol^{-1}, while specific enthalpy is measured in joules per kilogram. The word enthalpy is derived from the Greek 'enthalpos', meaning 'to put heat into', and was coined by the Dutch physicist Heike Kamerlingh Onnes.
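The defining formula H = U + PV from the glossary entries above can be checked numerically. The sketch below uses one mole of a monatomic ideal gas at 298.15 K and 1 atm as an illustrative assumption (these numbers are not from the article); for such a gas the result should equal (5/2)RT:

```python
R = 8.314  # J/(mol*K), molar gas constant

def enthalpy(internal_energy_j: float, pressure_pa: float, volume_m3: float) -> float:
    """H = U + pV, all quantities in SI units (J, Pa, m^3)."""
    return internal_energy_j + pressure_pa * volume_m3

T = 298.15       # K
p = 101325.0     # Pa (1 atm)
U = 1.5 * R * T  # J: internal energy of 1 mol of a monatomic ideal gas
V = R * T / p    # m^3: ideal-gas molar volume at T and p

H = enthalpy(U, p, V)
print(f"H = {H:.0f} J per mole")  # (3/2)RT + RT = (5/2)RT, about 6197 J
```

Because pV = RT for one mole of an ideal gas, the pV term adds exactly RT on top of the internal energy, which is why the enthalpy and internal-energy changes of a reaction differ only by the pressure-volume work.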

**Entropy vs. Enthalpy**

- Entropy is the measurement of the disorder or the randomness in the system during the chemical process, whereas enthalpy measures the heat change or internal energy change of a system during the chemical reaction under constant pressure.
- Actually, enthalpy is the measure of total energy in the system, whereas entropy is the measure of random activity, which is usually the amount of disorder in the system.
- The word 'enthalpy' for this measure was coined by the Dutch physicist Heike Kamerlingh Onnes; on the other hand, the German physicist Rudolf Clausius coined the word 'entropy'.
