
Entropy and Thermodynamics

Read this detailed article on entropy and thermodynamics: their meaning, examples, discovery, the laws of thermodynamics, and entropy and the heat death of the universe.

Entropy, a central quantity of thermodynamics, measures the degree of disorder in a physical or biological system. It is a measurable physical property closely associated with uncertainty, disorder, and randomness. The term and the concept are used in many fields, from classical thermodynamics to the microscopic description of nature in statistical physics and the principles of information theory. Entropy finds applications in chemistry, physics, biological systems and their relation to life, cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunications. If the entropy of a system is very high, we cannot obtain adequate information about that system; information can therefore be regarded as a form of negative entropy.

Discovery of entropy and thermodynamics 

In 1850, the quantity now called entropy was referred to as the ‘thermodynamic function’ and ‘heat-potential’ by Macquorn Rankine, a Scottish scientist and engineer. Then in 1865, the German physicist Rudolf Clausius, one of the founders of thermodynamics, defined it as the quotient of an infinitesimal amount of heat transferred to the instantaneous absolute temperature.
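
In modern notation, Clausius's definition is usually written as follows (a textbook restatement, not a quotation from Clausius), where \delta Q_{\mathrm{rev}} is an infinitesimal amount of heat absorbed reversibly and T is the absolute temperature:

dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}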

The Austrian physicist Ludwig Boltzmann then explained the statistical meaning of entropy. According to him, entropy measures the number of microscopic arrangements, or individual states of atoms and molecules, that are consistent with the system’s macroscopic condition. He thus introduced the concepts of statistical disorder and probability distributions into a new branch of thermodynamics called statistical mechanics, and expressed the connection between the microscopic configurations, which keep fluctuating around an average configuration, and the macroscopic entropy as a simple logarithmic law with a proportionality constant now called the Boltzmann constant. The Boltzmann constant is one of the defining universal constants of the modern International System of Units (SI).
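
Boltzmann's logarithmic law is conventionally written as below, where W is the number of microstates consistent with the macrostate and k_B is the Boltzmann constant (whose value is fixed exactly in the SI):

S = k_B \ln W, \qquad k_B = 1.380649 \times 10^{-23}\ \mathrm{J\,K^{-1}}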

In 1948, Claude Shannon applied similar statistical concepts to measure the uncertainty and multiplicity of the problem of random losses of information in telecommunication signals. At John von Neumann’s suggestion, Shannon named this quantity of missing information ‘entropy’, by analogy with statistical mechanics, giving birth to the field of information theory. This definition is now widely recognised as a universal definition of the concept of entropy.
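
As an illustrative sketch (not part of Shannon's original work), the Shannon entropy of a discrete probability distribution can be computed in a few lines of Python; the function name shannon_entropy is chosen here only for clarity:

import math

def shannon_entropy(probs):
    # Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits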

One consequence of entropy is that some processes are irreversible. The concept of entropy is central to the second law of thermodynamics: the entropy of an isolated system left to evolve spontaneously cannot decrease with time, because the system tends towards a state of thermodynamic equilibrium, where entropy is highest.

According to Alfred Wehrl, entropy relates the macroscopic and microscopic aspects of nature and determines the behaviour of macroscopic systems, or real matter, in equilibrium or close to equilibrium. Entropy is a measure of the amount of chaos in a microscopic system. The concept of entropy first arose in thermodynamics, the study of energy: the ability to do work and the conversion between forms of energy such as the internal energy of a system, heat, and work. The laws of thermodynamics can also be derived from statistical mechanics. There are three laws of thermodynamics. According to the first law, energy can neither be created nor destroyed; it can only be changed from one form to another. One formulation of the first law states that the amount of heat flowing into a system equals the sum of the change in its internal energy and the work done by the system.
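
In symbols, the formulation quoted above can be written as follows (a standard textbook form), with Q the heat supplied to the system, \Delta U the change in internal energy, and W the work done by the system:

Q = \Delta U + W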

According to the second law, it is impossible to create a process whose sole effect is to extract positive heat from a reservoir and convert it entirely into positive work. An equivalent formulation states that the entropy of a closed system never decreases, whatever process occurs within it. This law shows that no heat engine can achieve one hundred percent efficiency. The second law of thermodynamics says that the total entropy of a closed system cannot decrease; however, the entropy of one part of a system can decrease if the entropy of another part increases by at least as much.
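
As an illustration of why no heat engine can be perfectly efficient, the second law is often summarised by these two standard relations, where T_h and T_c are the absolute temperatures of the hot and cold reservoirs of an ideal (Carnot) engine:

\Delta S_{\mathrm{total}} \geq 0, \qquad \eta_{\max} = 1 - \frac{T_c}{T_h} < 1 \quad (\text{since } T_c > 0)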

According to the third law of thermodynamics, the entropy of a system at absolute zero temperature is a well-defined constant. This is because most systems at zero temperature are in their ground states, and the entropy is then determined by the degeneracy of the ground state.
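
A common symbolic statement of the third law, assuming the ground state of the system has degeneracy g_0, is:

\lim_{T \to 0} S = k_B \ln g_0

which reduces to zero for a non-degenerate ground state (g_0 = 1).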

Entropy and heat death of the universe

According to some scientists, the entropy of the universe will one day increase to a point where randomness and imbalance will leave the system unable to do productive work. The universe would then be considered to have reached heat death, when only thermal energy remains. However, other scientists disagree with the heat-death theory: some say the universe as a whole moves away from this state even as regions within it increase in entropy, while others argue that the universe is part of a more extensive system.

Entropy and thermodynamics examples

A block of ice increases in entropy when it melts, and it is easy to see the disorder of the system increase. Ice is made of water molecules connected in a crystal lattice. When ice melts, the molecules gain energy, spread further apart, and lose their ordered structure to form a liquid. In the same way, the change of phase from liquid to gas, as from water to steam, increases the energy and entropy of the system. The system’s entropy can also decrease, for example when steam condenses into water or when water freezes into ice. The second law of thermodynamics is not violated because the matter is not an isolated system: the system’s entropy can decrease, but only by increasing that of its surroundings.
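
As a rough numerical sketch (not part of the original article), the molar entropy change when ice melts at 0 °C can be estimated from the molar enthalpy of fusion of water, taken here as roughly 6.01 kJ/mol:

# Approximate molar entropy of fusion of ice at its melting point,
# using Delta S = Delta H_fus / T_melt for a reversible phase change at constant T.
delta_H_fus = 6010.0   # J/mol, approximate enthalpy of fusion of water
T_melt = 273.15        # K, melting point of ice at 1 atm
delta_S_fus = delta_H_fus / T_melt
print(f"{delta_S_fus:.1f} J/(K*mol)")   # about 22.0 J/(K*mol), a positive entropy change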

Entropy and its relation to time

Entropy is often called the ‘arrow of time’ because matter in isolated systems tends to move from order to disorder.

In simple terms, entropy is the ‘state of disorder’ of a system. It is central to the second and third laws of thermodynamics, which deal, respectively, with the overall change in entropy of the system and its surroundings and with absolute entropy.

Microstates

According to dictionaries, ‘macro’ means large and ‘micro’ means small, but in thermodynamics a macrostate and a microstate of a chemical system do not refer to large and small sizes. Rather, they are two very different ways of viewing the system. A microstate is one of the very large number of different arrangements of the molecules’ motional energy that are accessible for a particular macrostate.
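
As a small worked example (not from the original text), consider four distinguishable molecules free to occupy either half of a container. The macrostate ‘two molecules in each half’ is compatible with \binom{4}{2} = 6 microstates, so by Boltzmann’s formula its entropy is:

S = k_B \ln 6 \approx 2.47 \times 10^{-23}\ \mathrm{J\,K^{-1}}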

Statistical entropy

Entropy is a state function that describes the disorder of a system. In qualitative terms, entropy is a simple measure of how much atomic and molecular energy can spread out in a process, and it can be defined in terms of statistical probability or in terms of other thermodynamic quantities.

Statistical entropy in relation to mass, energy, and freedom

The energy or matter of one portion of the universe can increase or decrease only if there is a simultaneous loss or gain in some other portion. The freedom of that specific portion can increase while the freedom of the rest of the universe stays the same, or the freedom of the rest of the universe can even decrease, but ultimately the total of the gains and losses must amount to a net increase.

Conclusion

Entropy and thermodynamics are important concepts in physics and chemistry and are also applicable to other disciplines such as cosmology and economics. In physics, entropy forms part of thermodynamics, and in chemistry it is a central concept of physical chemistry. In simple terms, it is a measure of the disorder and randomness of a system. It is an extensive property of a thermodynamic system, meaning that its value depends on the amount of matter present. Entropy is denoted by the letter S and has units of joules per kelvin (J/K). A change in entropy can be positive or negative. According to the second law of thermodynamics, the entropy of a system can decrease only if the entropy of another system increases. A highly ordered system has low entropy.


Frequently Asked Questions

Get answers to the most common queries related to the JEE Examination Preparation.

What is entropy and thermodynamics?

Ans : Entropy is a measure of the degree of disorder in a physical or biological system, and thermodynamics is the branch of science that studies energy and its conversion between forms such as heat and work.

Where does entropy find its use?

Ans : Entropy has multiple uses in chemistry, physics, biological systems and their relation to life, as well as in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunications.

Why is information a kind of negative entropy?

Ans : If the level of entropy is too high in a system, we cannot obtain adequate information about that system, so information can be regarded as a form of negative entropy.

Who introduced the 'Boltzmann Constant'?

Ans : It was introduced by an Austrian physicist called Ludwig Boltzmann.

What is an example of entropy and thermodynamics?

Ans : A block of ice increases in entropy when it melts: the water molecules in the crystal lattice gain energy, spread apart, and form a more disordered liquid.