In designing new thermodynamic systems, researchers broadly turn to the second law of thermodynamics to calculate how their performance can be optimised. Yet according to Dr Yousef Haseli at Central Michigan University, USA, existing conceptions of the second law which underlie these calculations are often misleading. Through a series of recent studies, Dr Haseli presents real examples of cases where minimising entropy generation doesn’t necessarily lead to the best possible performance. His results could have important implications for the design of modern technologies, including sources of clean, renewable energy.
Thermodynamics provides us with a robust picture of how physical systems are governed by the flow and exchange of heat and mechanical work, and by differences in temperature. While its first law demands that energy must always be conserved as it is converted into different forms, the second law is particularly central to our understanding of how energy is converted and, importantly, how systems involving these conversions can best be exploited in the latest technologies.
Dr Yousef Haseli from Central Michigan University explains: ‘the second law is founded on the observation that it is impossible to convert thermal energy to mechanical energy, without wasting a portion of the supplied thermal energy’. In other words, the efficiency of a heat engine can never be 100%.
In the 19th century, calculations involving the second law led physicists to conceive the idea of ‘entropy’: a quantity which increases with the amount of thermal energy carried by a system. When energy is converted into heat within an isolated system, one which doesn’t exchange energy with any other system, the second law demands that the waste heat produced must increase the system’s overall entropy, and with it the entropy of the universe.
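This entropy bookkeeping can be made concrete with a minimal numerical sketch (the temperatures and heat quantity below are illustrative assumptions, not figures from Haseli’s work), using the Clausius relation that a body receiving heat Q at temperature T changes its entropy by Q/T:

```python
# Minimal sketch (illustrative numbers): entropy bookkeeping for heat Q
# flowing spontaneously from a hot body to a cold body.
# Clausius relation: entropy change = Q / T for heat received at temperature T.

Q = 1000.0                    # heat transferred, J
T_hot, T_cold = 500.0, 300.0  # body temperatures, K

dS_hot = -Q / T_hot    # hot body loses entropy
dS_cold = +Q / T_cold  # cold body gains more entropy than the hot body lost
dS_total = dS_hot + dS_cold

# The total is positive, as the second law demands for a spontaneous process.
print(f"Total entropy change: {dS_total:.3f} J/K")
```

Because the cold body receives the same heat at a lower temperature, its entropy gain always outweighs the hot body’s loss, so the total can never be negative for heat flowing ‘downhill’.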
This principle is now central to the design of engines, refrigerators, and heat exchangers, as well as studies of chemical reactions and their applications. Since this picture has become broadly accepted by physicists, the reduction of entropy production is now widely recognised as an important goal. But according to Haseli, we may now need to assess this idea more closely.
Erroneous assumptions about entropy generation
In systems with higher thermal energy, the movements and positions of constituent atoms and molecules become harder to predict. This means that under its current descriptions, entropy is often treated as synonymous with words like randomness, disorder, and chaos. Alternatively, entropy can be described as a dispersal of energy through space, or through the more philosophical concept of an ‘arrow of time’ – where on macroscopic scales, physical systems wouldn’t appear the same if the one-directional flow of time were somehow reversed.
Yet despite the central importance of these concepts in our understanding of thermal processes, Haseli argues that the diversity of explanations they provide has prevented physicists from agreeing on a clear definition. ‘Even today, entropy remains a grey area, and we may find a variety of interpretations’, he says. ‘The phenomenon of entropy production has often been regarded erroneously as a measure of “imperfectness” in a thermal process.’
Ultimately, none of these definitions is truly aligned with the initial description of entropy, as the transformational content of a body, which was set out by German physicist Rudolf Clausius in the 1850s. This description gave rise to the term entropy itself – deriving from the Greek entropē – literally ‘in transformation’. In simply equating an increase in entropy with an increase in the imperfection of a thermal process, researchers aiming to optimise thermodynamic systems have made it a central goal to minimise entropy production, so that the transfer of energy within these systems becomes as close to reversible as possible.
Through three of his latest studies, Haseli has called into question whether this irreversibility should really be used to assess performances of thermodynamic processes in this way. He points out that in the latest studies based on the second law of thermodynamics, entropy production is very often viewed as a vital quantity and is frequently calculated without questioning why its calculation might be useful. For Haseli, a more sophisticated approach is needed.
Reassessing the need for entropy calculations
In his 2018 paper, Haseli discusses a theoretical construct named a ‘Carnot engine’. This idealised concept describes the transfer of heat between two infinitely large hot and cold reservoirs, and places an upper limit on the efficiency of any thermodynamic system that converts heat into work or, conversely, any refrigeration system that generates temperature differences by applying mechanical work to the system.
Here, Haseli explores the applicability of this description to fuel cells, which produce electricity using controlled chemical reactions. By applying the relevant equations for the first and second laws of thermodynamics, his analysis has led to an important result: ‘The maximum efficiency of a fuel cell set by the second law is different than and may exceed that of a Carnot engine operating between the same low and high temperatures’, Haseli describes.
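The gap between the two limits can be illustrated with a short sketch (a simplified illustration with standard textbook values, not a calculation from Haseli’s paper): an ideal hydrogen fuel cell’s second-law limit is the ratio of the reaction’s Gibbs energy change to its enthalpy change, which can be compared against the Carnot limit between the same temperatures:

```python
# Illustrative sketch (standard-state textbook values, not from Haseli's paper):
# second-law efficiency limit of an ideal hydrogen fuel cell versus the
# Carnot limit between the same cold and hot temperatures.

# H2 + 1/2 O2 -> H2O(l) at 298.15 K (higher heating value basis)
dH = -285.83e3  # reaction enthalpy change, J/mol
dG = -237.13e3  # Gibbs energy change, J/mol

# Ideal fuel cell limit: fraction of the reaction enthalpy available as work
eta_fuel_cell = dG / dH
print(f"Fuel cell limit: {eta_fuel_cell:.1%}")  # about 83%

def eta_carnot(t_cold, t_hot):
    """Carnot efficiency between two reservoir temperatures (in kelvin)."""
    return 1.0 - t_cold / t_hot

# A Carnot engine between 298 K and a modest 353 K hot side reaches only
# about 16%, far below the fuel cell limit over the same temperature span.
print(f"Carnot limit:    {eta_carnot(298.0, 353.0):.1%}")
```

The comparison makes the point plainly: because a fuel cell converts chemical energy to work directly rather than via a heat engine cycle, the Carnot expression simply isn’t the relevant bound.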
Previously, studies had suggested that the efficiency of an ideal fuel cell declines at higher temperatures. In contrast, Haseli shows that while the fuel cell’s maximum efficiency does decline with temperature at first, it then plateaus as temperature continues to increase, before starting to rise. In this case, therefore, the type of Carnot engine considered in previous studies clearly isn’t suitable for assessing the ideal performance of fuel cells.
Misconceptions surrounding chemical equilibrium
Building on this analysis, Haseli next carried out a survey of previous studies which made a key assumption of reversible heat exchanges between chemical reactions and their surrounding environments. Under this assumption, the maximum possible entropy is generated once a reaction reaches chemical equilibrium. This description of ‘chemical equilibrium’ was first set out by American physicist J Willard Gibbs in the 1870s.
Today, Gibbs’ criterion is still widely used to study the flow of energy in and around chemical systems. But, as Haseli points out, these theories incorporate the second law of thermodynamics without properly justifying its use. In many previous studies, this oversight has led to significant discrepancies between calculations and experimental results.
To illustrate this, Haseli focuses on a technique named ‘steam reforming’, where methane reacts with steam in the presence of a catalyst. This produces a mixture of products, including pure molecular hydrogen. ‘Methane steam reforming is used as an example to show that the composition at chemical equilibrium predicted by kinetic modelling is different than that obtained through maximising entropy production’, Haseli explains.
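As a rough numerical illustration (the thermochemical values below are approximate textbook figures assumed for this sketch, not data from Haseli’s paper), the equilibrium constant of the reforming reaction CH4 + H2O ⇌ CO + 3H2 can be estimated from its standard Gibbs energy change, taking the reaction enthalpy and entropy as roughly constant over the temperature range:

```python
import math

# Rough sketch (approximate textbook values, not from Haseli's paper):
# equilibrium constant of CH4 + H2O <=> CO + 3 H2 versus temperature,
# using K = exp(-dG / (R*T)) with dG estimated as dH - T*dS
# (dH and dS assumed constant over the range).

R = 8.314      # gas constant, J/(mol K)
dH = 206.2e3   # standard reaction enthalpy near 298 K, J/mol (endothermic)
dS = 214.7     # standard reaction entropy, J/(mol K), approximate

def equilibrium_constant(T):
    """Estimate the equilibrium constant at temperature T (in kelvin)."""
    dG = dH - T * dS
    return math.exp(-dG / (R * T))

for T in (600.0, 800.0, 1000.0):
    print(f"T = {T:6.0f} K   K = {equilibrium_constant(T):.3g}")
```

Under these assumed values, K grows by many orders of magnitude with temperature and only exceeds 1 near the crossover at T ≈ ΔH/ΔS, around 960 K, which is why reforming is run at high temperature. Predicting the actual equilibrium composition, as in Haseli’s comparison with kinetic modelling, requires tracking all species rather than this single reaction constant.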
As a result, a state of chemical equilibrium won’t necessarily correspond to the maximum possible generation of entropy. Instead, once chemical equilibrium has been established, the total entropy generation will remain unaltered over time. This result could have important implications for next-generation technologies, which use hydrogen fuel to power gas turbines, fuel cells, and vehicles, with zero emissions of greenhouse gases.
New guidelines for entropy analysis
Further important scenarios can be found in a wide array of systems where the conversion of heat into mechanical work isn’t the primary goal. In his latest study, Haseli again points out how traditional approaches to the consideration of entropy production can’t be applied universally. ‘The article aims to provide guidelines on how entropy analysis should be conducted in thermal engineering to achieve conclusive results’, he says.
To demonstrate this, Haseli explores four different case studies: the evaporation of moisture and volatile compounds from biomass like wood and leaves, to improve their performance as combustible fuels; the use of electrical energy to separate air into pure oxygen and nitrogen; the design of processes which can produce electrical and thermal energy simultaneously; and the design of systems for producing pure hydrogen by splitting water.
In each of these cases, Haseli investigates the relationships between the performances of each system and the total entropy they produce. While his results confirm that the goal of minimising entropy production is often important for improving performance, this certainly isn’t always the case – ultimately meaning that irreversibility isn’t a universal indicator of imperfectness.
In the future, Haseli hopes that industries carrying out these processes, along with many others which also rely on calculations involving the second law, could improve their performance by following these guidelines. As the world transitions away from thermodynamic processes involving the combustion of fossil fuels, his results could be an important step towards ensuring the best possible performances in next-generation technologies.
- Haseli, Y (2021) Interpretation of entropy calculations in energy conversion systems. Energies, 14(21), 7022. doi.org/10.3390/en14217022
- Haseli, Y (2019) Criteria for chemical equilibrium with application to methane steam reforming. International Journal of Hydrogen Energy, 44(12), 5766–5772. doi.org/10.1016/j.ijhydene.2019.01.130
- Haseli, Y (2018) Maximum conversion efficiency of hydrogen fuel cells. International Journal of Hydrogen Energy, 43(18), 9015–9021. doi.org/10.1016/j.ijhydene.2018.03.076
Reassessing the effect of entropy for maximising the performance of thermodynamic systems.
Yousef Haseli received a PhD in mechanical engineering at Eindhoven University of Technology, in the Netherlands, followed by a postdoctoral position at MIT, USA. He holds one patent and has published two books and 40 journal articles, with more than 1,400 citations. He has received several awards, including the Gold Medal of the Governor General of Canada.
Central Michigan University
Publication: Entropy Analysis in Thermal-Engineering Systems
SciPod science podcast: Traditional Equilibrium Models Lead to Inaccurate Predictions