There is a mysterious and elusive quantity in physics, one that is not directly measurable and does not even have a single definition, because it changes depending on the framework in which it appears. We're talking about entropy (from the ancient Greek for "inner transformation"). Although entropy is an elusive concept, it is curiously ubiquitous, like an invisible dark halo that surrounds and cloaks our universe, taking on different appearances and facets from time to time. If it didn't exist, physics and our universe would be very different. Entropy has appeared throughout the history of science, in different areas and at different moments, independently: it was first defined by thermodynamics during the industrial revolution, then described by theoretical physics, until the 20th century, when it was introduced into information theory and then extended to almost every other area of science.

NOBLE ENERGY, USELESS ENERGY

In thermodynamics, as we're about to see, entropy measures the tendency of the energy in the universe to degrade and to evolve spontaneously towards messier energy forms, unusable for human activities. As a matter of fact, energy sources are not all the same: there are different types. As with food, some sources of energy are better than others. In nature, the more energy is concentrated and stored in ordered structures, the more it is exploitable. Kinetic, potential, chemical, nuclear and wind energy are all usable energy sources: they are suitable for producing the forces that generate movement and do work. Thermal energy, by contrast, is related to heat production and is considered unusable energy, because it cannot be fully converted into exploitable energy. We'll clear up why later.
ENERGY IS CONSERVED, BUT ENTROPY DEGRADES IT

During the first industrial revolution, the invention of the steam engine generated great enthusiasm, and the scientists and engineers of the time worked hard to improve the efficiency of thermal machines, in order to produce energy for industrial purposes. All these efforts were aimed at realizing a dream: building the perfect thermal machine, a device capable of converting all the absorbed heat into usable energy, in perpetual motion, with no loss. One possibility was particularly suggestive and debated: building a boat capable of absorbing heat from the sea and completely converting it into continuous propulsive energy. Physicists and engineers soon clashed with difficulties and experimental limits, which turned out to be sides of the same coin: different aspects directly related to the concept of entropy. Hopes were nipped in the bud by the British physicist William Thomson, also known as Lord Kelvin, who stated an experimental principle based on his studies on heat. This is now called the second law of thermodynamics:

"It is impossible to produce a transformation whose only result is to absorb an amount of heat and convert it completely into work."

This limit had already been glimpsed decades earlier by Sadi Carnot. The French engineer was one of the first, at the beginning of the 19th century, to systematically study the working cycles of thermal machines. He guessed that their efficiency had a limit impossible to overcome, because energy losses could never be eliminated. Carnot understood that heat and temperature were strictly related. He realized that it was precisely the differences in temperature that triggered the heat flows allowing a thermal machine to do work and produce energy.
He was among the first to perceive that the energy generated by a thermal machine depends only on its working temperatures, while it is totally independent of its technical and design features and of the substance used to perform the transformations (it could be a gas or any other substance fit for purpose). In 1850 the German physicist Rudolf Clausius, starting from Carnot's work, pointed out an asymmetry in the flow of heat: in nature, we observe that heat always flows from a hotter body to a colder one, while the reverse process never happens spontaneously. This seems obvious nowadays, but at the time there was no physical law accounting for it. Starting from this observation, Clausius stated a principle which is today an alternative version of the second law of thermodynamics:

"It is impossible to perform a transformation whose only result is to transfer heat from a colder body to a hotter body."

Clausius understood that in order to do so, energy or work from the environment is needed: otherwise, the transformation would never occur spontaneously. Think of a fridge: it can reverse the spontaneous flow of heat to cool down the food we put into it. However, in order to do so, it must be connected to a power outlet and absorb energy; otherwise the transformation would never occur, and the fridge would not be able to perform the refrigeration cycle that lets us preserve our food. A fridge is nothing more than an inverted thermal machine, and its first prototypes were designed right around that time. Carnot, then Kelvin and Clausius, but also Joule, Clapeyron and other great minds independently realized that, although heat and work are strictly related forms of energy, they are not symmetrically interchangeable. In attempting to assess the energy loss within thermal machines, Clausius wanted to define a quantity measuring how much energy degrades into heat during a transformation.
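Carnot's observation can be made concrete with a short numerical sketch (the function name and example temperatures are ours, chosen for illustration): the maximum efficiency of a reversible engine is fixed by its two working temperatures alone, 1 − Tcold/Thot, regardless of design or working substance.

```python
# Sketch of Carnot's result: the maximum efficiency of a reversible
# thermal machine depends ONLY on its two working temperatures,
# not on its design or on the substance performing the cycle.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of absorbed heat convertible into work,
    for an engine working between t_hot_k and t_cold_k (in kelvin)."""
    if t_cold_k <= 0 or t_hot_k <= t_cold_k:
        raise ValueError("temperatures must satisfy 0 < t_cold_k < t_hot_k")
    return 1.0 - t_cold_k / t_hot_k

# An engine between 500 K and 300 K can convert at most 40% of the
# absorbed heat into work, no matter how well it is built.
print(carnot_efficiency(500.0, 300.0))  # 0.4
```

Note that the efficiency would reach 1 (complete conversion of heat into work) only if the cold reservoir were at absolute zero, which connects directly to the third law discussed below.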
Thus he defined the variation of entropy of a system during a reversible process as the ratio between the heat exchanged by the system and its temperature, held constant: ΔS = Q/T. All the experimental observations and statements related to the second law can therefore be summarized by introducing entropy:

"In every real thermodynamic process, the entropy of an isolated system can only increase."

By measuring variations of entropy, it turns out that every natural process evolves spontaneously by increasing its entropy: some heat is always produced, and inevitably more and more energy becomes unusable, until it is totally degraded. Following the entropy principle, in a very distant future all the energy of the universe will be unusable: at that point the temperature will be the same everywhere, and the thermal death of the universe will occur. Heat and work are very similar forms of energy, but they are not interchangeable: they cannot be transformed symmetrically into each other. While work can always be completely converted into heat without producing any other change, the other way around is not allowed in nature. Or, better said, it would be theoretically possible only under one condition: at the temperature of absolute zero. In 1848 Lord Kelvin had expressed the concept of absolute zero as the temperature at which matter is perfectly still, without any thermal agitation. Here the third law of thermodynamics joins the game. It was formulated for the first time between 1906 and 1912 by the German chemist Walther Nernst, and for this reason it is also often called Nernst's theorem. It states that it is impossible to reach absolute zero in a thermodynamic process involving a finite number of operations. Why is it impossible to convert thermal energy into usable energy? This has a precise microscopic explanation. When a body is heated, thermal energy works by speeding up its molecules, increasing their mean kinetic energy and, consequently, the temperature.
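Clausius's definition ΔS = Q/T also makes the asymmetry of heat flow quantitative. As a sketch (function name and numbers are ours, for illustration only): when heat flows from a hot body to a cold one, the cold body's entropy gain outweighs the hot body's loss, so the total entropy increases; the reverse flow would decrease it, which is exactly what the second law forbids.

```python
# Sketch of Clausius's definition: for heat q exchanged reversibly
# at constant temperature t, the entropy change is dS = q / t.
def entropy_change(q_joules: float, t_kelvin: float) -> float:
    return q_joules / t_kelvin

# 100 J of heat flowing spontaneously from a hot body (400 K)
# to a cold one (200 K):
q = 100.0
ds_hot = entropy_change(-q, 400.0)   # hot body LOSES heat: -0.25 J/K
ds_cold = entropy_change(+q, 200.0)  # cold body GAINS heat: +0.50 J/K
print(ds_hot + ds_cold)  # +0.25 J/K > 0: allowed, total entropy increases

# The reverse flow (cold to hot) would give -0.25 J/K for the isolated
# pair of bodies: precisely the transformation the second law forbids.
```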
However, every single molecule of the body moves in a different, random direction. While the kinetic energy of a macroscopic body is, in a certain sense, directional, because all its molecules move in the same direction, thermal energy is, by contrast, a messy and non-directional form of kinetic energy. What is the deep reason why entropy spontaneously increases? Why does nature behave this way? The great theoretical physicist Ludwig Boltzmann answered the question in the 1870s, giving a probabilistic insight based on the microscopic behavior of particles.

BOLTZMANN'S INSIGHT: ENTROPY AS DISORDER

Boltzmann considered macroscopic systems (like a gas) as made up of a huge number of components, each with a position and a velocity at every moment. He realized that every system spontaneously evolves from less likely states towards more probable ones, and that the most probable physical state is equilibrium. Consider the free expansion of a gas in a container separated into two parts by a partition. At first the gas is confined in one part, but when we remove the partition, we observe that it spontaneously starts to expand until it occupies all the available space in the container. Once the expansion has started, there is no way to reverse the process: the gas will never return confined to the initial part of the container. There are no constraints preventing it from doing so, but it is extremely unlikely that such a huge number of molecules (billions and billions) all move back into the same part. It is like flipping a coin billions and billions of times and obtaining heads every time! Therefore, it doesn't happen spontaneously. It can be shown that the most likely configuration is the one with the particles distributed across the whole container. Now imagine a puzzle in a box, made up of thousands of pieces. There is only one configuration in which all the pieces together compose the complete picture.
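The coin-flip analogy can be checked directly. As a minimal sketch (the function name is ours): if each molecule independently sits in either half of the container, the chance that all N of them end up in the same half is (1/2)^N, which collapses to effectively zero long before N reaches the billions.

```python
# Probability that all N molecules spontaneously gather in one given
# half of the container -- like N fair coin flips all landing heads.
def prob_all_in_one_half(n_molecules: int) -> float:
    return 0.5 ** n_molecules

for n in (2, 10, 100):
    print(n, prob_all_in_one_half(n))

# Already at 100 molecules the probability is below 1e-30; for the
# ~1e23 molecules of a real gas it is indistinguishable from zero,
# which is why the free expansion is never seen to reverse.
```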
By contrast, there is a huge number of different arrangements in which the pieces are messy and compose no picture, or only a partial one. Imagine the completed puzzle in the closed box. After shaking and reopening the box a few times, we observe that the arrangement of the pieces gets messier and messier each time. The system tends to evolve towards the most probable configurations: those that can happen in the most different ways. The probability that, after a shake, the puzzle goes back to composing the complete picture is so low that it never occurs. Boltzmann identified entropy with the probability of finding a system in a certain state. He realized that the most probable states are those that can occur in more different ways, those that contain a higher number of allowed microstates. These are all the possible combinations in the arrangement of the system's components associated with a given macroscopic state. In other words, what can happen in many different ways has a higher probability than what can happen in only one way or in a few. You can lose the lottery with every combination but one: only one wins. Mathematically, Boltzmann related entropy to the number of microstates W associated with a macroscopic state:

S = k log W

It can be shown that this is equivalent to Clausius's definition. In this view, an increase of entropy involves an increase in the number of microstates accessible to the system, and therefore an increase in disorder. This law is carved on Boltzmann's tombstone in the Central Cemetery of Vienna. The new probabilistic approach adopted by Boltzmann was fundamental because it paved the way to a new physics. The founding fathers of quantum mechanics were later strongly inspired by this new insight: from Planck to Einstein, from Schrödinger to Bohr.

ENTROPY AS INFORMATION CONTENT

Starting from the 20th century, entropy was extended to different research areas.
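Boltzmann's counting argument can be sketched in a few lines (the function names and the toy value N = 20 are ours): for N particles split between two halves of a box, the macrostate "n particles on the left" has W = C(N, n) microstates, and W, hence S = k log W, is largest for the balanced split, which is why equilibrium is the most probable state.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant k, in J/K

# Boltzmann's formula: S = k * log(W), with W the number of
# microstates compatible with a given macroscopic state.
def boltzmann_entropy(w_microstates: int) -> float:
    return K_B * math.log(w_microstates)

# Toy model: N particles in a box split into two halves. The macrostate
# "n particles on the left" can be realized in W = C(N, n) ways.
N = 20
w = {n: math.comb(N, n) for n in range(N + 1)}

most_likely = max(w, key=w.get)
print(most_likely)   # 10: the even, balanced split wins
print(w[0], w[10])   # 1 microstate vs 184756 microstates

# "All particles on one side" (n = 0) has a single microstate and thus
# the minimum entropy; the balanced split maximizes W and S.
print(boltzmann_entropy(w[0]))  # 0.0
```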
In 1948, the American engineer and mathematician Claude Elwood Shannon introduced entropy into the framework of information theory, describing it as a measure of the information needed to specify the state of a system. The more a system is well ordered, the less information is needed to define it. The messier and more chaotic it is, the more information is needed to describe it. Shannon showed that a closed system cannot increase its content of information: it can only decrease or, at best, stay constant in an ideal process. So entropy tells us that a cassette, a compact disc, a vinyl record, a book: all these supports are going to lose their information over time, and will never recover their original amount. There is a funny way to sum up the game of entropy, using the three laws of thermodynamics: the first one says that you cannot win; the second one says that you cannot even draw; the third one says that you cannot even quit the game. We are forced to play a game we are going to lose!
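Shannon's entropy from the paragraph above can be sketched in a few lines (the function name is ours): H = −Σ p·log2(p), the average number of bits needed to describe the outcome of a source. A perfectly ordered source needs no information at all; a maximally disordered one needs the most.

```python
import math

# Shannon entropy: the average number of bits needed to describe the
# outcome of a source whose symbols occur with the given probabilities.
def shannon_entropy(probs: list[float]) -> float:
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A perfectly ordered source (one certain symbol) carries 0 bits of
# surprise; a fair coin, the most "disordered" binary source, carries 1.
print(shannon_entropy([1.0]))        # 0.0
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin sits in between: order lowers the entropy.
print(shannon_entropy([0.9, 0.1]))
```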