Saturday, November 06, 2010

Understanding Entropy

The term entropy is a mystery to many, and the concept it encapsulates is a source of confusion for still more, particularly in the ways it has been more broadly applied. The concept of entropy was introduced in the branch of physics that studies changes in temperature, pressure, and volume of physical systems at a macroscopic (as opposed to microscopic) scale: thermodynamics (combining the Greek words for heat, therme, and power, dynamis).


Simply (!) speaking, entropy, in thermodynamics, is a measure of the amount of heat in a physical system that is unavailable to do work. As a measure of this inaccessible heat, the “absolute” temperature is also relevant. If the system gives up heat and its entropy decreases by a particular amount, then the product of the temperature of the surrounding region and that change in entropy must be given up to the surroundings as unusable heat, increasing the entropy of the surroundings by at least as much as the system’s entropy fell; the entropy of the system and its surroundings, taken together, does not decrease.
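To put a rough number on that relationship, here is a minimal sketch (the quantities below are invented for illustration, not taken from any particular system): the heat rendered unusable is the surrounding temperature multiplied by the entropy change.

```python
# Toy illustration: heat rendered unusable when a system sheds entropy
# to surroundings held at temperature T_surroundings (illustrative values only).
def unavailable_heat(delta_S, T_surroundings):
    """Heat (J) given up as unusable when the system's entropy drops by
    delta_S (J/K) with surroundings at T_surroundings (K)."""
    return T_surroundings * delta_S

# Example: entropy falls by 2 J/K while the surroundings sit at 293 K.
print(unavailable_heat(2.0, 293.0))  # about 586 J must be dumped as unusable heat
```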


The description of entropy above seems abstract; to illuminate it, consider the nature of energy and work. In order for a quantity of energy to be available to do work, that energy must be organized or concentrated. Consider a tea kettle on a burner. The heat introduced into the system by the burning gas raises the temperature of the water in the kettle until it boils, converting liquid water to gas: steam. The steam raises the pressure within the kettle and is forced through the whistle in the spout (thereby doing two kinds of work); however, part of the heat is lost through the kettle itself, escaping into the kitchen. Further, when the gas feeding the burner is turned off, boiling stops, and if the kettle is left unattended, the hot water and kettle gradually cool by giving off heat to the surrounding room. The concentration of heat in the kettle at the moment the heat source is removed, taken together with the rest of the system (the kitchen), is a state of lower entropy than later, when the kettle and water have cooled and warmed the kitchen, however slightly. Getting the lost heat (and steam) back into the kettle is not possible without expending still more energy; after all, the initial heating of the tea kettle required introducing heat into the system.
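To attach rough numbers to the kettle (the figures below are assumptions for illustration, not measurements): as a liter of just-boiled water cools to room temperature, the water's entropy falls, but the kitchen's entropy rises by more, so the total entropy of kettle plus kitchen increases.

```python
import math

# Illustrative, assumed values: 1 kg of water cooling from the boil to room temperature.
m = 1.0          # kg of water in the kettle
c = 4186.0       # J/(kg*K), specific heat of liquid water
T_boil = 373.0   # K, water just off the boil
T_room = 293.0   # K, the kitchen

Q = m * c * (T_boil - T_room)                  # heat released into the kitchen
dS_water = m * c * math.log(T_room / T_boil)   # entropy change of the water (negative)
dS_room = Q / T_room                           # entropy gained by the kitchen (room stays ~293 K)

print(round(dS_water), round(dS_room), round(dS_water + dS_room))
# roughly -1011 J/K for the water, +1143 J/K for the kitchen, net +132 J/K
```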


Recognition that energy must be concentrated or organized in order to do work gives rise to an additional property of entropy: it is a measure of disorganization. But, is entropy a concept produced by reductionism?


More complete analyses of a thermodynamic system have superficial aspects of a reductionist approach. That is, envision a very large number of particles comprising a closed system, and “calculate” the motion of each individual particle as the particles interact with one another and with the boundaries of the system. Very quickly it becomes clear that calculating the motions (speeds and directions) of even a tiny fraction of the interacting particles is beyond the capacity of not only a laptop computer, but also the super-most of supercomputers. So much for reductionism? Not so fast. A more complete theoretical analysis indicates that much of the individual motion in effect cancels out. True, the motion of each and every particle is beyond computational capability, but the collective properties of the ensemble of particles can be characterized: the thermodynamic properties of the system (if it is in equilibrium) are determinable: temperature, pressure, volume. Is thermodynamics a reductionist success?
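A toy sketch of that point (an assumed, idealized ensemble with unit mass and Boltzmann's constant set to 1, not a real molecular-dynamics code): no single particle's trajectory is predicted, yet the average over the ensemble yields a well-defined temperature.

```python
import random

# Toy ensemble: 2D ideal-gas-like particles with unit mass and k_B = 1 (assumed).
# Individual velocities are random; the collective average is stable and meaningful.
N = 100_000
velocities = [(random.gauss(0, 1.0), random.gauss(0, 1.0)) for _ in range(N)]

mean_kinetic_energy = sum(0.5 * (vx**2 + vy**2) for vx, vy in velocities) / N
temperature = mean_kinetic_energy  # in 2D with k_B = 1, <KE> = k_B * T

print(round(temperature, 3))  # close to 1.0, whatever any single particle happens to do
```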


Whether or not entropy is a consequence of reductionist science, it is, nevertheless, a scientific success. And, decades after Boltzmann and Gibbs, entropy emerged in a new guise, with implications beyond thermodynamics.


Information Entropy


Software companies with a large, diverse portfolio are challenged to meet customer expectations while efficiently utilizing the talents of designers, developers, and testers. Add in multiple development locations, outsourcing, multilevel integration, and competitors, and software producers find themselves simultaneously juggling the double-edged swords of complexity and uncertainty.


Complexity and uncertainty conjure two active areas of investigation of the past few decades, invoking the mysterious concept of entropy and the fascinating phenomenon of fractals. Fractals are recognized as emergent phenomena in the midst of complexity theory: organized chaos. Entropy is primarily identified with increasing disorder in a closed system (the Second Law of Thermodynamics) and is invoked analogously in the context of information theory. Inferences by Edwin Jaynes, beginning in 1957 (in the 1983 reference, below), of the formal equivalence of Gibbs thermodynamic (statistical mechanical) entropy and Shannon information entropy continue to be debated in diverse scientific and engineering fields.
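For concreteness, Shannon's information entropy of a discrete probability distribution is H = -Σ p·log2(p), measured in bits; a minimal sketch (my illustration, not code from any of the works cited):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is maximally uncertain
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin carries less surprise
print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equally likely outcomes
```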


Can insight into the complexity and uncertainty inherent in software development be gained through recognition and application of entropy and fractals? More fundamentally, is there a relationship between entropy and fractals?
