Was it in high school chemistry or physics (the most appropriate venue)? Perhaps, metaphorically, it was in college English or political science, or even an op-ed in last year’s Times (NY, LA, or London). What, you say you never heard of entropy before? It’s time to climb onto the bandwagon – there’s a new paradigm in town, recasting a nearly two-hundred-year-old idea. Some of the “newer” ideas about entropy are more than half a century old, introduced shortly before “paradigms” became paradigmatic, while some of the newest arrived just before the turn of the millennium.
Chances are that if and when you were introduced to entropy, that initial encounter, if you thought about it at all, ruined your day. (In high school freshman general science, after the Second Law was presented, a quiz was in store: A refrigerator in a closed, insulated room is running, and its door is left open. What happens to the temperature of the room? I knew the correct answer, but I couldn’t remember why. Scientist that I was to become, I hated memorization without understanding, so I answered: the temperature remains constant – the heat escaping from the back of the unit would be cancelled out by the cooling within. WRONG! The temperature, of course, increases – doesn’t everyone know that? For the “why,” check below.)
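A minimal first-law sketch of that “why,” pending the fuller discussion: the room is insulated, so no heat crosses its walls, but the electrical work driving the refrigerator does enter and is ultimately dissipated inside; with the door open, the heat the appliance pumps out of its interior simply circulates back into the same room. Treating the room and everything in it as the system,

\[
\Delta U_{\text{room}} = Q + W_{\text{elec}} = 0 + W_{\text{elec}} > 0 ,
\]

so the room’s internal energy, and with it the temperature, can only increase.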
The Second Law is, superficially at least, a philosophical downer – not because of a missed quiz question, but because of its more profound implications. If it was a science class for you, the instructor may have illustrated entropy as “a measure of the eventual decay of the Universe” or, more recently, as an indicator of “black hole death”. Perhaps it is appropriate that one of the critical formulations of entropy is carved into a Viennese tombstone. Teachers love to shock their students in an effort to create interest in understanding the ideas being advanced, but there is nothing like entropy to drive a “bright” young intellect away from physics or chemistry.
Entropy hasn’t driven everyone away, though; there is a continuing fascination with it. On July 9, 2007, a Google™ search yielded “about 14,100,000” (14.1 × 10⁶) results for entropy. Yahoo!™ gave 5.460 × 10⁶ links. Ask.com produced 2,215,000 results.
The unofficial, if not always authoritative, encyclopedia of the Internet, Wikipedia (http://en.wikipedia.org), has an extensive and growing article entitled Entropy (or Thermodynamic entropy). And there are more such web pages.
Let entropy be represented by either the letter H or S. Within Wikipedia there are additional entropic topics: Information H, Introduction to S, S (classical thermodynamics), S (statistical thermodynamics), Boltzmann S, Gibbs S, von Neumann S, Shannon H, plus a variety of more esoteric H or S entries.
Thermodynamic entropy, according to Wikipedia, has been described variously (not necessarily in a satisfactory manner) as a measure of:
• Thermodynamic homogeneity of a system…under the additional constraint of thermodynamic equilibrium.
• The progress of a smoothing-out process
• Disorder in state
• Dispersal of energy
• Useless energy
Entropy, in more general terms, has also been described as measuring:
• The number of microstates consistent with an observed macrostate (see the expressions quoted just after this list)
• The amount of uncertainty or “mixed-upness”
• The degree to which the probability of a system is spread out over different possible quantum states
• Our ignorance about a system
• A fundamental physical property of an element (“standard molar entropy”)
• The mixing of substances
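For orientation, the standard expressions behind several of these descriptions can be quoted now; they are developed properly in the mathematical sections that follow. In order: Boltzmann’s microstate-counting entropy (essentially the formulation carved on the Viennese tombstone mentioned above, there written S = k log W), the Gibbs form used in statistical mechanics, and Shannon’s information entropy:

\[
S = k_B \ln \Omega, \qquad S = -k_B \sum_i p_i \ln p_i, \qquad H = -\sum_i p_i \log_2 p_i ,
\]

where \(\Omega\) is the number of microstates consistent with the observed macrostate, \(p_i\) is the probability of the \(i\)-th microstate, and, in Shannon’s H, \(p_i\) is the probability of the \(i\)-th symbol or message.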
Entropy is the key component of the Second Law of Thermodynamics: the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value. (The First Law states that energy in an isolated system is conserved.)
Thus Wikipedia can state the seemingly depressing consequence: the entropy of the universe, assumed to be an isolated system because it includes everything, tends to increase. Homogeneity, smoothing-out, disorder, and the dispersal of energy to uselessness are therefore the ultimate fate of our universe (if it is, indeed, closed). Of course, the expected lifetime of the universe, on the order of tens of billions of years, vastly exceeds the much shorter duration of our own individual mortal lives, not to mention how recently Homo sapiens emerged from the jungle or savanna; so a depressed sense concerning the ultimate fate of all that is, is, frankly, only an emotional response, after all… isn’t it?
So is there anything about entropy that is worth knowing (cf. C. P. Snow) and that provides any kind of good motivational news, especially for the most sensitive among us?
Depending on the source (and even Wikipedia contributors appear to disagree), thermodynamic entropy represents a special case of statistical entropy and, more generally, of informational entropy. The development and application of these three varieties of entropy provide intriguing insight not only into the nature of physical laws, but also into the diversity of ways in which humanity interacts with the world and, individually, with one another. Understanding entropy, in each of its dimensions, means acquiring a deeper sense of:
• The nature as well as the content of scientific theory
• Symbology
• The actual way we “do” science
• Artistic creativity
• History
• Psychology
• Social communication
• Economic interaction
• Perhaps even the meaning of humanity – that is, ourselves – and, indeed, the rest of creation.
This volume is an extended argument. The next few sections comprise the first part of that argument, expressed in some very necessary mathematics. Some fundamental hypotheses of the early Nineteenth Century became the foundation of modern thermodynamics. Extensions of these hypotheses into the Twentieth Century, especially in combination with the developing molecular theory, became modern statistical mechanics. Further elaboration found application in quantum mechanics. Then, in mid-Century, seemingly out of left field, came an analogous set of concepts, information theory, with applications across a wide spectrum of scientific and engineering enterprises. Over the past few decades, other advances in the analysis of what has come to be called “chaos theory” have developed touch-points with statistical mechanics and information theory, with possibly surprising implications. To express much of this evolution requires the language of mathematics, especially algebra and calculus.
Is it possible to understand concepts such as entropy without mathematics? Perhaps a better question is whether entropy, in all of its pertinent perturbations and relationships, can be articulated without algebra. Algebraic expressions are condensations of mathematical concepts. Mathematics can be communicated in conventional English (or any other modern language), but it’s not easy to comprehend sequences of elaborate relational expressions spread across sentences and paragraphs. On the other hand, what can be demonstrated mathematically – in a sequence of theorems, for example – might still be described analogically, without repetition of proof, for the non-mathematically inclined. In the next few sections, then, the mathematical elaborations are accompanied by textual descriptions of their significance.