Friday, October 30, 2020

Entropy as Irony

When I encounter popular scientific articles that incorporate statements such as this...
The Second Law defines the ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order. An underappreciation of the inherent tendency toward disorder, and a failure to appreciate the precious niches of order we carve out, are a major source of human folly.
... I see irony, not entropy. Here's why: to "carve out refuges of beneficial order" introduces, by the very process of "carving," a potential increase in entropy in the region outside the refuges, whether that region is physical-chemical, informational, or aesthetic. Entropy, from the very beginning of its recognition and definition, is part of a mathematical description of an environment -- a space, a domain -- in which energy is dispersed evenly throughout; when the energy is evenly dispersed, a measured quantity such as temperature is the same everywhere in that region.
If I enter into a virtual discourse, however cultivated and enlightened (such as this post), the visual signals I receive and read from my flat screen take some degree of concentration to recognize as individual letters, words, and concepts; every intervening point and moment requires energy, part of which is converted to heat -- energy which is no longer available to do the work of computing, displaying, and reading.
The presence of that heat is readily recognized by the sound of the fan (in the desktop or laptop) or warm spots on the underside of the laptop. Ah! one might exclaim, couldn't that heat inside the house help warm it when it's cold outside? Yes, delaying, however minutely, the prompting of the thermostat to turn on the furnace, but some of that heat will still escape from the inside to the outside of the building.
Another place to recognize heat produced as a byproduct is the heated air blowing from the back of the refrigerator into the larger kitchen. (Where does that heat come from? It is "removed" from the steaks, chicken breasts, cheese, milk ... whatever is inside the appliance.) More fundamentally, heat energy is the vibration of molecules; the temperature increases as the vibrations become faster and more powerful.

Some physicists and physical chemists define entropy as an increase in disorder. More than any other concept, the idea of disorder seems to have propagated into non-scientific discourse, based, for example, on an imagined dormitory room: upon move-in, there is some manifestation of order -- books on the shelves, clothes in the drawers (arranged by the freshman's mother), shirts, pants, and jeans hanging in the closet, towels on the racks on the back of the closet door... Three months, two months... whatever; that arrangement is gone. Therefore, entropy has increased.

The problem with the dorm room analogy is the presumption of primary order. The freshman's mother defines order according to her standards; perhaps the student son even agrees that things are orderly as she defines it. To maintain that order, however, requires effort -- energy -- perhaps even more than just allowing things to fall where they may. Conversely, the loss of that original arrangement might require additional effort to locate desired objects (where did I put that polo?) in the future. In either case, more energy is required, both physical (folding and organizing, or plowing through the pile) and mental; and part of that energy is lost to the environment. The presumption that spending time organizing and re-organizing (after each laundry load emerges from the clothes dryer) involves less effort than plowing through a disorganized pile each morning may be correct -- over time, perhaps less net energy is required if an arrangement is maintained -- or it may not (organizing may require a lot of time and thought: arrange by object kind, size, color? Hang matching shirt and pants together, or all shirts separately from all pants?). Of course, wearing the same outfit day after day should result in less net energy loss than all that effort to organize after each dryer load (especially if each dryer load occurs less often). So much for the dorm room...

To characterize entropy in terms of (relative) disorder requires an accepted standard for order: the freshman's mother's standards, foundational scientific laws, logical premises...  




Friday, August 02, 2019

Billy Jo/Joe

The episode of Burke's Law was "Who Killed Billy Jo?" The actor who played the unfortunate Billy Jo was Kelly Gordon; according to IMDb, Gordon has three acting credits plus one uncredited role. His last apparent role was as the murder victim.
So what else did this man do after BL? He was young, only 30 or 31 then.
A search for him leads to a Kelly Gordon in music, both composer and producer. His best-known piece is "That's Life," co-written with Dean Kay and readily identified with Frank Sinatra's 1966 version.
As a music producer, he is credited with Bobbie Gentry's Ode to Billy Joe. Coincidence? Two "Kelly Gordons"; two "Billy Jo(e)s"? No -- one Kelly Gordon: the same dates of birth and death are indicated on the IMDb and Revolvy websites. And he sang his own song as Billy Jo in Burke's Law.
Gordon died young, so the question about dual Billy Jo/es could only be answered in this life by Bobbie Gentry.

Saturday, March 23, 2019

Nights (and days) around the Round Table

Roaming Rita's latest post is delightful, and not just because it is about our family. When family gets together to break bread, whether at home or away, it's wonderful when everyone has a good time. Whether we like one another or not, a party in which the hosts do all they can to make everyone feel welcome is a foreshadowing of a promised party at the end of time. It's a party to which all are invited.
BBQ, birthday party, Christmas Eve, Easter Sunday, St. Patrick's Day, anniversary, Thanksgiving, lunch, dinner, Saturday morning pancakes... "Life is a banquet, and most poor suckers are starving to death!" And shared meals, from breakfast to banquets, are fractals.


    Mystification

    The Bulwark mystifies me. Even after communicating with one of its principals, who emphasizes in return, "The Bulwark is not the Weekly Standard," the presence of a significant number of clearly non-conservative writers is a puzzle. Does "never-Trump" mean "anyone but Trump"?
    The initial logo for the site includes the subtitle "Conservatism Conserved."
    As of today, or earlier this week I think, the subtitle has changed:
    "Slightly Dangerous" -- that clarifies things.

    Thursday, September 27, 2018

    In defense of Jonah...

    I have been an admirer of Jonah Goldberg for two decades. I read his columns and have all of his books.
    His most recent book, Suicide of the West, has been getting hammered by reviewers from the orthodox Catholic side -- I think unfairly.
    Such reviewers jump on the, yes, surprising and discomfiting assertion Goldberg offers up front: "There is no God in this book." Maybe because of my longtime familiarity with his writing, I knew immediately that this would prove to be false. And I think that even his post-publication explanation, that he is trying to reach the potentially persuadable non-believers, is tongue-in-cheek, at least in part.
    Of course God is in the book; he can be found on every page. And, like his OT namesake, this current prophet knows it; he's just playing, you see. The engine of modernism, free enterprise, is driven by mutually recognized dignity - of the entrepreneur, employee, and customer, and, oh, the competitor, too.
    Tag! reviewers - you're it. He gotcha. 
    And, if any self-described liberals/progressives do actually pick-up the book because of the "bad" reviews from the red side, maybe they, too, will be ensnared.

    Monday, January 13, 2014

    In the appendix of the paper I reference below is a summary derivation of the relationship between entropy and fractals (power laws). The appendix is also posted in another blog, Plate Frames. The introduction to the post reads...
    In a paper published a couple of years ago (Pilger, 2012), I describe the application of a simple principle, transformed into a distinctive abstract object, to an optimization problem (within the plate tectonics paradigm): simultaneous reconstruction of lithospheric plates for a range of ages from marine geophysical data. It is rare that the relation of the principle, maximum entropy, with a particular transformation, power-series fractals, is recognized, since Pastor-Satorras and Wagensberg derived it. I'm unaware of any other application of fractal forms to optimization problems analogous to the paper. The following derivation is taken from the 2012 paper, with slight modification, in hopes that it might prove useful in other fields, not merely the earth sciences. I'm investigating applications in a variety of other areas, from plate tectonics, to petroleum geology, and, oddly enough, the arts.
    Pilger, R. H., Jr. (2012) Fractal Plate Reconstructions with Spreading Asymmetry, Marine Geophysical Research, Volume 33, 149-168. dx.doi.org/10.1007/s11001-012-9152-6. (rexpilger (at) gmail (dot) com.)

    Wednesday, December 22, 2010

    Fractals and Plate Tectonics

    Can fractal criteria be used in deriving plate reconstructions of asymmetrically spreading ridges? See: link.


    Wednesday, November 17, 2010

    Peer Review

    An article in Physics World describes an "experiment" in peer review and its effect on the quality of published scientific research.
    Just a small number of bad referees can significantly undermine the ability of the peer-review system to select the best scientific papers. That is according to a pair of complex systems researchers in Austria who have modelled an academic publishing system and showed that human foibles can have a dramatic effect on the quality of published science.

    Monday, November 08, 2010

    Jane and Will

    Fractal calculations of Jane Austen's six novels imply, unsurprisingly, a common vocabulary pool. But what about comparisons with other English literature? Hamlet:

    And adding to the Jane Austen plot:

    
    Hamlet and company occupied a smaller "area" in their dramatic fractal space, but note that the slope of the main part of the fractal plot is essentially the same as Jane's novels.

    What does it all mean?
    
    

    Jane Austen

    Word frequency usage often shows a logarithmic pattern (e.g., Zipf's distribution). What about fractals?
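    A Zipf-style rank-frequency tabulation can be sketched in a few lines. This is only an illustration with a single famous sentence (the opening of Pride and Prejudice), not the word counts behind the plots below:

```python
# A minimal sketch of a Zipf-style rank-frequency tabulation.
# The sample text is illustrative, not the full corpus used for the plots.
from collections import Counter
import math

text = (
    "it is a truth universally acknowledged that a single man in "
    "possession of a good fortune must be in want of a wife"
)
counts = Counter(text.split())
ranked = counts.most_common()  # [(word, frequency), ...] by descending frequency

# Under Zipf's law, log(frequency) falls roughly linearly with log(rank).
for rank, (word, freq) in enumerate(ranked[:5], start=1):
    print(rank, word, freq, round(math.log(freq), 3))
```

    Plotting log(frequency) against log(rank) for a whole novel is what produces the roughly linear trends shown in the figures.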














    I suppose one might assume that Jane Austen would draw on the same vocabulary in each novel. How similar are these relationships among the six?

     
    The similarity in slopes of the six curves suggests that common vocabulary pool.

    Citation Fractals

    Following a suggestion by Murray (2002), I've looked at the indexes of a number of recent scientific monographs and popular scientific accounts and calculated their fractal measures.


    Murray combined a large number of text references and normalized them. Applying fractal binning to his results produces:

    E. T. Jaynes (2002) Probability Theory:




    W. Isaacson (2008) Einstein: His Life and Universe:




    L. Gilder (2009) The Age of Entanglement:



    S. Pinker (2003) The Blank Slate:




    W. Grandy (2008) Entropy and the Time Evolution of Macroscopic Systems:

    Sunday, November 07, 2010

    Looking for fractals in all the wrong or right places - I

    Here is the first of several attempts at documenting fractal structures from science to art.


    First, however, a little bit about technique: the magnitudes of a particular data set, whatever its source, are ranked from greatest to least. Then, fractal binning is applied over a range of scales. The largest bin size is D(max) = 2^n, where n is the smallest integer such that 2^n exceeds the maximum magnitude of the data set. Successive bin sizes halve: D(i) = D(max)/2^i, for i = 1 to M. At each scale, bin j is occupied if one or more values satisfy j = integer(magnitude/D(i)). Then, for each scale, the number of occupied bins is totaled.
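    The procedure can be sketched in code. This is my reading of the technique described above, assuming the bin sizes halve from a power-of-two maximum; the data set here is a hypothetical power-law-like ranking, not the Broadway counts:

```python
# A sketch of the fractal-binning procedure, under the assumption that
# successive bin sizes halve: D(i) = D_max / 2**i.
def fractal_binning(values, levels=6):
    """Count occupied bins at a series of halving scales.

    Returns a list of (bin_size, occupied_bin_count) pairs; a log-log
    plot of these pairs is linear for a true fractal distribution.
    """
    top = max(values)
    n = 0
    while 2 ** n <= top:  # find the smallest power of two exceeding the maximum
        n += 1
    d_max = 2 ** n
    results = []
    for i in range(1, levels + 1):
        d = d_max / 2 ** i
        occupied = {int(v // d) for v in values}  # bin index for each value
        results.append((d, len(occupied)))
    return results

# Example: a hypothetical ranked data set following a 1/rank power law.
data = [1000 / r for r in range(1, 101)]
for size, count in fractal_binning(data):
    print(size, count)
```

    Because each coarse bin splits into exactly two finer bins, the occupied-bin count can never decrease as the scale shrinks; whether it grows log-linearly is what the plots test.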
    First example: the number of performances of Broadway musicals, ranked from most to least, for the top 100 (not counting some that are still running).
    The longest-running musicals (Andrew Lloyd Webber's Cats is the current leader among closed shows; his Phantom of the Opera is still running):

    Note that a true fractal would display a linear trend on a log-log plot. Here, however, a simple linear trend is apparent for only the top two bins, not one extending over a longer range of scales.

    Entropy

    Where were you when you first heard about entropy?


    Was it in high school chemistry or physics (the most appropriate venue)? Perhaps, metaphorically, it was in college English or political science, or even an op-ed in last year’s Times (NY, LA, or London). What, you say you never heard of entropy before? It’s time to climb onto the bandwagon – there’s a new paradigm in town, recasting a nearly two-hundred-year-old idea. Some of the “newer” ideas about entropy are more than half a century old, introduced shortly before “paradigms” became paradigmatic, while some of the newest arrived just before the turn of the millennium.

    Tectonic Similarity

    Spin a globe, tilt it, and center the South Atlantic Ocean, with the coast of South America to the west (and left), that of Africa on the east (and right): see how the South American coastline seems somehow similar to that of Africa. Tilt north and rotate slightly west, to the center of the North Atlantic Ocean: the North American coastline to the west is, with slight imagination, similarly similar to the North African Atlantic coast. One more time: east and south, to the center of the Indian Ocean, with the facing coasts of East Africa, India, Antarctica, and Australia; rotate back and forth a bit and imagine the coastlines as edges of spherical puzzle pieces, with Madagascar a gap-filling fragment. Might all of these continents, and Eurasia too, have once formed a collective megacontinent? This question has been around in one form or another for not merely decades but two or even three centuries. But it wasn’t until the mid-1960s that the solid earth scientific community reached near consensus. The answer: a resounding “Yes”.
    However, to reach the point at which geologists (stratigraphers, petrologists, volcanologists…), geophysicists (seismologists, paleomagnetists…), and paleontologists (specializing in fossils of plants and both vertebrate and invertebrate animals) could all agree, multidisciplinary results from each field had to be shown as mutually consistent and integrated. 

    Misquotation: How difficult is it?

    • “Play it again, Sam.” -- Ingrid Bergman
    • “…blood, sweat and tears.” – Winston Churchill
    • “History is bunk.” – Henry Ford
    • “My name is Ishmael.” – Herman Melville
    • “Math is hard.” -- Barbie
    These famous quotes have something profoundly in common. What is it?
    There could be a footnote, upside down, at the bottom of this page, or an end note, somewhere deeper into the blog, with the answer. But, you, dear reader, know the answer, don’t you? Each quote is similar to each of the others. Isn’t it? Aren’t they all similar?

    Woody Allen even made a movie with that title, a classic 70’s rock band had that name, who believes history anyway(?), and Ahab was an Arab, wasn’t he? And, mathematics can be difficult,

    Mathematical notation can be obscure, Sam really didn’t want to play it, the Prime Minister was playing with a short deck, Ford manufactured the “T” before the “A”, and Melville’s story is a whale of a tale.

    Psst…. Don’t tell anyone; don’t include “warning: spoilers” in your review. But you do know, don’t you, that none of the quotes above are original with their auteur? That’s because none of the authors ever said or wrote any of them.

    Here are the “real” quotes:
    • “Play it Sam. Play ‘As time goes by’” – (Ilsa) Ingrid Bergman in Casablanca
    • “…blood, toil, sweat and tears.” – Winston Churchill (and before him Garibaldi and T. Roosevelt)
    • “History as it is taught in the schools is bunk.” – Henry Ford (In fairness, there appears to be some disagreement about what the innovative industrialist really said.)
    • “Call me Ishmael.” – Herman Melville
    • “Math class is tough.” -- Barbie
    It is ironic, is it not, that the most quoted line from the most quotable movie of all time is commonly misquoted (even before Woody Allen’s movie). A search for "Play it again Sam" produces 230,000 hits, while a search for “Play it Sam. Play ‘As time goes by’” produces only 69,000. Further, even Churchill’s well-known line may have been appropriated from (or at best independently enunciated after) Theodore Roosevelt. Henry Ford came after Karl Marx, so the familiar assertion by the great capitalist could be viewed as a denunciation of the patron saint of communism and his “theory” of history. There is a slight, even significant, discrepancy between “My name is…” and “Call me…”, is there not?

    In any case, whether for miniature mannequins or fully grown adults, math can be really hard. Even Einstein, physics genius, needed help with his math at times.

    Oops, I forgot one:
    • "Judy, Judy, Judy..." Cary Grant
    Can anyone find anything close to "Judy..." by C. Grant anywhere in his oeuvre? Not even Tony Curtis in either Some Like It Hot or Operation Petticoat came close to it.

    Saturday, November 06, 2010

    Understanding Entropy

    The term, entropy, is a mystery to many, and the concept it encapsulates is a source of confusion for still more, particularly in ways in which it has been more broadly applied. The concept of entropy was introduced into that branch of physics that studies changes in temperature, pressure and volume of physical systems at a macroscopic (as opposed to microscopic) scale: thermodynamics (combining the Greek words for heat, thermos, and power, dynamis).


    Simply (!) speaking, entropy, in thermodynamics, is a measure of the amount of heat in a physical system that is unavailable to do work. As a measure of this inaccessible heat, the “absolute” temperature is also relevant. If the system gives up a particular amount of heat, and its entropy decreases by a particular amount, then the product of the temperature of the region surrounding the system and the change in entropy must be given up to the system’s surroundings as unusable heat, thereby increasing the entropy of the surroundings by at least as much as the system’s entropy decreased.


    The description of entropy above seems abstract; to illuminate it, consider the nature of energy and work. In order for a quantity of energy to be available to do work, that energy must be organized or concentrated. Consider a tea kettle on a burner. The heat introduced into the system by burning gas raises the temperature of the water in the kettle until it boils, converting liquid water to gas: steam. The steam increases the pressure within the kettle and forces it through the whistle in the spout of the kettle (thereby doing two kinds of work); however, part of the heat is lost from the kettle, escaping into the kitchen. Further, when the gas feeding the burner is turned off, boiling stops, and if the kettle is left unattended, the hot water and kettle will gradually cool by giving off heat to the surrounding room. The concentration of heat in the kettle when the heat source is removed, combined with the rest of the system (the kitchen), is a state of lower entropy than later, when the kettle and water have cooled, warming the kitchen, however slightly. Getting the lost heat (and steam) back into the kettle is not possible without expending still more energy. After all, the initial heating of the tea kettle involved introduction of heat into the system.
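    The kettle's bookkeeping can be made concrete with a few assumed numbers (the quantity of heat here is an arbitrary illustration, not a measurement):

```python
# Entropy bookkeeping for the cooling kettle -- a minimal sketch.
# The heat quantity is an assumed, illustrative number.
Q = 50_000.0       # heat given off by the kettle, in joules (assumed)
T_kettle = 373.15  # boiling water, in kelvin
T_room = 293.15    # the kitchen, in kelvin (about 20 C)

dS_kettle = -Q / T_kettle  # the kettle's entropy falls as it sheds heat
dS_room = Q / T_room       # the kitchen's entropy rises as it absorbs that heat
dS_total = dS_kettle + dS_room

# Because T_room < T_kettle, the total change is positive: the Second Law.
print(round(dS_total, 1), "J/K")
```

    The same heat counts for more entropy at the lower temperature, which is why the total never decreases when heat flows downhill.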


    Recognition that energy must be concentrated or organized in order to do work gives rise to an additional property of entropy: it is a measure of disorganization. But, is entropy a concept produced by reductionism?


    More complete analyses of a thermodynamic system have superficial aspects of a reductionist approach. That is, envision a very large number of particles comprising a closed system, and “calculate” the motion of each individual particle as the particles interact with one another and the boundaries of the system. Very quickly it’s clear that calculating even just a few particles’ motions (speed and direction) as they interact with one another is beyond the capacity of not only a laptop computer, but also the super-most of supercomputers. So much for reductionism? Not so fast. A more complete theoretical analysis indicates that many of the motions in effect cancel each other out. True, the motion of each and every individual particle is beyond computational capability, but the collective properties of the ensemble of particles can be characterized: the thermodynamic properties of the system (if it is in equilibrium) are determinable: temperature, pressure, volume. Is thermodynamics a reductionist success?


    Whether or not entropy is a consequence of reductionist science, it is, nevertheless, a scientific success. And, decades after Boltzmann and Gibbs, entropy emerged in a new guise, with implications beyond those of thermodynamics.


    Information Entropy


    Software companies with a large, diverse portfolio are challenged to meet customer expectations while efficiently utilizing the talents of designers, developers, and testers. Add in multiple development locations, outsourcing, multilevel integration, and competitors: software producers are faced with simultaneously juggling the double-edged swords of complexity and uncertainty.


    Complexity and uncertainty conjure two active areas of investigation over the past few decades, especially invoking the mysterious concept of entropy and the fascinating phenomena of fractals. Fractals are recognized as emergent phenomena in the midst of complexity theory – organized chaos. Entropy is primarily identified with increasing disorder in a closed system (the Second Law of Thermodynamics) and is homologously invoked in the context of information theory. Inferences by Edwin Jaynes, beginning in 1957 (in 1983 reference, below), of the formal equivalence of Gibbs thermodynamic (statistical mechanical) entropy and Shannon information entropy continue to be debated in diverse scientific and engineering fields.
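    Shannon's expression, the information-theoretic twin of the Gibbs formula, is simple enough to sketch directly (the two probability distributions below are illustrative assumptions):

```python
# Shannon information entropy: H = -sum(p * log2(p)) over a discrete
# probability distribution -- a minimal sketch with made-up distributions.
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution (zero-probability terms skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4           # maximum uncertainty over four outcomes
skewed = [0.7, 0.1, 0.1, 0.1]  # mostly predictable

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(skewed))   # fewer bits: less uncertainty
```

    The uniform distribution maximizes the entropy, just as uniformly dispersed energy maximizes thermodynamic entropy; that parallel is the heart of Jaynes's argument.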


    Can the complexity and uncertainty inherent in software development be illuminated through recognition and application of entropy and fractals? More fundamentally, is there a relationship between entropy and fractals?

    Sonnet Similarity

    Human communication depends upon the comprehension of meaning. And, the ability to understand is a combination of the senses, experience, and memory. So much of reasoning relies on analogy. “What is she like?” Contrast two of Shakespeare’s sonnets:

    18

    Shall I compare thee to a summer's day?

    Thou art more lovely and more temperate:

    Rough winds do shake the darling buds of May,

    And summer's lease hath all too short a date:

    Sometime too hot the eye of heaven shines,

    And often is his gold complexion dimmed,

    And every fair from fair sometime declines,

    By chance, or nature's changing course untrimmed:

    But thy eternal summer shall not fade,

    Nor lose possession of that fair thou ow'st,

    Nor shall death brag thou wander'st in his shade,

    When in eternal lines to time thou grow'st,

    So long as men can breathe, or eyes can see,

    So long lives this, and this gives life to thee.

    130

    My mistress' eyes are nothing like the sun;

    Coral is far more red, than her lips red:

    If snow be white, why then her breasts are dun;

    If hairs be wires, black wires grow on her head.

    I have seen roses damasked, red and white,

    But no such roses see I in her cheeks;

    And in some perfumes is there more delight

    Than in the breath that from my mistress reeks.

    I love to hear her speak, yet well I know

    That music hath a far more pleasing sound:

    I grant I never saw a goddess go,

    My mistress, when she walks, treads on the ground:

    And yet by heaven, I think my love as rare,

    As any she belied with false compare.

    “What is she like?” “She’s a tower of ivory; a day at the beach; warm as a handshake; yet cold as a sunflower.” The answers to the question, “What is she like?” have meaning only to the extent that there are connotations in the analogies that can be applied to the character of a woman. Is there any similarity at all (Sonnet 18), or anti-similarity (Sonnet 130)? To a contemporary westerner, “tower of ivory” has little meaning, unless the Song of Songs is familiar. “A day at the beach” might mean something different to a Southern Californian and an Eskimo. And, how could a sunflower be “cold”?


    Successful communication occurs when the sender and receiver speak the same language – in which the message, even if novel, can be understood. There must be a pre-existent capacity for the receiver to comprehend what was sent. And, this capacity must also include the ability to learn – to receive progressively more complicated and elaborate messages. In other words, for communication to occur there must be some similarity between the vocabulary and experience of the speaker and those of the hearer.

    So, what is the meaning of similarity, when we find it in Nature? We certainly understand that organisms inherit their form from their parents. And, an ecological niche can be occupied by organisms of different lineages, which, nevertheless, develop analogous structures – for example, dorsal fins on sharks (fish) and dolphins (mammals), wings on insects and birds, and “wings” on bats and flying squirrels.


    Such similarities can also be found in the inorganic realm. Crystals are a manifestation of the molecular structure of particular solid elements and compounds. Such similarities in crystals of different sizes generally involve distinct symmetries: cubic, orthorhombic, tetragonal, hexagonal… Table salt crystals (sodium chloride; in the solid state, the mineral halite) manifest cubic symmetry at virtually all scales; break a halite crystal and it fragments into smaller crystals bounded by surfaces that terminate at right angles with adjacent surfaces.


    In recent years, another kind of self-similarity has been identified, of which crystals are a subset: fractals. In Mandelbrot’s formulation, fractal objects appear to have the same structure over a wide range of scales. In addition to a number of mathematical algorithms that can produce fractals, a number of naturally occurring examples can be enumerated: clouds, earthquake occurrences, shorelines, gas-water contacts in natural gas reservoirs. The latter examples differ from crystals in that they do not possess inherent geometric symmetries. Rather, while structures or phenomena at various scales show similar variations in some quantity, they are not necessarily congruent when rescaled.


    Certain kinds of fractals have a scaling dimension that quantifies the variation similarity. For example, consider the set of earthquakes that occurs in a particular earthquake-prone region of the earth, such as Southern California, over a prolonged period of time. If the numbers of earthquakes are grouped according to their magnitudes, a simple relationship is observed. Earthquakes of small magnitudes are much more frequent than those of large magnitudes (magnitude is related to the amount of energy released by the earthquake). In fact, a plot of the logarithm of the number of earthquakes in a range of magnitudes versus the magnitude range produces a straight line (fig. 1). The similarity comes in with the observation that the relationship is log-linear; that is, the change in the number of earthquakes with magnitude is the same at low magnitudes and at high magnitudes. The slope of the line in the logarithmic plot is a measure of the fractal dimension of the earthquake magnitude distribution.


    Plot of magnitude versus logarithm of frequency of earthquakes (binned in intervals of 0.25 magnitude units), southeastern California, 1980-2005 (data courtesy of Southern California Earthquake Center, University of Southern California).
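    The log-linear relationship can be sketched with synthetic counts (these numbers are invented to follow the pattern; they are not the Southern California catalog):

```python
# A sketch of the log-linear magnitude-frequency relation described above,
# using synthetic counts rather than the Southern California catalog.
import math

# Hypothetical binned counts: each half-magnitude step divides the count
# by roughly sqrt(10), i.e. tenfold fewer earthquakes per whole magnitude.
magnitudes = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5]
counts = [10000, 3160, 1000, 316, 100, 32]

x = magnitudes
y = [math.log10(c) for c in counts]

# Ordinary least-squares slope of log10(count) vs magnitude -- the measure
# tied to the fractal dimension of the magnitude distribution.
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
    (xi - mx) ** 2 for xi in x
)
print(round(slope, 2))  # about -1: tenfold fewer quakes per unit magnitude
```

    With real catalog data, the fitted slope plays the role the figure's trend line plays here.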


    Another natural example of fractality is the dimensionality of measured coastline lengths. If the coastline is measured coarsely, the total length is less than if measured finely. A plot of the logarithm of measured coastline length versus the relative fineness of measurement produces a straight line, whose slope is the fractal dimension.
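    The coastline effect can be sketched with a synthetic "coastline," the Koch curve, whose dimension is known exactly (log 4 / log 3, about 1.26); this stands in for real coastline data, which is not reproduced here:

```python
# Measuring a synthetic coastline (the Koch curve) at coarser and finer
# resolutions: the finer the measurement, the longer the coastline.
import math

def koch(p, q, depth):
    """Recursively generate the vertices of a Koch curve from p to q."""
    if depth == 0:
        return [p]
    (x0, y0), (x1, y1) = p, q
    dx, dy = (x1 - x0) / 3, (y1 - y0) / 3
    a = (x0 + dx, y0 + dy)
    b = (x0 + 2 * dx, y0 + 2 * dy)
    s = math.sin(math.pi / 3)  # height factor of the equilateral bump
    peak = (a[0] + dx * 0.5 - dy * s, a[1] + dy * 0.5 + dx * s)
    pts = []
    for start, end in [(p, a), (a, peak), (peak, b), (b, q)]:
        pts += koch(start, end, depth - 1)
    return pts

depth = 5
points = koch((0.0, 0.0), (1.0, 0.0), depth) + [(1.0, 0.0)]

def length(pts):
    return sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

# Coarser "rulers": keeping every 4**k-th vertex recovers the shallower,
# shorter versions of the curve -- measured length shrinks as the ruler grows.
for k in range(depth + 1):
    coarse = points[:: 4 ** k]
    if coarse[-1] != points[-1]:
        coarse.append(points[-1])
    print(4 ** k, round(length(coarse), 4))
```

    A log-log plot of ruler size against measured length gives a straight line whose slope encodes the fractal dimension, exactly the coastline construction described above.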


    Recently, a relationship has been shown between fractals and the curious concept of entropy. Fractals can be shown to maximize mathematical entropy across multiple scales, constrained by the information content of the fractal. This correspondence has interesting implications for the interpretation of self-similarity in Nature, human creativity, and Revelation(!).

    Next: Understanding Entropy

    Analogy in Science

    The discovery (in a realistic sense; the subjectivist would say “formulation”) of laws and their manifestation are the essential tasks of the physical sciences: physics, chemistry, astronomy, geology, biology… Those laws which are continuously affirmed and reaffirmed by experiment and/or observation comprise the standard theories of their science. Much of normal science (in Thomas Kuhn’s sense) is the elaboration of theory, especially definition of the domain of its application. Thus, Newton’s law of gravitation finds application within the range of observation capabilities of the late Seventeenth Century; in modern terms this means objects moving at relative velocities significantly less than the speed of light and minimal effects due to other forces (electromagnetic, Strong, and Weak). At high relative velocities, Einstein’s theory of relativity comes into play, while the non-gravitational forces are significant at molecular and smaller scales.


    Whether expressed in words (the gravitational force between two objects is proportional to the product of their masses and inversely proportional to the square of the distance separating them) or algebraically (F = G m1 m2 / r2), the words and alphanumerical characters are analogs for the inferred law and are literally symbolic. Yet, while symbols in a mathematical context do not have the same symbolic depths or meaning as religious and mythic symbols (such as the Cross or the grail), in non-scientific contexts “Newtonian gravitation” does evoke symbolic depths, especially in a Calvinistic mechanical worldview. Hierarchies of causation are implied by the equality character (=).


    Even before the onset of relativity and quantum mechanics, mathematicians and physicists began to recognize that pure equality is not observed in nature. So, for example, measured gravity is only approximated by Newton’s law:

    F = G m1 m2 / r2 +/- e

    where +/- e means plus or minus “error”. The so-called error term in any physical equation can incorporate several effects. If a measurement differs significantly from that predicted by theory (e is large), (1) other phenomena might be contributing to the observed effect, (2) the measurement device is poorly designed, or (3) the theory might be wrong or at least inappropriately applied. For example, (1) measured gravity on the surface of the earth is affected by the distribution of mass within the earth, rotation of the earth, lunar and solar effects (“earth tides”, due to the masses of the moon and sun), and planetary gravitational effects (to a much lesser extent than lunar and solar). For (2), early pendulum “gravity meters” were big, awkward, and imprecise; more precise and accurate meters have been developed in more recent years. And for (3), at high velocities, relativity must be taken into account; Newton’s law is inadequate.


    Another expression of inexactitude, instead of the error term, could involve the “approximately equal” character (≈). A physicist would be reluctant to interpret the character as meaning “similar”, but it is close. The gravitational constant term, G, in Newton’s law is the proportionality factor that could also be thought of as a similarity coefficient. The reluctance of a scientist to use “similarity” in either case is largely due to the fact that Newton’s law produces a scalar (that is, single-valued) result. Where similarity more comfortably comes into play is in the comparison of multi-valued objects or data sets. For example, there are measures for comparing two digital electronic signals (ordered sets of numbers) – cross correlation, semblance, and coherency. Sometimes, a signal (or ordered set) possesses some sort of self-similarity, either in a repetitive, constant-frequency pattern (for example, a musical note) or a structure that replicates itself at a range of different scales.


    Next: Sonnet Similarity