Michael B. Elzinga, Kalamazoo Area Mathematics and Science Center. A Presentation for Science Café, The Ohio State University at Marion, October 5, 2010.
Prior to About 1970: Public Impressions About Thermodynamics Were Hazy But Not Wrong
In the US, the public's (usually males') ideas about thermodynamics were approximately:
No perpetual motion machines. Can't get 100% efficiency. Useful energy degrades (entropy???)
The US love affair with automobiles and machinery may have had some influence.
Public understanding of thermodynamics concepts then took a dramatic wrong turn. References to energy and efficiency disappeared. Entropy, disorder, and information became conflated. Corrective attempts by the science community have been generally unsuccessful.
Everything runs down, decays, reverts to simpler forms or states, and everything becomes disorganized into complete chaos, which is the highest state of entropy.
Messy rooms, rust and wear. Tornado in a junkyard (raw energy → 747s). Aging and death, mutations, copying errors. Genetic entropy (e.g., wolf to Chihuahua).
Order and information are the opposite of entropy (entropy is a measure of disorder).
Open systems are even worse than closed systems for producing order.
Thus the 2nd law is claimed to be a barrier to order and specified complexity or increased information.
Some kind of work must be done to overcome the law of entropy (the 2nd law) and climb uphill toward order and specified complexity or information (lower entropy). Functionless crystals are low energy order; living organisms are high energy order. Some kind of intelligence must do the planning.
Some kind of information or program must guide, be contained within, or flow through such a system to organize it and keep it from following the natural trend of all things in the universe toward higher entropy. Therefore, physics and chemistry cannot explain organization, complexity, functionality, and life in the universe.
Henry M. Morris, Duane Gish, Gary Parker, et al. Saturation bombing of the media. Narrative: pit the myth of evolution against the science of thermodynamics.
Dembski, Behe, et al. and their treatments of the complex assemblies of molecules.
From Evolution, Thermodynamics, and Entropy by Henry Morris of the Institute for Creation Research (http://www.icr.org/articles/view/51/247/)
The very terms themselves [Morris is speaking of evolution and entropy] express contradictory concepts. The word "evolution" is of course derived from a Latin word meaning "out-rolling". The picture is of an outward-progressing spiral, an unrolling from an infinitesimal beginning through ever broadening circles, until finally all reality is embraced within. "Entropy," on the other hand, means literally "in-turning." It is derived from the two Greek words en (meaning "in") and trope (meaning "turning"). The concept is of something spiraling inward upon itself, exactly the opposite concept to "evolution." Evolution is change outward and upward, entropy is change inward and downward.
Mistakes and misuses in textbooks for general science requirements (e.g., entropy equivalent to disorder). Use of spatial arrangements to explain permutations and combinations of energy states.
Information theory, signal/image processing, data storage. Same terms (e.g., entropy) but with different meanings. More conflations with disorder and loss of information.
Postmodernist arguments.
Science is a social construct. Evolution & thermodynamics; constructed separately, now in direct conflict.
Most of the conceptual problems begin at a more basic level, prior to thermodynamics. Revisit some fundamental facts and concepts.
From the perspectives of classical thermodynamics and engineering, energy is divided up into:
Work and Heat (heat being measured operationally as the mechanical work done on a system that will produce the same change in temperature).
Empirical temperature (degrees of heat) is what is read from a thermometer. Heat flows spontaneously from higher temperatures to lower temperatures. Objects at the same temperature are in thermal equilibrium.
Entropy increases in spontaneous transfers of heat. The entire subject is built on empirical measurements, without the need to know microscopic details. That generality is its greatest strength. But not addressing the microscopic details is also its major weakness.
From the perspective of modern physics, it all comes down to particles and fields. Kinetic Energy is contained in the motion of particles.
Potential Energy is stored in the fields with which the particles interact.
This fundamental fact of nature is omitted or misrepresented in most of the misconceptions about the 2nd law of thermodynamics. These interactions occur everywhere and at every level of complexity in the universe.
Quarks and gluons => nucleons (protons, neutrons) Nuclear stickiness (nuclei cooked in stars & supernovae) Nucleons and electrons => all the atoms in the periodic table Atoms + atoms => molecules, compounds Atoms and molecules => liquids, solids, alloys Wetting, capillary action, fogging, friction Polymer chain and membrane folding Osmosis, ion pumping Hydrogen, ionic, and covalent bonds; Van der Waals forces Superfluidity and superconductivity Bose-Einstein and Fermi-Dirac interactions Gravitational stickiness
Gravitation Solids, liquids; the air molecules you breathe Friction (just try walking without any) Wetting or fogging of glasses; meniscus Everything you see, smell, taste, feel, hear The cohesion of your very selves Isolated systems of matter are extremely rare
The stickiness of matter involves the concept of falling into a well and remaining there. Energy has to be shed. (Fundamental !!!)
Harmonic Oscillator
Total Energy = Kinetic Energy + Potential Energy
[Figure: energy vs. distance for the harmonic oscillator; the Total Energy is constant while Kinetic Energy and Potential Energy trade off.]
[Figure: energy vs. height for a bouncing ball; the Total Energy is shared among Kinetic Energy, Deformation Potential Energy, and Gravitational Potential Energy.]
[Figure: A Captured Particle; kinetic and potential energy vs. distance for a particle trapped in a potential well.]
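To make the "captured particle" picture concrete, here is a minimal sketch (not from the presentation; the mass, spring, and damping parameters are arbitrary) of a damped harmonic oscillator shedding energy until it settles at the bottom of its potential well:

```python
# "Falling into a well": a damped harmonic oscillator sheds energy until it
# comes to rest at the well bottom. Semi-implicit Euler integration.
M, K, C = 1.0, 4.0, 0.5   # mass, spring constant, damping coefficient (arbitrary)
DT = 0.001                # time step, seconds

def total_energy(x: float, v: float) -> float:
    """Kinetic plus potential energy of the oscillator."""
    return 0.5 * M * v * v + 0.5 * K * x * x

x, v = 1.0, 0.0                      # released from rest, away from the well bottom
e_start = total_energy(x, v)
for _ in range(100_000):             # 100 s of simulated time
    v += (-K * x - C * v) / M * DT   # acceleration from spring force + damping
    x += v * DT

print(total_energy(x, v) < e_start)  # True: energy was shed to the surroundings
print(abs(x) < 1e-3)                 # True: settled at the bottom of the well
```

Without the damping term there is nowhere for the energy to go, and the particle oscillates forever instead of being captured; that is the "energy has to be shed" point.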
Remember: Radiation
Remember: Convection
Remember: Conduction
New phenomena emerge that are not characteristic of the individual particles. These new phenomena influence the further evolution of increasingly complex systems. The new phenomena, or the way a complex system interacts with other systems, often become the basis for describing a complex system. Higher level phenomena become blends or summaries of lower level phenomena.
The various ways a particle or an assembly of particles can translate, rotate, and/or vibrate. Translation in 3-D space, for example, contributes three degrees of freedom.
Temperature turns out to be proportional to the Average Kinetic Energy per Degree of Freedom: average KE per degree of freedom = (1/2) kB T (classical limit).
Explains why heat flows from high T to low T. The temperature and energy scales evolved historically independently of each other, so a new conversion constant is needed.
kB is Boltzmann's constant (8.617 × 10^-5 eV/K). T is the absolute temperature (e.g., in kelvin).
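As a quick check of the scale this sets, a short sketch using the value of kB from the slide (the function name is mine):

```python
# Average kinetic energy per degree of freedom, <KE> = (1/2) * kB * T,
# in the classical limit, using kB in eV/K as quoted on the slide.
KB = 8.617e-5  # Boltzmann's constant, eV/K

def avg_ke_per_dof(temperature_k: float) -> float:
    """Average kinetic energy per degree of freedom (classical limit), in eV."""
    return 0.5 * KB * temperature_k

room = avg_ke_per_dof(300.0)  # roughly room temperature
print(f"Per degree of freedom at 300 K: {room:.4f} eV")  # ~0.0129 eV
print(f"kB * T at 300 K: {KB * 300.0:.4f} eV")           # ~0.0259 eV
```

Note that kB T at room temperature lands in the 0.01 to 0.04 eV range quoted later for liquid water.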
For quantum mechanical systems, the energy of each degree of freedom is quantized.
Concept: Entropy
Rudolf Clausius, in 1865, gives a name to a recurring mathematical expression: τροπή, "a transformation" (Clausius's translation?).
Clausius modified it to entropy to associate S with the concept of energy and give the word a similar sound. Its deeper meaning requires an understanding of the microscopic nature of the empirical temperature and the internal energy of a thermodynamic system.
ΔS = ∫ dQ/T (continuous summation over the heat transferred).
Heat Q leaving the hot body: −Q/Thigh (entropy decrease).
Heat Q entering the cold body: +Q/Tlow (entropy increase).
Heat flows spontaneously from high temperature to low. Decrease in Shigh < increase in Slow (same Q, lower T). In such a spontaneous process, overall entropy increases.
(Well, duh! Given the definition, what did you expect?)
You keep track of miles traveled. You also keep track of gallons and cost of gas.
You know that the more miles you travel, the more gas you use and the more it costs. Suppose you divide miles by gallons: you get a new, useful quantity (miles per gallon) that neither number reveals on its own. Dividing Q by T defines a new quantity in the same way.
From A Source Book in Physics, Edited by William Francis Magie, Harvard University Press, 1963, page 234
If we wish to designate S by a proper name we can say of it that it is the transformation content of the body, in the same way that we say of the quantity U that it is the heat and work content of the body. However, since I think it is better to take the names of such quantities as these, which are important for science, from the ancient languages, so that they can be introduced without change into all the modern languages, I propose to name the magnitude S the entropy of the body, from the Greek word τροπή, a transformation. I have intentionally formed the word entropy so as to be as similar as possible to the word energy, since both these quantities, which are to be known by these names, are so nearly related to each other in their physical significance that a certain similarity in their names seemed to me advantageous.
Rudolf Clausius in Annalen der Physik und Chemie, Vol. 125, p. 353, 1865, under the title "Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie" ("On Several Convenient Forms of the Fundamental Equations of the Mechanical Theory of Heat").
Clausius is referring to a working concept about nature that already has some interesting properties, and he gives it a proper name that can be used in any language. He is not using exegesis or the etymology of a Greek word to extract its proper meaning and then insinuating that this meaning explains how nature works.
Entropy refers to the number of accessible microstates that is consistent with the macroscopic state of a thermodynamic system. Macroscopic state: specified by empirical temperature, pressure, total energy, number of particles, volume, magnetization, ...
Accessible microstate: one of the ways the total energy of the system can be distributed among all those quantized degrees of freedom.
Example: 3 quanta of energy distributed among 3 degrees of freedom gives Ω = 10 accessible microstates:

df1  df2  df3
 3    0    0
 0    3    0
 0    0    3
 2    1    0
 2    0    1
 1    2    0
 0    2    1
 1    0    2
 0    1    2
 1    1    1
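The count of ten can be verified by brute force; this short sketch enumerates every way 3 quanta can be split among 3 degrees of freedom:

```python
from itertools import product

# Enumerate the accessible microstates: every way QUANTA indistinguishable
# quanta of energy can be distributed among DOF quantized degrees of freedom.
QUANTA, DOF = 3, 3

microstates = [combo for combo in product(range(QUANTA + 1), repeat=DOF)
               if sum(combo) == QUANTA]

for state in microstates:
    print(state)                      # each (df1, df2, df3) row of the table
print(f"Omega = {len(microstates)}")  # Omega = 10
```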
S = kB ln Ω (isolated system)
More generally: S = −kB Σi Pi ln Pi
Pi = probability of the ith microstate. When the system is isolated, all Pi become 1/Ω and S is maximized (because matter interacts).
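A small sketch (the function and variable names are mine) showing that the general Gibbs form reduces to kB ln Ω in the isolated, equal-probability case, and that any uneven distribution over the same microstates gives lower entropy:

```python
import math

KB = 8.617e-5  # Boltzmann's constant, eV/K

def gibbs_entropy(probs):
    """S = -kB * sum(P_i * ln P_i); terms with P_i = 0 contribute nothing."""
    return -KB * sum(p * math.log(p) for p in probs if p > 0)

omega = 10                    # microstate count from the 3-quanta example
uniform = [1.0 / omega] * omega
skewed = [0.91] + [0.01] * 9  # same 10 microstates, unevenly populated

# Equal probabilities recover the isolated-system formula S = kB ln(omega):
print(math.isclose(gibbs_entropy(uniform), KB * math.log(omega)))  # True
# Any non-uniform distribution over the same states has lower entropy:
print(gibbs_entropy(uniform) > gibbs_entropy(skewed))              # True
```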
1/T = ∂S/∂E and X/T = ∂S/∂x (X is the generalized force conjugate to the variable x)
It is spread, in various combinations, among all those quantized degrees of freedom of all the molecules or other constituents from which the system is constructed (accessible microstates).
The entropy tells us how many of these accessible microstates there are for a given macroscopic state. And that will be the most probable number (because matter interacts with matter).
The Carnot cycle is the extreme upper limit for the efficiencies of all possible heat engine cycles. There is always a portion of a heat engine cycle in which work has to be done against the working medium at temperatures below Tsource. So ultimately, the efficiency of a heat engine depends on the difference in pressure on either side of the piston.
Back pressure of a working medium on the output side reduces efficiency. Most other practical cycles also throw away matter.
[Figure: heat engine schematic; heat flows in from Tsource, work is extracted (Work Output), and waste heat is rejected to Tsink.]

Construction Material | Melting Point (K) | Melting Point (°F) | Carnot Efficiency (%) | Practical (~½ Carnot) (%)
…                     | …                 | …                  | 68                    | 34
…                     | …                 | …                  | 83                    | 41
…                     | …                 | …                  | 85                    | 43
…                     | …                 | …                  | 91                    | 46
Tungsten              | 3680              | 6164               | 92                    | 46
Carbon                | 4100              | 6920               | 93                    | 47
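The Carnot percentages in the table can be reproduced from the melting points; the ~300 K sink temperature used below is an assumption on my part:

```python
# Carnot limit: eta = 1 - T_sink / T_source. Taking the source at the
# construction material's melting point and the sink near room temperature
# (~300 K, assumed) reproduces the tungsten and carbon table entries.
def carnot_efficiency(t_source_k: float, t_sink_k: float = 300.0) -> float:
    """Upper-limit efficiency of any heat engine between two temperatures."""
    return 1.0 - t_sink_k / t_source_k

for material, melt_k in [("Tungsten", 3680.0), ("Carbon", 4100.0)]:
    eta = carnot_efficiency(melt_k)
    print(f"{material}: melts at {melt_k:.0f} K, Carnot limit {eta:.0%}")
    # Tungsten: 92%; Carbon: 93%
```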
What does thermodynamics, especially the 2nd law, have to do with evolution and life?
Quarks & gluons: ~ billions of eV (GeV, 10^9 eV)
Nuclei: ~ millions of eV (MeV, 10^6 eV)
Inner electron shells of atoms: ~ keV (~10^3 eV)
Hydrogen atom: 13.6 eV (~10 eV)
Chemical bonds: on the order of 1 to 5 eV
Solids (Al, Cu, Fe, W): on the order of 0.1 to 0.4 eV
Liquid water: on the order of 0.01 to 0.04 eV
Life exists within the energy range of liquid water (neuron action potentials: approx. 0.07 eV).
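A rough Boltzmann-factor estimate (illustrative, not from the slides) of why ~1 eV chemical bonds are stable against thermal jostling at liquid-water temperatures:

```python
import math

# Boltzmann factor exp(-E / kB T): the relative probability that thermal
# motion delivers a fluctuation of energy E at temperature T.
KB = 8.617e-5  # Boltzmann's constant, eV/K

def boltzmann_factor(energy_ev: float, temperature_k: float) -> float:
    """Relative probability of a thermal fluctuation of size energy_ev."""
    return math.exp(-energy_ev / (KB * temperature_k))

T_BODY = 310.0  # K, within the liquid-water range where life operates

print(boltzmann_factor(0.025, T_BODY))  # water-scale jiggling: ~0.39, common
print(boltzmann_factor(1.0, T_BODY))    # ~1 eV bond: ~5.5e-17, essentially never
```

Fluctuations at the 0.01 to 0.04 eV water scale happen constantly, while spontaneously breaking a 1 eV bond is fantastically rare; that separation of scales is what lets chemistry at the bottom of deep wells stay put while the shallow-well machinery of life keeps rearranging.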
Potential wells become increasingly shallow and enormously more complex. Increasingly complex phenomena emerge from the energy-driven, collective behaviors of these systems at each level. Quantum mechanics and emergent interactions set the stage for each level of complexity. No program drives whatever emerges at each stage; nothing is planned ahead.
Matter condenses into increasingly complex, organized systems all by itself. Energy must be released for matter to condense; and it is. (Fundamental!) Energy flows are necessary for matter to explore wide ranges of complexity, and to drive collective organizational behaviors. There are no known barriers to energy-driven organization and complexity. But one doesn't play a violin with a chainsaw.
The Second Law and the Emergence of Shallow-Well Systems (e.g., life)
Delicate (shallow-well) systems are most likely formed in energy cascades. Such a system can survive by being shuttled into a less energetic environment where it can stabilize, anneal, and evolve further.
Further evolution builds on the underlying template formed at higher energies.
The evolution of complex living systems is not a trend toward lower entropy systems. There is no general correlation between entropy and an organism's physical structure, or its place within an evolutionary chain. For example, entropy increases with volume and number of particles.
Lower Entropy is Not Synonymous With Greater Intelligence or Greater Organization or Order
An amoeba has lower entropy (fewer accessible microstates) than a human. Is the amoeba more advanced? As animals grow to adulthood, their total volume, energy content, and number of accessible microstates (entropy) increase enormously. Are they now stupider?
Evolution is not an upward climb against entropy; entropy is not a barrier. Evolution is energy-driven matter exploring increasingly complex arrays of increasingly shallow potential energy wells. Matter snapping into those wells requires the shedding of energy. The 2nd law is required for evolution to occur.
         | Misconception                                      | Reality
2nd Law  | Everything descends spontaneously into chaos       | Energy spreads out because matter interacts
Entropy  | Amount of disorder or chaos; lack of information   | Refers to the number of accessible energy microstates
…        | Spontaneous molecular chaos                        | Interacts and condenses
…        | Impossible without guiding intelligence or program | Emerges from the interactions of matter
[Figure: a plot with axes "Information" vs. "Time", set against the Thermodynamic Arrow of Time (Second Law, Increasing Entropy); the slide labels the claimed information-entropy conflict JUNK SCIENCE.]
In Summary
The End