
Order, Disorder, and Entropy: Misconceptions and Misuses of Thermodynamics

Michael B. Elzinga, Kalamazoo Area Mathematics and Science Center. A Presentation for Science Café, The Ohio State University at Marion, October 5, 2010

Prior to About 1970: Public Impressions About Thermodynamics Were Hazy But Not Wrong

In the US, the public's (usually males') ideas about thermodynamics were approximately:

No perpetual motion machines
Can't get 100% efficiency
Useful energy degrades (entropy???)

The US love affair with automobiles and machinery may have had some influence.

Throughout the 1970s and 1980s New Memes Spread Rapidly

Public understanding of thermodynamics concepts took a dramatic wrong turn.
No longer references to energy and efficiency.
Conflations: entropy, disorder, information.
Corrective attempts by the science community were generally unsuccessful.

What These Memes Say About the 2nd Law

Everything runs down, decays, reverts to simpler forms or states, and everything becomes disorganized into complete chaos, which is the highest state of entropy.

Messy rooms, rust and wear
Tornado in a junkyard (raw energy → 747s)
Aging and death, mutations, copying errors
Genetic entropy (e.g., wolf to Chihuahua)

Order and information are the opposite of entropy (entropy is a measure of disorder).

What These Memes Say About the 2nd Law

Open systems are even worse than closed systems for producing order.

Sunlight shining on lumber → a house.
Tornado in a junkyard increases entropy.

Thus the 2nd law is claimed to be a barrier to order and specified complexity or increased information.

What These Memes Say About the 2nd Law

Some kind of work must be done to overcome the law of entropy (the 2nd law) and climb uphill toward order and specified complexity or information (lower entropy). Functionless crystals are low energy order; living organisms are high energy order. Some kind of intelligence must do the planning.

What These Memes Say About the 2nd Law

Some kind of information or program must guide, be contained within, or flow through such a system to organize it and keep it from following the natural trend of all things in the universe toward higher entropy. Therefore, physics and chemistry cannot explain organization, complexity, functionality, and life in the universe.

This Entire Conceptual Framework is Not About Thermodynamics

It completely misses the mark about the 2nd law of thermodynamics.


It mangles the concept of entropy.

It is a misleading representation of how the universe actually works.

Primary Origin of These Memes

Creationists (from the early 1960s to the present).

Henry M. Morris, Duane Gish, Gary Parker, et al.
Saturation bombing of the media.
Narrative: Pit the myth of evolution against the science of thermodynamics.

Intelligent design proponents (after 1987).

Dembski, Behe, et al., and their treatments of the complex assemblies of molecules.

From Evolution, Thermodynamics, and Entropy by Henry Morris of the Institute for Creation Research (http://www.icr.org/articles/view/51/247/)

The very terms themselves [Morris is speaking of evolution and entropy] express contradictory concepts. The word "evolution" is of course derived from a Latin word meaning "out-rolling". The picture is of an outward-progressing spiral, an unrolling from an infinitesimal beginning through ever broadening circles, until finally all reality is embraced within. "Entropy," on the other hand, means literally "in-turning." It is derived from the two Greek words en (meaning "in") and trope (meaning "turning"). The concept is of something spiraling inward upon itself, exactly the opposite concept to "evolution." Evolution is change outward and upward, entropy is change inward and downward.

The Evolution vs. 2nd Law Narrative


(e.g., in What is Creation Science? by Henry M. Morris and Gary E. Parker, 1982)
[Figure: "Time's Arrows: Evolution vs. Science." Information plotted against time, with a "historical arrow of time" (evolution, decreasing entropy) rising while the "thermodynamic arrow of time" (second law, increasing entropy) falls.]

Other Contributions to These Memes

Well-intentioned but sloppy popularizations and metaphors.


Mistakes and misuses in textbooks for general science requirements (e.g., entropy equated with disorder).
Use of spatial arrangements to explain permutations and combinations of energy states.

The development and rapid growth of new fields.


Information theory, signal/image processing, data storage.
Same terms (e.g., entropy) but with different meanings.
More conflations with disorder and loss of information.

Postmodernist arguments.

Science is a social construct.
Evolution & thermodynamics: constructed separately, now in direct conflict.

Nick Downes Nails the Physicists' Contributions

Rebuilding Conceptual Foundations

Most of the conceptual problems begin at a more basic level, prior to thermodynamics. Revisit some fundamental facts and concepts.

Demonstrate the use of energy diagrams.


Define temperature and entropy (with some examples).

One slide will be a technical aside; I will translate.

Relate the 2nd law to evolution and living organisms.

Thermodynamics is About the Bookkeeping of Energy


(It has never been otherwise)

From the perspectives of classical thermodynamics and engineering, energy is divided up into:

Mechanical work (force × distance),
Internal energy within a working medium,
Heat (non-mechanical transfers of energy).

Thermodynamics Before Statistical Mechanics

Heat is measured in terms of mechanical equivalents of heat.

(i.e., the mechanical work done on a system that will produce the same change in temperature).

Empirical temperature (degrees of heat) is what is read from a thermometer.
Heat flows spontaneously from higher temperatures to lower temperatures.
Objects at the same temperature are in thermal equilibrium.

(i.e., no net heat flow between them).

Thermodynamics Before Statistical Mechanics

Entropy increases in spontaneous transfers of heat.
The entire subject was built on empirical measurements, without the need to know microscopic details.
That generality is its greatest strength.
But not addressing the microscopic details is also its major weakness.

Modern Thermodynamics After Statistical Mechanics

From the perspective of modern physics, it all comes down to particles and fields. Kinetic Energy is contained in the motion of particles.
Potential Energy is stored in the fields with which the particles interact.

Matter Interacts With Matter

This fundamental fact of nature is omitted or misrepresented in most of the misconceptions about the 2nd law of thermodynamics. These interactions occur everywhere and at every level of complexity in the universe.

(There would be no universe without them.)

Physicists and chemists study these interactions by taking matter apart.

Matter is Sticky; It Condenses


Quarks and gluons => nucleons (protons, neutrons)
Nuclear stickiness (nuclei cooked in stars & supernovae)
Nucleons and electrons => all the atoms in the periodic table
Atoms + atoms => molecules, compounds
Atoms and molecules => liquids, solids, alloys
Wetting, capillary action, fogging, friction
Polymer chain and membrane folding
Osmosis, ion pumping
Hydrogen, ionic, and covalent bonds; van der Waals forces
Superfluidity and superconductivity
Bose-Einstein and Fermi-Dirac interactions
Gravitational stickiness

In Fact, You Are Immersed in Matter Interactions Everywhere You Go


Gravitation
Solids, liquids; the air molecules you breathe
Friction (just try walking without any)
Wetting or fogging of glasses; meniscus
Everything you see, smell, taste, feel, hear
The cohesion of your very selves
Isolated systems of matter are extremely rare

Gravitational and Nuclear Stickiness

Stickiness in Very Complex Systems

The Fundamentals of Stickiness

The stickiness of matter involves the concept of falling into a well and remaining there. Energy has to be shed. (Fundamental !!!)

E.g., falling down a manhole

Splat effect dissipates energy

We will look at how energy is shed at the microscopic level.

Harmonic Oscillator
Total Energy = Kinetic Energy + Potential Energy
[Figure: energy vs. distance for a harmonic oscillator; the constant total energy trades back and forth between kinetic and potential energy.]

An Object Falling Near Earth's Surface

Total Energy = Kinetic Energy + Potential Energy

[Figure: energy vs. height for a falling object; the total energy is shared among kinetic energy, gravitational potential energy, and deformation potential energy.]

Energy Barrier of an Atomic Nucleus

Incoming particle might be a proton

[Figure: energy vs. distance for a nuclear potential well with an energy barrier; a scattered particle has total energy above the barrier, while a captured particle sits at lower energy inside the well.]

Mechanisms for Shedding Energy

Photons (quantized light particles), gravitons

Remember: Radiation

Particles (atoms, molecules, neutrinos, etc.)

Remember: Convection

Phonons (quantized sound waves in solids and liquids)

Remember: Conduction

Matter Condenses by Shedding Energy

Continuous loss (classical physics)


[Figure: classical picture; a particle's total and kinetic energy decrease continuously with distance as it sinks into the potential well and becomes increasingly bound.]

Matter Condenses by Shedding Energy

Discrete jumps (quantum physics)


[Figure: quantum picture; an unbound particle with kinetic energy sheds energy in discrete jumps, emitting photons or phonons, and drops between the quantized levels of the potential well where bound particles sit.]

Condensing Matter Behaves Collectively

New phenomena emerge that are not characteristic of the individual particles. These new phenomena influence the further evolution of increasingly complex systems. The new phenomena, or the way a complex system interacts with other systems, often become the basis for describing a complex system. Higher level phenomena become blends or summaries of lower level phenomena.

Concept: Degrees of Freedom

The various ways a particle or an assembly of particles can translate, rotate, and/or vibrate.

Translation, for example in 3-D space
Rotation in 3-D space
Vibration in various modes

Temperature turns out to be proportional to the Average Kinetic Energy per Degree of Freedom

Avg. K.E. per d.f. = ½ kB T

(classical limit)

Explains why heat flows from high-T to low-T.
The temperature and energy scales evolved historically independently of each other, so a conversion constant is needed.

kB is Boltzmann's constant (8.617 × 10^-5 eV/K); T is the absolute temperature (e.g., in kelvin).

For quantum mechanical systems, the energy of each degree of freedom is quantized.
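As a quick numeric sketch, here is the standard classical equipartition result, ½ kB T of average kinetic energy per quadratic degree of freedom, evaluated with the slide's value of kB (the function name is mine):

```python
# Equipartition sketch: in the classical limit, the average kinetic
# energy per (quadratic) degree of freedom is (1/2) * kB * T.
KB = 8.617e-5  # Boltzmann's constant in eV/K (value from the slide)

def avg_ke_per_dof(T):
    """Average kinetic energy per degree of freedom, in eV."""
    return 0.5 * KB * T

T_room = 300  # K
print(avg_ke_per_dof(T_room))      # ~0.013 eV per degree of freedom
print(3 * avg_ke_per_dof(T_room))  # ~0.039 eV for 3 translational dofs
```

Note how small these energies are compared with the ~1-5 eV of chemical bonds: room-temperature jostling cannot casually break molecules apart.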

Concept: Entropy

Rudolph Clausius, in 1865, gives a name to a recurring mathematical expression: τροπή, "a transformation" (Clausius's translation?).

(Note: Umgestaltung and not Umdrehung?)

Clausius modified it to "entropy" to associate S with the concept of energy and give the word a similar sound. Its deeper meaning requires an understanding of the microscopic nature of the empirical temperature and the internal energy of a thermodynamic system.

Divide each tiny quantity of heat transferred by the temperature at which it is transferred; then sum (integrate) continuously over the process:

ΔS = ∫ δQ / T
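As a numeric sketch of this definition, assume a finite amount of heat Q passes between two reservoirs large enough that their temperatures stay fixed (units arbitrary, e.g. joules and kelvin; the function name is mine):

```python
# Sketch: apply dS = dQ/T to a finite heat transfer Q between two
# reservoirs whose temperatures are held fixed by their large size.
def entropy_changes(Q, T_high, T_low):
    dS_high = -Q / T_high          # hot reservoir loses heat Q
    dS_low = +Q / T_low            # cold reservoir gains heat Q
    return dS_high, dS_low, dS_high + dS_low

dS_h, dS_l, total = entropy_changes(Q=1000.0, T_high=400.0, T_low=300.0)
print(dS_h, dS_l, total)  # -2.5, ~3.33, ~0.83: overall entropy increases
```

The same Q divided by a lower T gives a larger magnitude, so the cold side's gain always beats the hot side's loss.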

Rate of Heat Flow Proportional to (Thigh − Tlow)

[Diagram: heat Q flows from a reservoir at Thigh, whose entropy changes by −Q/Thigh (a decrease), to a reservoir at Tlow, whose entropy changes by +Q/Tlow (an increase).]

Heat flows spontaneously from high temperature to low.
Decrease in Shigh < increase in Slow (same Q, lower T).
In such a spontaneous process, overall entropy increases.
(Well, duh! Given the definition, what did you expect?)

This is not really as goofy as it sounds.

Analogy: Suppose you travel a lot in your car.

You keep track of miles traveled. You also keep track of gallons and cost of gas.

You know that the more miles you travel, the more gas you use and the more it costs. Suppose you divide miles by gallons.

Hmm, a useful perspective.

Finally someone says, "I propose to call this quantity the mileage."

From A Source Book in Physics, Edited by William Francis Magie, Harvard University Press, 1963, page 234

If we wish to designate S by a proper name we can say of it that it is the transformation content of the body, in the same way that we say of the quantity U that it is the heat and work content of the body. However, since I think it is better to take the names of such quantities as these, which are important for science, from the ancient languages, so that they can be introduced without change into all the modern languages, I propose to name the magnitude S the entropy of the body, from the Greek word τροπή, a transformation. I have intentionally formed the word entropy so as to be as similar as possible to the word energy, since both these quantities, which are to be known by these names, are so nearly related to each other in their physical significance that a certain similarity in their names seemed to me advantageous.

Rudolph Clausius in Annalen der Physik und Chemie, Vol. 125, p. 353, 1865, under the title "Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie" ("On Several Convenient Forms of the Fundamental Equations of the Mechanical Theory of Heat").

Footnote: Contrasting Science with Pseudo-science

Clausius is referring to a working concept about nature that already has some interesting properties, and he gives it a proper name that can be used in any language. He is not using exegesis or the etymology of a Greek word to extract its proper meaning and then insinuating that this meaning explains how nature works.

Entropy after Statistical Mechanics

Entropy refers to the number of accessible microstates consistent with the macroscopic state of a thermodynamic system.
Macroscopic state: specified by empirical temperature, pressure, total energy, number of particles, volume, magnetization, ...

Accessible microstate: one of the ways the total energy of the system can be distributed among all those quantized degrees of freedom.

Example: Accessible Microstates


Total Energy = 3 units, distributed among 3 degrees of freedom (df1, df2, df3):

df1  df2  df3
 3    0    0
 0    3    0
 0    0    3
 2    1    0
 2    0    1
 1    2    0
 0    2    1
 1    0    2
 0    1    2
 1    1    1

10 accessible microstates.

(Test questions: How about 1 unit? How about 0 units?)
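The microstate count can be verified by brute force; a short sketch enumerating every way the energy units can be split (the function name is mine):

```python
from itertools import product

def microstates(total_energy, n_dof):
    """Enumerate every way to distribute `total_energy` identical energy
    units among `n_dof` quantized degrees of freedom."""
    return [combo for combo in product(range(total_energy + 1), repeat=n_dof)
            if sum(combo) == total_energy]

print(len(microstates(3, 3)))  # 10, matching the table
print(len(microstates(1, 3)))  # 3  (test question: 1 unit)
print(len(microstates(0, 3)))  # 1  (test question: 0 units)
```

Notice that fewer energy units mean fewer accessible microstates; zero energy leaves exactly one.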

A Slightly Technical Aside


(Mainly for the math/science/engineering audience)

S = kB ln Ω

(isolated system)

Ω = number of accessible microstates

More generally: S = −kB Σi Pi ln Pi

Pi = probability of the ith microstate.
When the system is isolated, all Pi become 1/Ω and S is maximized (because matter interacts).

1/T = ∂S/∂E

and

X/T = ∂S/∂x

Also partition functions (i.e., ∂lnZ/∂x).
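A sketch of the Gibbs formula applied to the Ω = 10 example above; the `skewed` distribution is a hypothetical non-uniform case, included only to show that the uniform distribution maximizes S:

```python
import math

KB = 8.617e-5  # Boltzmann's constant, eV/K

def gibbs_entropy(probs):
    """S = -kB * sum(p * ln p); terms with p == 0 contribute nothing."""
    return -KB * sum(p * math.log(p) for p in probs if p > 0)

omega = 10  # accessible microstates from the 3-units-in-3-dofs example
uniform = [1 / omega] * omega
skewed = [0.5] + [0.5 / (omega - 1)] * (omega - 1)  # hypothetical bias

print(gibbs_entropy(uniform))                          # equals kB * ln(10)
print(gibbs_entropy(uniform) > gibbs_entropy(skewed))  # True
```

With all Pi equal to 1/Ω, the sum collapses to kB ln Ω, recovering the isolated-system formula.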

Internal Energy of a System

This is the total energy contained within a macroscopic system.

It is spread, in various combinations, among all those quantized degrees of freedom of all the molecules or other constituents from which the system is constructed (accessible microstates).

The entropy tells us how many of these accessible microstates there are for a given macroscopic state. And that will be the most probable number (because matter interacts with matter).

Application: Making Ice Cream (Adiabatic Cooling)

Salt breaks ice bonds → more degrees of freedom.
Large bath of crushed ice + salt: more degrees of freedom, same total energy.
Less energy per degree of freedom → lower temperature.

[Diagram: the ice cream mixture sits in a thermally conducting container, immersed in the ice-salt bath inside an insulating container.]
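The argument can be sketched as a toy model: temperature tracks energy per degree of freedom, so freeing more degrees of freedom at fixed total energy lowers the temperature. The numbers and function name here are illustrative only, not real ice-bath values, and the classical ½ kB T relation is assumed:

```python
# Toy model: T follows from average KE per dof = (1/2) * kB * T,
# so at fixed total energy, more dofs means lower temperature.
KB = 8.617e-5  # eV/K

def temperature(total_ke_eV, n_dof):
    """Temperature implied by spreading total_ke_eV over n_dof dofs."""
    return (total_ke_eV / n_dof) / (0.5 * KB)

E = 1.0  # eV of kinetic energy in the bath (arbitrary toy value)
print(temperature(E, 80))   # fewer dofs: higher T
print(temperature(E, 100))  # salt frees more dofs, same E: lower T
```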

Application: Basic Heat Engine

The essentials of converting heat into work:

[Diagram: a basic heat engine draws heat from a source at Tsource, delivers work output, and rejects heat to a sink at Tsink; a p-V plot shows the Carnot cycle.]

Carnot Efficiency = 1 − Tsink / Tsource

The Carnot cycle is the extreme upper limit for the efficiencies of all possible heat engine cycles.
There is always a portion of a heat engine cycle in which work has to be done against the working medium at temperatures below Tsource.
So ultimately, the efficiency of a heat engine depends on the difference in pressure on either side of the piston.

Carnot Efficiency = 1 − Tsink / Tsource


Back pressure of a working medium on the output side reduces efficiency.
Most other practical cycles also throw away matter.

[Diagram: heat engine schematic; source at Tsource, work output, sink at Tsink.]

Theoretical & Realistic Heat Engine Efficiencies


Suppose: Discharge Temperature, Tdisch = 300 K (80 °F)

Construction  Melting Temperature  Max Theoretical           More Realistic
Material      Tmelt, K (°F)        Efficiency %              Efficiency %
                                   (1 − Tdisch/Tmelt) × 100  (× 1/2)
Aluminum        933 (1220)              68                       34
Iron           1809 (2797)              83                       41
Titanium       1943 (3038)              85                       43
Jet Fuel       3273 (5432)              91                       46
Tungsten       3680 (6164)              92                       46
Carbon         4100 (6920)              93                       47
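The theoretical column follows directly from the Carnot formula; this sketch recomputes it from the melting temperatures (the "realistic" column is just the slide's rule-of-thumb halving):

```python
# Recompute the table: Carnot efficiency = 1 - Tdisch/Tmelt, with the
# slide's rule-of-thumb "realistic" efficiency of roughly half that.
T_DISCH = 300  # K, discharge temperature

melting_points_K = {
    "Aluminum": 933, "Iron": 1809, "Titanium": 1943,
    "Jet Fuel": 3273, "Tungsten": 3680, "Carbon": 4100,
}

for name, t_melt in melting_points_K.items():
    carnot_pct = (1 - T_DISCH / t_melt) * 100
    print(f"{name:9s}  Carnot {carnot_pct:4.0f}%   realistic ~{carnot_pct / 2:4.0f}%")
```

Even a hypothetical engine built of carbon tops out near 93% in theory, which is why the realistic halves of these numbers bound everyday engines so far below 100%.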

Application: Friction Heating

Life in the Context of the Second Law

What does thermodynamics, especially the 2nd law, have to do with evolution and life?

Matter is sticky; it interacts and condenses.


Let's look at some binding energy benchmarks.

Some Approximate Binding Energy Benchmarks, in Electron Volts (eV)


Quarks & gluons: ~ billions of eV (GeV, 10^9 eV)
Nuclei: ~ millions of eV (MeV, 10^6 eV)
Inner electron shells of atoms: ~ keV (~10^3 eV)
Hydrogen atom: 13.6 eV (~10 eV)
Chemical bonds: on the order of 1 to 5 eV
Solids (Al, Cu, Fe, W): on the order of 0.1 to 0.4 eV
Liquid water: on the order of 0.01 to 0.04 eV
Life exists within the energy range of liquid water (neuron action potentials: approx. 0.07 eV).

Binding Energies Decrease with Increasing Complexity

Potential wells become increasingly shallow and enormously more complex.
Increasingly complex phenomena emerge from the energy-driven, collective behaviors of these systems at each level.
Quantum mechanics and emergent interactions set the stage for each level of complexity.
No program drives whatever emerges at each stage; nothing is planned ahead.

The Second Law is Not About Everything Falling All Apart


Matter condenses into increasingly complex, organized systems all by itself.
Energy must be released for matter to condense; and it is. (Fundamental!)
Energy flows are necessary for matter to explore wide ranges of complexity, and to drive collective organizational behaviors.
There are no known barriers to energy-driven organization and complexity.
But one doesn't play a violin with a chainsaw.

The Second Law and the Emergence of Shallow-Well Systems (e.g., life)

Delicate (shallow-well) systems are most likely formed in energy cascades. Such a system can survive by being shuttled into a less energetic environment where it can stabilize, anneal, and evolve further.
Further evolution builds on the underlying template formed at higher energies.

Living Systems Are Not Synonymous With Low Entropy Systems

The evolution of complex living systems is not a trend toward lower entropy systems. There is no general correlation between entropy and an organism's physical structure, or its place within an evolutionary chain. For example, entropy increases with volume and number of particles.

Lower Entropy is Not Synonymous With Greater Intelligence or Greater Organization or Order

An amoeba has lower entropy (fewer accessible microstates) than a human. Is the amoeba more advanced? As animals grow to adulthood, their total volume, energy content, and number of accessible microstates (entropy) increase enormously. Are they now stupider?

The 2nd Law is Required for Matter to Condense

Evolution is not an upward climb against entropy; entropy is not a barrier. Evolution is energy-driven matter exploring increasingly complex arrays of increasingly shallow potential energy wells. Matter snapping into those wells requires the shedding of energy. The 2nd law is required for evolution to occur.

Misconception vs. Reality

2nd Law
  Misconception: Everything descends spontaneously into chaos.
  Reality: Energy spreads out because matter interacts.

Entropy
  Misconception: Amount of disorder or chaos; lack of information.
  Reality: Refers to the number of accessible energy microstates.

Matter
  Misconception: Spontaneous molecular chaos.
  Reality: Interacts and condenses.

Evolution, Complexity
  Misconception: Impossible without guiding intelligence or program.
  Reality: Emerges from the interactions of matter.

The Evolution vs. 2nd Law Narrative


[Figure: the same "Time's Arrows: Evolution vs. Science" diagram of information vs. time, with the evolution arrow (decreasing entropy) opposing the thermodynamic arrow (increasing entropy), now stamped JUNK SCIENCE.]

In Summary

Thermodynamics is about the bookkeeping of energy.

Entropy counts accessible energy microstates.


Matter is sticky at every level of complexity.
Stickiness requires energy to be released.

The 2nd law is an intrinsic part of matter interactions.


Evolution explores every available energy state.

Evolution Explores All Available Energy States

The End