[A law] is more impressive the greater the simplicity of its
premises, the more different are the kinds of things it
relates, and the more extended its range of
applicability. Therefore, the deep impression which classical
thermodynamics made upon me. It is the only physical theory of
universal content concerning which I am convinced that, within
the framework of applicability of its basic concepts, it will
never be overthrown.
Albert Einstein, quoted in M. J. Klein, "Thermodynamics in
Einstein's Universe," Science 157 (1967), p. 509.
The law that entropy always increases -- the second law of
thermodynamics -- holds, I think, the supreme position among
the laws of Nature. If someone points out to you that your pet
theory of the universe is in disagreement with Maxwell's
equations -- then so much the worse for Maxwell's equations. If
it is found to be contradicted by observation -- well, these
experimentalists do bungle things sometimes. But if your
theory is found to be against the second law of
thermodynamics, I can give you no hope; there is nothing for
it but to collapse in deepest humiliation.
Sir Arthur Stanley Eddington, in The Nature of the Physical
World, Macmillan, New York, 1948, p. 74.
I suppose Einstein's friends told him he would be better off
not inventing such new uses of terms and sticking with
conventional frameworks, or people wouldn't understand or
would discount the work...nonetheless, Einstein, sitting at
the cafe imagining himself riding a beam of light, realized a
relationship to time that no one else saw...
This page provides a good primer:
http://en.wikipedia.org/wiki/Entropy. It bears on how I got
where I got with my own ideas about what is occurring in our
systems, and how to boil a generation down into a usable
tool...albeit using the term in a way that may not have been
used before...
I hope someday I can find the emails I sent to the professor
emeritus who is/was responsible for http://secondlaw.oxy.edu/.
As I understand it, it took Einstein years to figure out what
his insight that day created, but eventually things fell into
place...over time.
There is only one problem as I see it...
I'm not Einstein, duh.
However, my intuition has given me something that I think will
eventually be recognized as revealing a new way to think about
motive force, which we all know is energy and information, in
space and time...
My reference to entropy is not necessarily a traditional
idea...however:
French mathematician Lazare Carnot, in his 1803 paper
Fundamental Principles of Equilibrium and Movement, proposed
that in any machine the accelerations and shocks of the moving
parts represent losses of moment of activity. In other words,
in any natural process there exists an inherent tendency
towards the dissipation of useful energy.
Entropy is the only quantity in the physical sciences that
seems to imply a particular direction of progress, sometimes
called an arrow
of time.
As time progresses, the second law of thermodynamics states
that the entropy of an isolated system never decreases. Hence,
from this perspective, entropy measurement is thought of as a
kind of clock.
Entropy is equally essential in predicting the extent and
direction of complex chemical reactions. For such
applications, ΔS must be incorporated in an expression
that includes both the system and its surroundings: ΔS_universe =
ΔS_surroundings + ΔS_system.
This expression becomes, via some steps, the Gibbs
free energy equation
for reactants and products in the system: ΔG [the
Gibbs free energy change of the system] = ΔH [the
enthalpy change] − TΔS [the entropy change].[31]
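The "some steps" are short enough to show. This is the standard
textbook derivation at constant temperature T and pressure
(again, nothing here is specific to this essay): the heat the
system releases, -ΔH_system, disperses into the surroundings, so

\Delta S_{\text{surroundings}} = -\frac{\Delta H_{\text{system}}}{T}.

Substituting into ΔS_universe = ΔS_system + ΔS_surroundings ≥ 0
and multiplying through by -T (which flips the inequality) gives

\Delta G = \Delta H - T\,\Delta S \le 0,

so a spontaneous reaction is simply one whose total entropy
change is positive, repackaged as a negative Gibbs free energy
change.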
Von Neumann established a rigorous mathematical framework for
quantum mechanics with his work Mathematische Grundlagen
der Quantenmechanik. He provided in this work a theory of
measurement, where the usual notion of wave
function collapse is described as an irreversible
process (the so-called von Neumann or projective measurement).
Using this concept, in conjunction with the density matrix, he
extended the classical concept of entropy into the quantum
domain.
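For reference, the quantity he arrived at is usually written as
(standard notation; ρ is the density matrix and Tr is the trace):

S(\rho) = -k_B \, \mathrm{Tr}(\rho \ln \rho).

When ρ is diagonal with eigenvalues p_i, this reduces to the
classical Gibbs form -k_B \sum_i p_i \ln p_i, so the quantum
definition contains the classical one as a special case.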
The concept of entropy can be described qualitatively as a
measure of energy dispersal at a specific temperature.[43] Similar
terms have been in use from early in the history of classical
thermodynamics,
and with the development of statistical thermodynamics and
quantum theory,
entropy changes have been described in terms of the mixing or
"spreading" of the total energy of each constituent of a
system over its particular quantized energy levels.
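That "spreading" has a precise statistical form (again, the
standard textbook expression rather than anything specific to
this essay): if p_i is the probability of finding the system in
quantized energy level i, then

S = -k_B \sum_i p_i \ln p_i.

Concentrating the energy in a few levels makes the distribution
sharply peaked and S small; dispersing it over many accessible
levels flattens the distribution and makes S large.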
Ambiguities in the terms disorder and chaos,
which usually have meanings directly opposed to equilibrium,
contribute to widespread confusion and hamper comprehension of
entropy for most students.[44] As
the second
law of thermodynamics shows,
in an isolated
system internal
portions at different temperatures will tend to adjust to a
single uniform temperature and thus produce equilibrium. A
recently developed educational approach avoids ambiguous terms
and describes such spreading out of energy as dispersal, which
leads to loss of the differentials required for work even
though the total energy remains constant in accordance with
the first
law of thermodynamics[45] (compare
discussion in next section). Physical chemist Peter
Atkins,
for example, who previously wrote of dispersal leading to a
disordered state, now writes that "spontaneous changes are
always accompanied by a dispersal of energy".[30][46]
Following on from the above, it is possible (in a thermal
context) to regard entropy as an indicator or measure of the effectiveness or usefulness of
a particular quantity of energy.[47]
The second law of thermodynamics says that, in any spontaneous
process, an increase in total entropy is favored...
The following is a list of additional definitions of entropy
from a collection of textbooks:
- a measure of energy dispersal at a specific temperature.[30]
- a measure of disorder in the universe or of the availability
of the energy in a system to do work.[51]
Free entropy -- an entropic thermodynamic potential analogous
to the free energy.
In a study titled "Natural selection for least action"
published in the Proceedings of The Royal Society A.,
Ville Kaila and Arto Annila of the University of Helsinki
describe
how the second law of thermodynamics can be written as an
equation of motion to describe evolution, showing how natural
selection and the principle of least action can be connected
by expressing natural selection in terms of chemical
thermodynamics. In this view, evolution explores possible
paths to level differences in energy densities and so increase
entropy most rapidly. Thus, an organism serves as an energy
transfer mechanism, and beneficial mutations allow successive
organisms to transfer more energy within their environment.[61]
After all of this and suffering from the lack of an Einstein
brain...
What became clear to me as I creatively synthesized (cynthesis
is the term I coined a decade ago) many systems, stripping
away literal interpretations but using them as bread
crumbs...it flashed clear to me that what was happening when
we were on a Path @F-L-O-W, perhaps being in flow states, or
experiences, @F-L-O-W Levels...was that the energy and
information, along with space-time, were "reordering the
system as an altered state," and the excess energy and
information actually was not usable in that state...but it was
"available" to do other things, such as "exchange" with our
environments, and that environment includes our nutritional,
phenomenological, ontological and epistemological ones, as
well.
In 1982, American biochemist Albert
Lehninger argued
that the "order" produced within cells as they grow and divide
is more than compensated for by the "disorder" they create in
their surroundings in the course of growth and division.
"Living organisms preserve their internal order by taking from
their surroundings free
energy,
in the form of nutrients or sunlight, and returning to their
surroundings an equal amount of energy as heat and
entropy."[59]
What I realized was that we are nothing more than a vehicle in
a quantum state of interchange...
For
nearly a century and a half, beginning with Clausius' 1863
memoir "On the Concentration of Rays of Heat and Light, and on
the Limits of its Action", much writing and research has been
devoted to the relationship between thermodynamic entropy and
the evolution of life.
The argument that life feeds on negative entropy or negentropy was
asserted by physicist Erwin
Schrödinger in
his 1944 book What is Life?
He posed, "How does the living organism avoid decay?" The
obvious answer is: "By eating, drinking, breathing and (in the
case of plants) assimilating." Recent writings have used the
concept of Gibbs free energy to elaborate on this issue.[56]
While energy from nutrients is necessary to sustain an
organism's order, there is also the Schrödinger prescience:
"An organism's astonishing gift of concentrating a stream of
order on itself and thus escaping the decay into atomic chaos
-- of drinking orderliness from a suitable environment --
seems to be connected with the presence of the aperiodic
solids..."
We now know that the 'aperiodic' crystal is DNA and that the
irregular arrangement is a form of information. "The DNA in
the cell nucleus contains the master copy of the software, in
duplicate. This software seems to control by "specifying an
algorithm, or set of instructions, for creating and
maintaining the entire organism containing the cell."[57] DNA
and other macromolecules determine an organism's life cycle:
birth, growth, maturity, decline, and death. Nutrition is
necessary but not sufficient to account for growth in size as
genetics is the governing factor. At some point, organisms
normally decline and die even while remaining in environments
that contain sufficient nutrients to sustain life. The
controlling factor must be internal and not nutrients or
sunlight acting as causal exogenous variables. Organisms
inherit the ability to create unique and complex biological
structures; it is unlikely for those capabilities to be
reinvented or be taught each generation. Therefore DNA must
be operative as the prime cause in this characteristic as
well. Applying Boltzmann's perspective of the second law,
the change of state from a more probable, less ordered and
high entropy arrangement to one of less probability, more
order, and lower entropy seen in biological ordering calls for
a function like that known of DNA. DNA's apparent
information processing function provides a resolution of the
paradox posed by life and the entropy requirement of the
second law.[58]
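Boltzmann's perspective, for reference, is his relation between
entropy and the number W of microscopic arrangements consistent
with a macroscopic state (standard notation):

S = k_B \ln W.

Biological ordering moves a local system to a state of smaller
W, hence lower S; the second law survives because the heat and
waste released raise W, and therefore S, in the surroundings by
more than the local decrease -- which is exactly Lehninger's
point above.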
Because I know what I know before I know how I know it, much
as Einstein knew before he could prove it...and did...
What I saw was patterns, where entropy was, in fact, energy
and information, and that it was/is available to be uphilled
"by" the environment as an exchange, and "WE" are a part of
that environmental portfolio of constituents.
So, without getting any more complex, a simple tool
emerged...(the one I expressed to the "professor" a decade
ago...)