Newsgroups: talk.origins
From: Steve Carlip <sjcar...@ucdavis.edu>
Date: 21 Oct 2001 22:25:28 -0400
Local: Sun, Oct 21 2001 10:25 pm
Subject: Re: Flood dating discrepancies

Jon Fleming <j...@fleming-nospam.com> wrote:
> However, if Setterfield is right and radioactive decay rates
> were high enough in the past to explain SN1987A, Adam
> and Eve and most of their descendants would have been
> toast due to the increased heat of the Sun.

Actually, they'd be icicles.  Let me explain:

It's a little tricky to try to extract conclusions from Setterfield,
because he talks about a bunch of different quantities varying
in different ways (not always self-consistently), and it's hard
to keep track of how they interact.  For example, if radioactive
decay rates were faster in the past but all other rates were faster
as well, you could adjust things so that there was absolutely no
physical effect.  That's why it's important to focus on what
happens to the fundamental dimensionless constants, which
give a precise measure of changes.

Setterfield's current proposal, as gleaned from his Web site,
seems to be the following:
1.  The dimensionless coupling constants associated with non-
gravitational interactions -- in particular, the fine structure
constant a, the electroweak coupling g, and the ``strong
interaction fine structure constant'' a_s -- do not change
with time.  The combination hc, where h is Planck's constant,
also does not change with time.
2.  ``Atomic'' quantities with dimensions of time vary as 1/c,
so decay rates go as c.
3.  Elementary particle masses vary inversely with c^2.
4.  Newton's constant G varies proportionally to c^2, so, for
example, Gm_p remains constant (where m_p is the mass
of the proton).  Combining this with (3), the ``gravitational
fine structure constant'' a_G = G(m_p)^2/hc varies inversely
with c^2.  This is the one dimensionless constant that changes
with time.
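To keep the bookkeeping straight, here is a minimal sketch of
assumptions (1)-(4) written as functions of the ratio r = c_then/c_now,
with everything normalized to today's values (r = 1).  The function
names are just for illustration:

```python
# Setterfield's assumed scalings (points 1-4 above) as functions of
# r = c_then / c_now.  All values are relative to today's (r = 1).

def hc(r):          # point 1: the combination hc is constant
    return 1.0

def decay_rate(r):  # point 2: atomic times go as 1/c, so rates go as c
    return r

def mass(r):        # point 3: particle masses vary as 1/c^2
    return r ** -2

def G(r):           # point 4: Newton's constant varies as c^2
    return r ** 2

def alpha_G(r):     # gravitational fine structure constant G m_p^2 / hc
    return G(r) * mass(r) ** 2 / hc(r)

# With c twice its present value, a_G is a quarter of today's value:
print(alpha_G(2.0))           # 0.25, i.e. a_G goes as 1/c^2
print(G(2.0) * mass(2.0))     # 1.0: G m_p stays constant, as in (4)
```

So the one changing dimensionless number, a_G, drops as c goes up,
which is what drives everything below.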

These assumptions mean that any process that takes place at a
fixed time and depends only on electromagnetism and atomic
and subatomic physics cannot be used to detect any change.
Neither can any purely gravitational process.  Only processes
that involve both gravity and some other physical interaction
are sensitive to the changes Setterfield proposes.

Unfortunately for Setterfield, stars involve such a balance, and
can be used as tests.  Stars are (approximately) stable because
they balance gravitational attraction with gas pressure, which
depends on atomic processes, and they produce energy because
of nuclear processes.  Equilibrium is determined by requiring
that the pressure just balance the gravitational attraction.  This
fixes the central temperature; if it is hot enough to start fusion,
you'll have a star.  The main barrier to fusion is actually not
nuclear, but electromagnetic -- protons repel each other, and
the central pressure and temperature have to be high enough
to overcome this ``Coulomb barrier'' to get nuclei close enough
to fuse.

A nice source of information about how these processes depend
on fundamental constants is Barrow and Tipler's book, _The
Anthropic Cosmological Principle_, especially chapter 5.  One
may argue with the authors' philosophical conclusions -- many
have -- but their description of the basic physics is fine, and
is clearly explained.

Here's the upshot.  The minimum number N of nucleons (protons
and neutrons) needed for a star to ``ignite'' goes as (a/a_G)^{3/2}.
From point (1), Setterfield's a is constant; from point (4), his
a_G goes as 1/c^2.  Hence his N goes as c^3.  The present value
of N is a few percent of the number of nucleons in the Sun.  Thus
in Setterfield's model, an increase in c by a factor of a little more
than 2 will turn off the Sun.  Setterfield has various ``fits'' of the
rate of change of c, but by my reading this would have been about
1000 years ago.  I think it would have been noticed.
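The arithmetic behind that ignition estimate, taking the present
ignition mass to be roughly 10% of the Sun's nucleon count (an assumed
illustrative figure; the post says a few percent, which only pushes the
factor up):

```python
# N_min, the minimum nucleon count for a star to ignite, goes as c^3
# in Setterfield's model.  The 0.1 here is an assumed illustrative
# value for today's N_min as a fraction of the Sun's nucleon count.

present_fraction = 0.1                    # assumed: N_min today / N_sun
growth_needed = 1.0 / present_fraction    # N_min growth that turns off the Sun
c_factor = growth_needed ** (1.0 / 3.0)   # since N_min ~ c^3

print(round(c_factor, 2))   # 2.15 -- an increase in c by a factor of a
                            # little more than 2 shuts the Sun off
```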

You get an even more dramatic dependence on c if you ask about
the energy output, or luminosity, of a star.  This goes as (a_G)^4
(see Barrow and Tipler, section 5.6), or, in Setterfield's model,
c^{-8}.  A 10% change in c would thus cut the luminosity of the
Sun by a factor of two.  Setterfield would have this happen in the
past 600 or 700 years, I think.  Again, it would have been noticed.
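The luminosity arithmetic is just as quick to check:

```python
# L ~ a_G^4 ~ c^{-8} in Setterfield's model, so a modest change in c
# has a dramatic effect on the Sun's output.

c_ratio = 1.1                       # c 10% higher than today
luminosity_ratio = c_ratio ** -8
print(round(luminosity_ratio, 2))   # 0.47 -- roughly half today's luminosity
```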

You run into the same sort of problem if you look at the sizes of
planets.  A planet is in equilibrium when gravitational attraction,
which changes in time according to Setterfield, balances repulsive
forces, which don't.  The radius of a planet depends on quantities
that vary, in Setterfield's model, as (a_G)^{-1/2}(m_e)^{-1},
where m_e is the mass of an electron (see Barrow and Tipler,
section 5.3).  This varies, according to Setterfield, as c^3.  Thus
a 10% decrease in the speed of light -- in the past 700 years! --
would have meant about a 30% decrease in the radius of the Earth.
Again, this would have been noticed.
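And the radius scaling:

```python
# R ~ (a_G)^{-1/2} (m_e)^{-1} ~ c^3 in Setterfield's model, since
# a_G ~ c^{-2} and m_e ~ c^{-2}.

c_ratio = 0.9                        # c dropping by 10%
radius_ratio = c_ratio ** 3
print(round(1 - radius_ratio, 2))    # 0.27 -- close to a 30% shrink in radius
```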

Now, as I said, Setterfield proposes a bunch of different possible
curves for c against time, and the dates I've cited are based on a
quick reading of one of them.  But if you want to fit the history
of the Universe into 10,000 years, you're stuck -- you *need*
to have a speed of light that was, in historical times, much larger
than it is now.  You thus need to explain why nobody noticed the
huge earthquakes as the Earth shrank, why nobody pointed out
how cold it was with such a dim Sun, and why nobody wrote
down a startled account of the Sun first igniting.  And you have
to explain why we see so many Sun-like stars, when at the time
the light we see left them c was too high for such stars to exist.

Steve Carlip