
Good News for Big Bang theory


John (Liberty) Bell
Oct 11, 2006, 3:39:35 AM
The following is a copy of my response to a BBC programme, which made
some exciting (if true) claims about the recent successes of Big Bang
theory. Since these claims went well beyond my own knowledge in this
area, I would appreciate any relevant comments from anyone.

I have just seen the last part of a BBC 2 Horizon programme (broadcast
in Britain late in the morning on 10 Oct. 2006), which relates to a
recent discussion at sci.astro.research. This made several surprising
(to me) assertions, which included:

1) Big Bang theory did, in fact, predict the total mass of the
universe.
[In this context, it was explained, within the programme, (a) that the
total mass of atoms and molecules (by which I presume they meant
baryonic matter) only turns out to account for ~ 4% of that Big Bang
predicted total, and (b) subsequent observations established that
'dark matter' only contributes a further 21% to that predicted
total.]
2) Following the unexpected discovery of an accelerating expansion of
the universe (which I presume refers to the findings of the
multinational high-z supernova search team), the resultant postulated
'dark energy' turned out, remarkably, to have a required equivalent
mass that is precisely that extra 75% that had already been predicted
by Big Bang theory.
3) The resultant now "Standard Model" of astronomy has since been
run on computer simulation and found to model the dynamics, and
observable results of the evolution of the universe remarkably closely.

In the context of (1), since that assertion flatly contradicts prior
comments at sci.astro.research[1], I would appreciate appropriate
references and/or comments.

In the context of (2), this implies that the observed rate of
acceleration of the expansion of the universe is known quite
accurately, which is, again, news to me. Consequently, I would
appreciate the relevant figures (and refs), along with relevant
references for how this is translated into the appropriate required
(repulsive) equivalent mass.

Perhaps of less fundamental importance, in the context of (3), I would
also appreciate clarification of whether that simulation reproduced the
unexpectedly high proportion of very large and old galaxies observed by
the Gemini Deep Deep Survey[2], between 3 and 6 Gyr after the Big Bang,
and the subsequently observed populations of galaxies over the range
0.7 to 0.9 Gyr.[3]

Since I have found in the past that Horizon programmes are generally
informative and, at worst, a little naïve in those areas where I am
reasonably well read, I find it difficult to believe that the BBC made
this whole thing up. Consequently I am sending copies of these
questions to sci.physics.research and sci.astro.research as well as the
BBC, to find out if such exuberant claims for the successes of standard
theory can, in fact, be substantiated.

References:
[1] most recent parts of
http://groups.google.com/group/sci.astro.research/browse_frm/thread/678c232d463567db

[2] http://www.gemini.edu/project/announcements/press/2004-1.html
http://www.ociw.edu/lcirs/gdds.html
http://www.gemini.edu/media/images_2004-1.html
[3] http://www.ucsc.edu/news_events/press_releases/text.asp?pid=939


John Bell
(Change John to Liberty to bypass anti-spam email filter)

Oh No
Oct 11, 2006, 10:02:43 AM
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>

>
>
>1) Big Bang theory did, in fact, predict the total mass of the
>universe.

Not really. Firstly what is predicted is density, not total mass.
Secondly, to make a prediction about mass density one requires
supplemental assumptions or observations on curvature and Lambda. Big
bang models have three cosmological parameters, Omega (mass density),
Omega_k (curvature) and Omega_Lambda (cosmological constant) subject to
the constraint that

Omega + Omega_k + Omega_Lambda = 1.

Omega = 1 is critical density for a flat, no Lambda cosmology. Possibly
this identity sums up the "prediction" to which Horizon referred, but I
would not describe this as a prediction of density, Omega.

Observations appear consistent with Omega_k~0 (a flat universe),
Omega~28%, Omega_Lambda~72%, or thereabouts.
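The arithmetic behind the constraint is trivial but worth making explicit; here is a minimal sketch in Python, using the rounded figures quoted above (illustrative round numbers, not precise measurements):

```python
# Friedmann constraint: Omega + Omega_k + Omega_Lambda = 1.
# The values below are the rounded observational figures quoted above,
# used purely for illustration.
omega_m = 0.28          # total matter density (baryonic + dark)
omega_lambda = 0.72     # cosmological constant term ("dark energy")

# The curvature term is whatever is left over once matter and Lambda
# are accounted for; a flat universe corresponds to Omega_k = 0.
omega_k = 1.0 - omega_m - omega_lambda

print(omega_k)  # ~0, i.e. consistent with a flat universe
```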

>[In this context, it was explained, within the programme, (a) that the
>total mass of atoms and molecules (by which I presume they meant
>baryonic matter) only turns out to account for ~ 4% of that Big Bang
>predicted total, and (b) subsequent observations established that
>'dark matter' only contributes a further 21% to that predicted
>total.]

These are roughly the observed figures compared to Omega=1.

>2) Following the unexpected discovery of an accelerating expansion of
>the universe (which I presume refers to the findings of the
>multinational high-z supernova search team),

This is the principal discovery, although there are a number of
supporting observations.

>the resultant postulated
>'dark energy' turned out, remarkably, to have a required equivalent
>mass that is precisely that extra 75% that had already been predicted
>by Big Bang theory.

Yes, the above relation is satisfied for a flat cosmology. As you know,
however, I think there is a misinterpretation of red shift in the
standard model. The teleconnection gives a better fit to SN data with
Omega=1.89, Omega_k=-0.89 (closed finite universe) and Omega_Lambda=0
(no cosmological constant or dark energy).

>3) The resultant now "Standard Model" of astronomy has since been
>run on computer simulation and found to model the dynamics, and
>observable results of the evolution of the universe remarkably closely.

Not entirely.


>
>In the context of (1), since that assertion flatly contradicts prior
>comments at sci.astro.research[1], I would appreciate appropriate
>references and/or comments.

I would refer you to two review papers in Nature:

Glazebrook K. et al., 2004, Nature, 430, 181-184.
http://www.pha.jhu.edu/~kgb/MiscPub/gdds-paperiii-nature.pdf

Cimatti A. et al., 2004, Old Galaxies in the Young Universe, Nature, 430,
184-188. astro-ph/0407131

As described by Glazebrook (2004), there is poor agreement between
current theoretical models of galaxy evolution and empirical data. To
explain this it has been suggested (e.g. Cimatti et. al, 2004) that the
theoretical models may be inaccurate.


>
>In the context of (2), this implies that the observed rate of
>acceleration of the expansion of the universe is known quite
>accurately, which is, again, news to me. Consequently, I would
>appreciate the relevant figures (and refs), along with relevant
>references for how this is translated into the appropriate required
>(repulsive) equivalent mass.
>
>Perhaps of less fundamental importance, in the context of (3), I would
>also appreciate clarification of whether that simulation reproduced the
>unexpectedly high proportion of very large and old galaxies observed by
>the Gemini Deep Deep Survey[2], between 3 and 6 Gyr after the Big Bang,
>and the subsequently observed populations of galaxies over the range
>0.7 to 0.9 Gyr.[3]

I think not. At least some of the authors of those papers were involved
with Gemini.


>
>Since I have found in the past that Horizon programmes are generally
>informative and, at worst, a little naïve in those areas where I am
>reasonably well read, I find it difficult to believe that the BBC made
>this whole thing up.

Not made up. Perhaps a little distorted. The general attitude to galaxy
evolution is that data is preliminary and evolution models are
questionable. No one is taking them too seriously just yet.


Regards

--
Charles Francis
substitute charles for NotI to email

Phillip Helbig---remove CLOTHES to reply
Oct 11, 2006, 2:40:56 PM
In article <mt2.0-22539...@hercules.herts.ac.uk>, "John
(Liberty) Bell" <john...@accelerators.co.uk> writes:

> The following is a copy of my response to a BBC programme, which made
> some exciting (if true) claims about the recent successes of Big Bang
> theory. Since these claims went well beyond my own knowledge in this
> area, I would appreciate any relevant comments from anyone.

In general, be aware that television science is not always completely
accurate. Maybe the presenter doesn't know better, maybe he does but
dumbs it down. Also, be aware that "big bang theory" means different
things to different people. Actually, it means "the universe is
expanding from a previous state which was much smaller and much denser";
many folks add one or more additional things to this definition. Here
are my guesses as to what was meant:

> 1) Big Bang theory did, in fact, predict the total mass of the
> universe.
> [In this context, it was explained, within the programme, (a) that the
> total mass of atoms and molecules (by which I presume they meant
> baryonic matter) only turns out to account for ~ 4% of that Big Bang
> predicted total, and (b) subsequent observations established that
> 'dark matter' only contributes a further 21% to that predicted
> total.]

This is too vague to judge. The baryonic fraction can be predicted
quite well, but, at least as stated, the amount of dark matter cannot
be predicted. It can be observed, indirectly, but that is something
different.

> 2) Following the unexpected discovery

It wasn't unexpected by everyone.

> of an accelerating expansion of
> the universe (which I presume refers to the findings of the
> multinational high-z supernova search team), the resultant postulated
> 'dark energy' turned out, remarkably, to have a required equivalent
> mass that is precisely that extra 75% that had already been predicted
> by Big Bang theory.

This probably means that "inflation" "predicted" a flat universe.
Indeed, all the components (baryons, dark matter, "dark energy") do
indeed add up quite nicely to make the universe at least approximately
flat.

> 3) The resultant now "Standard Model" of astronomy has since been
> run on computer simulation and found to model the dynamics, and
> observable results of the evolution of the universe remarkably closely.

That is, use these parameters as the "background" for numerical
simulations and find that the results agree with observations.
Actually, this is not "remarkable", since it is what one expects if one
has the correct model and the correct input parameters.

> In the context of (1), since that assertion flatly contradicts prior
> comments at sci.astro.research[1],

The statement above is too vague to say that it contradicts prior
comments here (and of course some prior comments here contradict
others).

> In the context of (2), this implies that the observed rate of
> acceleration of the expansion of the universe is known quite
> accurately,

It is known to 10% or so. However, what is usually "measured" is
Omega_x, which is actually independent of the rate of expansion. The
rate of expansion (the Hubble constant) is needed to translate an Omega
value into a physical density, but is not needed when discussing
relative amounts.
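The translation he mentions goes via the critical density, rho_c = 3 H0^2 / (8 pi G). A quick sketch (the value H0 = 70 km/s/Mpc is assumed here purely for illustration):

```python
import math

# Critical density of the universe: rho_c = 3 * H0^2 / (8 * pi * G).
# An Omega value is a density expressed as a fraction of rho_c, which is
# why the Hubble constant is needed to recover a physical density.
G = 6.674e-11                 # Newton's constant, m^3 kg^-1 s^-2
H0 = 70 * 1e3 / 3.086e22      # 70 km/s/Mpc (assumed value) converted to 1/s

rho_c = 3 * H0**2 / (8 * math.pi * G)
print(rho_c)  # ~9e-27 kg/m^3, a few hydrogen atoms per cubic metre
```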

rlold...@amherst.edu
Oct 12, 2006, 1:02:51 PM
John (Liberty) Bell wrote:
> The following is a copy of my response to a BBC programme, which made
> some exciting (if true) claims about the recent successes of Big Bang
> theory. Since these claims went well beyond my own knowledge in this
> area, I would appreciate any relevant comments from anyone.
>

One of the most persistent and disturbing problems found in discussions
of the standard Big Bang cosmological model concerns the blurring of
distinctions between true predictions and mere retrodictions.

The background radiation, global expansion, abundances of light
elements, large-scale homogeneity, etc. are often cited as successful
"predictions". However, when one does a more thorough search of the
scientific literature, one finds that most of the claimed "predictions"
were in fact retrodictions, i.e., after-the-fact explanations of
already discovered facts or approximate results.
The few genuine predictions were often considerably off the mark, and
had to be adjusted, often more than once, as in the case of the
temperature of the microwave background, the level of fluctuations in
the background, and the scale at which "homogeneity" would be found.

The Big Bang model is an exercise in model-building, wherein one
tinkers with the physics and the adjustable parameters until it
reproduces existing observations. To be fair, it does a pretty good
job of modelling the general properties of the observable universe, but
it is, and has always been, rather plastic.

The Big Bang model did not predict or even anticipate the existence of
the dark matter that dominates the observable universe. Its primary
retrodiction for the dark matter, found in nearly all discussions of
the standard Big Bang model and "precision cosmology", is that it is in
the form of hypothetical CDM particles, like axions. After decades of
searching, these hypothetical particles have still not been detected.
Stellar-mass dark matter, which contradicts the Big Bang retrodiction,
appears to have been observed by numerous groups in differing
observational experiments, but this scientific evidence is
downplayed by many in the cosmological community, perhaps because it is
an unwanted result that does not fit in well with the Big Bang model.

The Big Bang model has difficulties with respect to explanations for
why galaxies exist at all, how galaxies form, the existence and nature
of the dark matter, and the succession of ever-larger-scale deviations
from homogeneity as dependable observations have reached larger scales.
One could go on at length, but you get the picture.

Perhaps the most important thing in terms of the future of science is
this unfortunate and apparently growing neglect of the crucial
distinction between true predictions and mere retrodictions. If
scientists, through ignorance or bias, fail to protect the very
special status of true predictions in science, and carefully identify
retrodictions as nothing more than consistency checks, then science is
in very deep trouble.

Robert Oldershaw

Oh No
Oct 13, 2006, 4:22:07 AM
Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>The Big Bang model did not predict or even anticipate the existence of
>the dark matter that dominates the observable universe. Its primary
>retrodiction for the dark matter, found in nearly all discussions of
>the standard Big Bang model and "precision cosmology", is that it is in
>the form of hypothetical CDM particles, like axions. After decades of
>searching, these hypothetical particles have still not been detected.

Not only that, but there is no theory for them in elementary particle
physics, where other consistency checks allow only those particles
which we actually do detect. These exotic particles are still
theoretically impossible, as well as being undetected.


>Stellar-mass dark matter, which contradicts the Big Bang retrodiction,
>appears to have been observed by numerous groups in differing
>observational experiments, but this scientific evidence is
>downplayed by many in the cosmological community, perhaps because it is
>an unwanted result that does not fit in well with the Big Bang model.

Yes. One should be honest, both CDM and the cosmological constant are
fixes, brought about because observation seems to demand them. However,
if one studies general relativity properly it is also apparent that the
overall structure, including the big bang itself is pretty well forced
upon us from simple, undeniable assumptions. There is, imv, only one
place in which it can reasonably be altered, namely the affine
connection is suspect. Moreover there is a pressing theoretical reason
why it should be changed, namely that general relativity is not, in its
present form, compatible with quantum theory. That explains the
motivation for my research in the teleconnection. The teleconnection
does not alter the overall cosmological structure, which includes a big
bang. But it does alter the interpretation of observations, such that
neither CDM nor Lambda are required and it restores the original
preferred model of a closed finite universe with a big crunch.


>
>The Big Bang model has difficulties with respect to explanations for
>why galaxies exist at all, how galaxies form,

Well, it has problems explaining how galaxies formed in the available
timescale. Again, that is resolved by the teleconnection, in which
redshift is greater at a given distance and the universe expands more
slowly.

>Perhaps the most important thing in terms of the future of science is
>this unfortunate and apparently growing neglect of the crucial
>distinction between true predictions and mere retrodictions. If
>scientists, through ignorance or bias, fail to protect the very
>special status of true predictions in science, and carefully identify
>retrodictions as nothing more than consistency checks, then science is
>in very deep trouble.
>

I do not entirely disagree with this. However, there are some
consistency checks which are of a very rigorous nature, and I do not
think it reasonable to call them "mere". For example big bang
nucleosynthesis gives a very precise figure for the proton-neutron
balance, depending on the rate of expansion of the early universe. This
is based on very well understood processes in particle physics, and it
is essential that the observed balance is consistent with Hubble's
constant.
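The nucleosynthesis argument can be illustrated with the textbook approximation relating the frozen-out neutron-to-proton ratio to the primordial helium mass fraction (the n/p value below is an assumed round number, not a figure quoted in this thread):

```python
# If essentially all surviving neutrons end up bound in He-4, the helium
# mass fraction is Y_p = 2(n/p) / (1 + n/p). A ratio n/p ~ 1/7 after
# freeze-out and neutron decay (assumed round value) gives Y_p ~ 0.25,
# close to the observed primordial helium abundance.
n_over_p = 1.0 / 7.0
Y_p = 2 * n_over_p / (1 + n_over_p)
print(Y_p)  # ~0.25
```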

rlold...@amherst.edu
Oct 13, 2006, 5:06:26 PM
Oh No wrote:
> Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>
> >Perhaps the most important thing in terms of the future of science is
> >this unfortunate and apparently growing neglect of the crucial
> >distinction between true predictions and mere retrodictions. If
> >scientists, through ignorance or bias, fail to protect the very
> >special status of true predictions in science, and carefully identify
> >retrodictions as nothing more than consistency checks, then science is
> >in very deep trouble.
> >
> I do not entirely disagree with this. However, there are some
> consistency checks which are of a very rigorous nature, and I do not
> think it reasonable to call them "mere". For example big bang
> nucleosynthesis gives a very precise figure for the proton-neutron
> balance, depending on the rate of expansion of the early universe. This
> is based on very well understood processes in particle physics, and it
> is essential that the observed balance is consistent with Hubble's
> constant.

Firstly, let me assure group members that I think the evidence for
global expansion of the observable universe, indicating some form of
explosive event in our little corner of the Universe, is quite strong.
Therefore, I am in basic agreement with the standard Big Bang paradigm
as a first approximation to what is occurring locally.

However, I think we are very close to discovering a new and far more
encompassing paradigm, which subsumes the Big Bang paradigm and offers
a more coherent understanding of what we observe from the smallest of
elementary particles to pulsars and stars to galaxies and to the
limited portion of the metagalaxy that is observable. I think we are
moving slowly and somewhat chaotically towards a discrete, unbounded
(in space, time and scale), fractal paradigm for nature.

But here is the main point: we have many competing ideas for new ways
to understand nature. How are we to decide among infinite fractal
paradigms, teleconnections, landscapes, cosmic strings, etc.? The good
news is that there is a bona fide scientific way to do this, if we are
willing to stick to the principles upon which science was founded.

The correct path forward can be unambiguously determined by the answer
to the dark matter enigma. Here is how.

1. The dark matter component is such an important component of the
observable universe that any theory, model or paradigm that did not
anticipate it is immediately suspect and identified as seriously
incomplete.

2. Any theory, model or paradigm that expects to be treated as more than
abstract arm-waving must predict or retrodict a specific candidate for
the dark matter. If any paradigm/theory cannot come up with its own
unique and definitive candidate solution to the dark matter enigma,
then I submit that the paradigm/theory is of little use to the science
of nature. It may be a fun and challenging exercise in abstract
thought, but it is nothing more.

3. Finally, we let observations decide which prediction/retrodiction is
the most accurate, and the paradigm associated with that
prediction/retrodiction will have been revealed as the correct path
forward.

Does your teleconnection model lead to definitive predictions about the
dark matter?

The discrete fractal paradigm, decades ago, said that the dark matter
must exist and this has been strongly supported. This paradigm
definitively predicted that the dark matter is in the form of
Kerr-Newman black holes with major peaks at 8x10^-5, 0.15 and 0.58
solar masses. Tentative microlensing results are consistent with these
predictions.

If people believe strongly in competing paradigms, let us see them come
forward in a scientific manner, and put their competing predictions in
writing before the community of scientists. And let us see them accept
nature's verdict when that becomes known. If the discrete fractal
paradigm fails the dark matter test, I will readily admit it.

[Mod. note: can I remind posters again that it is not enough to assert
that their favourite model explains all the mysteries of the universe:
it's necessary to provide references, preferably to published
peer-reviewed work in a mainstream physics journal, or at least some
attempt at justification for your assertions that can be evaluated on
its merits. If you can't do this, please take your postings to
alt.sci.physics.new-theories. I am letting this through because most
of it is discussion rather than unsupported assertion, but further
posting along these lines is discouraged -- mjh]

Oh No
Oct 14, 2006, 4:22:17 PM
Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>The correct path forward can be unambiguously determined by the answer
>to the dark matter enigma. Here is how.
>
>1. The dark matter component is such an important component of the
>observable universe that any theory, model or paradigm that did not
>anticipate it is immediately suspect and identified as seriously
>incomplete.

One should distinguish different forms of dark matter. There is
conventional dark matter, for example some (the exact proportion is not
known) baryonic matter is dark. Neutrinoes are hot dark matter, and
account for an unknown proportion, in accordance with there mass.
Primordial black holes, if they exist, are not counted as baryonic, but
may again account for an unknown fraction. But the standard model also
requires Cold Dark Matter, which may include primordial black holes, but
certainly does not consist entirely of primordial black holes or they
would have been detected with microlensing. In addition, the standard
model requires that cold dark matter haloes have a particular profile to
account for galaxy rotation curves. What bugs me slightly is that I
think this particular prediction actually falsifies the standard model.
The profile is not found from either lensing or from evolution models,
and so far independent tests of CDM, in particular local tests, e.g. in
globular clusters, always seem to come up with zilch.

>Does your teleconnection model lead to definitive predictions about the
>dark matter?

Yes. The background to the teleconnection model is described in detail
in gr-qc/0508077, and the predictions are found in gr-qc/0604047. The
model is essentially standard Friedmann-Robertson-Walker, but
cosmological redshift is reinterpreted such that it has half the
standard expansion rate, so requires 1/4 of the critical density for
closure. This means that baryonic matter can form 10-20% of its mass,
and the remainder can be neutrinos, so that no cold dark matter is
required, although I have no particular objection to primordial black
holes. The model explains galaxy rotation curves without requiring CDM
haloes.

>The discrete fractal paradigm, decades ago, said that the dark matter
>must exist and this has been strongly supported.

We know a lot of matter is dark in all models. Even within the last few
years, hosts of new stars were found in the ultraviolet. There may be an
undisclosed number of brown dwarves. The real issue is whether exotic
dark matter is required, stuff which has no place in particle physics.

> This paradigm
>definitively predicted that the dark matter is in the form of
>Kerr-Newman black holes with major peaks at 8x10^-5, 0.15 and 0.58
>solar masses. Tentative microlensing results are consistent with these
>predictions.

As I say, I have no objection in principle, to the potential existence
of primordial black holes, but we know from microlensing there are not
enough of them to account for the cold dark matter in the standard
model, and nor do galaxy profiles obey the laws which would be expected
of conventional gravitating bodies. Some more exotic solution seems to
be required.

rlold...@amherst.edu
Oct 15, 2006, 9:37:00 AM
Oh No wrote:
>> >Does your teleconnection model lead to definitive predictions about the
> >dark matter?
>
> Yes. The background to the teleconnection model is described in detail
> in gr-qc/0508077, and the predictions are found in gr-qc/0604047. The
> model is essentially standard Friedmann-Robertson-Walker, but
> cosmological redshift is reinterpreted such that it has half the
> standard expansion rate, so requires 1/4 of the critical density for
> closure. This means that baryonic matter can form 10-20% of its mass,
> and the remaining can be neutrinos, so that no cold dark matter is
> required, although I have no particular objection to primordial black
> holes. The model explains galaxy rotation curves without requiring CDM
> haloes.
>

Do I understand correctly that, regarding the dark matter issue, the
teleconnection model says that the critical density is 1/4 of the usual
value and thus much less dark matter is required? You seem to imply
that the additional matter could be neutrinos (80-90%!), CDM,
primordial black holes, etc. Do I misunderstand, or does the
teleconnection model not make a specific prediction about the detailed
nature of the dark matter?


> >This paradigm (discrete fractal paradigm)


> >definitively predicted that the dark matter is in the form of
> >Kerr-Newman black holes with major peaks at 8x10^-5, 0.15 and 0.58
> >solar masses. Tentative microlensing results are consistent with these
> >predictions.

The original prediction was made in ApJ 322, 34-36, 1987.

A discussion of observational results and how they do or do not agree
with the predictions of various paradigms can be found at Fractals 10,
27-38, March 2002. An easy way to get a look at this paper is to go to
my website at www.amherst.edu/~rloldershaw, then click on "selected
papers", and then choose paper #5. The graph summarizing the
observational data says it all!

> As I say, I have no objection in principle, to the potential existence
> of primordial black holes, but we know from microlensing there are not
> enough of them to account for the cold dark matter in the standard
> model, and nor do galaxy profiles obey the laws which would be expected
> of conventional gravitating bodies. Some more exotic solution seems to
> be required.
>

I think we are probably in for a lot of surprises. The more we press
our theories, models and paradigms for specific, definitive
predictions, the more we will learn when the observational data becomes
available. Eschew plasticity, embrace principles!

Rob

Oh No
Oct 15, 2006, 5:30:30 PM
Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>Oh No wrote:
>>> >Does your teleconnection model lead to definitive predictions about the
>> >dark matter?
>>
>> Yes. The background to the teleconnection model is described in detail
>> in gr-qc/0508077, and the predictions are found in gr-qc/0604047. The
>> model is essentially standard Friedmann-Robertson-Walker, but
>> cosmological redshift is reinterpreted such that it has half the
>> standard expansion rate, so requires 1/4 of the critical density for
>> closure. This means that baryonic matter can form 10-20% of its mass,
>> and the remaining can be neutrinos, so that no cold dark matter is
>> required, although I have no particular objection to primordial black
>> holes. The model explains galaxy rotation curves without requiring CDM
>> haloes.
>>
>
>Do I understand correctly that, regarding the dark matter issue, the
>teleconnection model says that the critical density is 1/4 of the usual
>value and thus much less dark matter is required?

That's right.

> You seem to imply
>that the additional matter could be neutrinos (80-90%!), CDM,
>primordial black holes, etc. Do I misunderstand, or does the
>teleconnection model not make a specific prediction about the detailed
>nature of the dark matter?

No specific prediction, except for remarking that exotic dark matter is
not required. Otherwise dark matter should take forms allowed by
conventional physics: protostars, dwarfs, neutrinos, the normal kinds of
things which we expect to be dark.

rlold...@amherst.edu
Oct 16, 2006, 4:03:59 AM
Oh No wrote:
> >>
> >
> >Do I understand correctly that, regarding the dark matter issue, the
> >teleconnection model says that the critical density is 1/4 of the usual
> >value and thus much less dark matter is required?
>
> That's right.
>
> > You seem to imply
> >that the additional matter could be neutrinos (80-90%!), CDM,
> >primordial black holes, etc. Do I misunderstand, or does the
> >teleconnection model not make a specific prediction about the detailed
> >nature of the dark matter?
>
> No specific prediction, except for remarking that exotic dark matter is
> not required. Otherwise dark matter should take forms allowed by
> conventional physics, protostars, dwarfs, neutrinoes, normal kinds of
> thing which we expect to be dark.
>

Ok, now I understand your position on the dark matter better.

The dark matter issue is one of the most important issues facing
astrophysicists today, but it is certainly not the only one. I can
envision cosmological hypotheses that do not specifically make
predictions about the dark matter, but do make testable predictions
about other important issues.

Ideally scientific theories lead to definitive predictions, which are
prior to observational answers, are of fundamental importance, are
unique to the theory, and are non-adjustable. A theory that cannot
make such definitive predictions is in a pre-scientific speculative
stage. There is nothing wrong with this speculation, but without a way
to adequately test the speculation, it should not be confused with, or
conflated with, mature science.

If the teleconnection model naturally leads to definitive predictions,
I would like to hear more about them.

If others who study the field of cosmology and/or read posts in this
news group would like to identify definitive predictions made by other
theories, I would very much like to hear your examples, especially if
they apply to the dark matter. If we have trouble identifying such
definitive predictions, does that tell us something quite
uncomfortable, but important to acknowledge, about the state of
astrophysics?

Oh No
Oct 16, 2006, 6:10:11 AM
Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>Ideally scientific theories lead to definitive predictions, which are
>prior to observational answers, are of fundamental importance, are
>unique to the theory, and are non-adjustable. A theory that cannot
>make such definitive predictions is in a pre-scientific speculative
>stage. There is nothing wrong with this speculation, but without a way
>to adequately test the speculation, it should not be confused with, or
>conflated with, mature science.
>
>If the teleconnection model naturally leads to definitive predictions,
>I would like to hear more about them.

The detail is given in gr-qc/0604047, Does a Teleconnection between
Quantum States account for Missing Mass, Galaxy Ageing, Lensing
Anomalies, Supernova Redshift, MOND, and Pioneer Blueshift?


The central predictive result of the model is that cosmological redshift
goes as the square of the expansion parameter.

1+z = a^2(t)/a0^2

which replaces the usual linear relation. In other respects, classical
general relativity is obeyed and this is a model obeying a standard FRW
cosmology. The redshift relation is consistent with observation for a
universe expanding at half the currently accepted rate, which is thus
twice as old as a standard model with the same cosmological parameters,
and requires 1/4 of the critical density for closure. There follow
directly testable revisions to the distance-redshift relation, and to
the age-redshift relation.
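The contrast with the standard relation can be made concrete in a short numerical sketch. This is a hypothetical illustration only: both relations are written here in the conventional orientation 1+z = a0/a(t_emit), which is one reading of the notation above.

```python
def scale_factor_standard(z):
    """Standard FRW redshift relation: 1 + z = a0 / a(t_emit)."""
    return 1.0 / (1.0 + z)

def scale_factor_squared(z):
    """Quadratic relation proposed above: 1 + z = (a0 / a(t_emit))^2."""
    return (1.0 + z) ** -0.5

# For a source observed at z = 3, the standard relation says the universe
# has expanded by a factor of 4 since emission; the quadratic relation
# says it has expanded only by a factor of 2 -- less expansion for the
# same redshift, consistent with the slower, older universe described above.
print(scale_factor_standard(3.0))  # 0.25
print(scale_factor_squared(3.0))   # 0.5
```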

Testing so far is of course retrodiction, in that I have only been able
to test against observations already made. However, most of these tests
will be done with much more extensive observations over the next 15 or
so years, and the predictions should by that time clearly differentiate
this from the standard model. The model has fewer adjustable parameters
than standard, because I am not using CDM or a cosmological constant (a
cosmological constant is possible, but theoretical prejudice would
prefer not to have it).

The result of supernova analysis is that the magnitude-redshift relation
produces a fit marginally better than standard for a closed, zero Lambda
universe with density Omega=1.89 (after rescaling Omega so that the
critical density is Omega=1). Using 225 supernovae from the Riess and Astier data
sets I have a best fit chi^2 value of 210.8, compared to 212.5 for the
concordance model. The curves are very close - less than 0.1 magnitude
up to z=1.5, and a large number of observations around z=2 would be
needed to get a statistically significant test. This should come from
the SNAP mission.
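As a quick sanity check on the quoted fit quality, one can compare chi^2 per supernova. This is only a crude proxy, since the post does not state the number of fitted parameters in each model.

```python
# Numbers quoted above: 225 supernovae from the Riess and Astier sets.
n_sne = 225
chi2_model = 210.8        # teleconnection best fit
chi2_concordance = 212.5  # concordance model

chi2_per_sn_model = chi2_model / n_sne
chi2_per_sn_concordance = chi2_concordance / n_sne

# Both come out near 0.94 per data point: the two curves fit current
# data essentially equally well, which is why a statistically
# significant test must wait for many observations near z ~ 2.
print(round(chi2_per_sn_model, 3), round(chi2_per_sn_concordance, 3))
```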

The model has it that red galaxies are very much older than standard -
not only is the universe older, but we are not looking so far back in
time at high red shift. The standard model is already distressed by the
problem of galaxy ageing, but the model makes a definite prediction that
the next generation of very large telescopes will reveal large numbers
of mature galaxies at high redshifts.

The model predicts that galaxy mass profiles will broadly follow the
visible profiles. There are a number of programmes studying lensing
profiles which show this to be the case, and which show inconsistencies
in CDM models.

The model predicts (retrodicts) the MONDian term in galaxy rotation
curves. However it ascribes this to a behaviour of redshift affecting
the interpretation of Doppler information, and preserves Newtonian
Dynamics. This will become a prediction at the point when astronomical
measurement of the Milky Way becomes accurate enough for direct
measurement of stellar motions. At the moment the only anomalous
measurement I have found is in Hipparcos parallax measurements of a few
star clusters (notably the Pleiades). The analysis is currently beyond
me, but I have ascertained that the anomaly is of the order of magnitude
(~1 mas) at which differences in motions predicted by the model can be
detected. The Gaia mission should produce data which resolve this.

I do not know whether the anomalous motion of IM Pegasi in VLBI
measurements can be attributed to the model. Full data is not due to be
released until next year. Detection of possible anomalous Doppler shifts
in motions of planets in the solar system requires an order of magnitude
improvement in the resolution of the best current echelle spectrometers.

The model retrodicts the anomalous shift in Doppler data from Pioneer.
The analysis is subtle, so there may be room for error, but there is a
prediction that this anomalous shift would be removed if direct
measurement of position was also available. It is possible that more
will be revealed by planetary flybys or by further analysis of old
Pioneer data, but I think a dedicated mission to test the anomaly is
strictly required.

rlold...@amherst.edu

unread,
Oct 16, 2006, 1:02:23 PM10/16/06
to
Oh No wrote:
>
> The central predictive result of the model is that cosmological redshift
> goes as the square of the expansion parameter.
>
> 1+z = a^2(t)/a0^2
>
> which replaces the usual linear relation. In other respects, classical
> general relativity is obeyed and this is a model obeying a standard FRW
> cosmology. The redshift relation is consistent with observation for a
> universe expanding at half the currently accepted rate, which is thus
> twice as old as a standard model with the same cosmological parameters,
> and requires 1/4 of the critical density for closure. There follow
> directly testable revisions to the distance-redshift relation, and to
> the age-redshift relation.
>
>
> The model has it that red galaxies are very much older than standard -
> not only is the universe older, but we are not looking so far back in
> time at high red shift. The standard model is already distressed by the
> problem of galaxy ageing, but the model makes a definite prediction that
> the next generation of very large telescopes will reveal large numbers
> of mature galaxies at high redshifts.
>
> The model predicts that galaxy mass profiles will broadly follow the
> visible profiles. There are a number of programmes studying lensing
> profiles which show this to be the case, and which show inconsistencies
> in CDM models.
>
> The model predicts (retrodicts) the MONDian term in galaxy rotation
> curves. However it ascribes this to a behaviour of redshift affecting
> the interpretation of Doppler information, and preserves Newtonian
> Dynamics. This will become a prediction at the point when astronomical
> measurement of the Milky Way becomes accurate enough for direct
> measurement of stellar motions.

>
> The model retrodicts the anomalous shift in Doppler data from Pioneer.
> The analysis is subtle, so there may be room for error, but there is a
> prediction that this anomalous shift would be removed if direct
> measurement of position was also available. It is possible that more
> will be revealed by planetary flybys or by further analysis of old
> Pioneer data, but I think a dedicated mission to test the anomaly is
> strictly required.


Well, there can be no doubt that the teleconnection model has taken
that important first step: making definitive predictions. I applaud
your willingness to make clear testable predictions.

The main predictions of revised redshift relations would seem to be
testable within the near future. This is also the case for your
prediction of large numbers of mature galaxies at large redshifts. The
last 3 predictions may be more difficult to prove one way or the other,
but in principle they are valid predictions/retrodictions.

Now all we need are the observational data.

Turning to proponents of the standard Big Bang paradigm, are there
definitive predictions that this paradigm makes by which we could put
it to the acid test, so to speak? If the dark matter were found not to
be in the form of CDM, would that indicate a serious problem with the
paradigm? Are there other tests that would verify the robustness of the
paradigm, or clearly show us that its limitations have been reached?

Rob

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 17, 2006, 7:47:08 AM10/17/06
to
In article <mt2.0-22825...@hercules.herts.ac.uk>,
"rlold...@amherst.edu" <rlold...@amherst.edu> writes:

> One of the most persistent and disturbing problems found in discussions
> of the standard Big Bang cosmological model concerns the blurring of
> distinctions between true predictions and mere retrodictions.

Also, what is included in the term "big bang" is variable.

> The background radiation, global expansion, abundances of light
> elements, large-scale homogeneity, etc. are often cited as successful
> "predictions".

Prediction, prior observation, prediction, assumption later verified by
observation.

> The Big Bang model has difficulties with respect to explanations for
> why galaxies exist at all, how galaxies form, the existence and nature
> of the dark matter, and the succession of ever-larger-scale deviations
> from homogenity as dependable observations have reached larger scales.
> One could go on at length, but you get the picture.

Galaxy formation is not really a central tenet of the big bang. In
other words, we need to distinguish between "the universe is expanding
from a former state which was much hotter and much denser" and "we
understand everything in the universe". In particular, not
understanding galaxy formation doesn't imply that there is any reason at
all to doubt the big bang in the narrower sense of the term. There is
also nothing in principle wrong with the fact that we don't completely
understand galaxy formation---it just means that there is more work to
do. New species of animals are being discovered all the time. That
doesn't mean that zoology is somehow fundamentally flawed or that the
discovery of new animals requires a radical reformulation of zoology.
Science is a way of thinking, not a collection of facts.

Joseph Lazio

unread,
Oct 17, 2006, 7:56:50 AM10/17/06
to
>>>>> "re" == rloldershaw@amherst edu <rlold...@amherst.edu> writes:

re> One of the most persistent and disturbing problems found in
re> discussions of the standard Big Bang cosmological model concerns
re> the blurring of distinctions between true predictions and mere
re> retrodictions.

re> The background radiation, global expansion, abundances of light
re> elements, large-scale homogeneity, etc. are often cited as
re> successful "predictions". However, when one does a more thorough
re> search of the scientific literature, one finds that most of the
re> claimed "predictions" were in fact retrodictions, i.e.,
re> after-the-fact explanations of already discovered facts or
re> approximate results. The few genuine predictions were often
re> considerably off the mark, and had to be adjusted, often more than
re> once, as in the case of the temperature of the microwave
re> background, the level of fluctuations in the background, and the
re> scale at which "homogeneity" would be found.
[...]
re> The Big Bang model did not predict or even anticipate the
re> existence of the dark matter that dominates the observable
re> universe.

This is an oft-repeated claim, but one that doesn't make a lot of
sense.

Even fairly basic descriptions of the Big Bang model explain that
there are three possibilities (assuming that the cosmological constant
is 0):
1. The matter density in the Universe is high enough that the
expansion eventually slows and reverses (leading to the "Big
Crunch"). Such a universe is termed "closed."
2. The matter density in the Universe is not high enough to reverse
the expansion. The Universe continues to expand forever. Such a
universe is termed "open."
3. The matter density in the Universe is at the critical value so that
the expansion ceases only after an infinite amount of time.

The Big Bang model (and general relativity from which it is derived)
does not predict the matter density of the Universe, regarding that as a
parameter to be determined from observation. Moreover, there is no
requirement for cases 1--3 that the matter be luminous (i.e., that it
interacts via the electromagnetic force). All that is required is
that it interact gravitationally, so dark matter, luminous matter, or
both are allowed.
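The three cases above can be written as a function of the density parameter Omega = rho / rho_crit, where rho_crit = 3 H^2 / (8 pi G). A minimal sketch; the H0 value below is illustrative, not taken from the post:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def critical_density(H0_km_s_Mpc):
    """rho_crit = 3 H^2 / (8 pi G), with H0 given in km/s/Mpc."""
    H0 = H0_km_s_Mpc * 1000.0 / 3.086e22  # convert to 1/s
    return 3.0 * H0**2 / (8.0 * math.pi * G)

def geometry(omega_m, tol=1e-9):
    """Classify a Lambda = 0 FRW universe by its matter density parameter."""
    if omega_m > 1.0 + tol:
        return "closed"   # case 1: expansion eventually reverses
    if omega_m < 1.0 - tol:
        return "open"     # case 2: expands forever
    return "flat"         # case 3: exactly critical density

print(critical_density(70.0))  # ~9.2e-27 kg/m^3 for H0 = 70 km/s/Mpc
print(geometry(1.89))          # the closed fit quoted earlier in the thread
```

Note that nothing in this classification cares whether the mass is luminous, which is Lazio's point: dark and luminous matter enter only through their combined density.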

A robust analogy is to the trajectory of an object. Consider a planet of
mass M and radius R (assumed spherical and without an atmosphere). We
can predict, with considerable confidence from Newton's Law of
Universal Gravitation, that the acceleration due to gravity at the
surface of such a planet is a_g = GM/R^2. Does an object of initial
velocity v fall back to the surface of this planet?
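The rhetorical question has a definite answer once v is specified, which is exactly the point: the theory predicts the trajectory without predicting v itself. A minimal sketch (the Earth-like numbers are illustrative only; rotation and drag are ignored):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def surface_gravity(M, R):
    """a_g = G M / R^2, as in the analogy above."""
    return G * M / R**2

def falls_back(M, R, v):
    """True if a radially launched object returns: v below escape speed."""
    v_escape = math.sqrt(2.0 * G * M / R)
    return v < v_escape

M, R = 5.97e24, 6.371e6  # roughly Earth's mass (kg) and radius (m)
print(surface_gravity(M, R))    # ~9.8 m/s^2
print(falls_back(M, R, 5.0e3))  # True: 5 km/s is below ~11.2 km/s escape speed
print(falls_back(M, R, 1.2e4))  # False: 12 km/s exceeds escape speed
```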

--
Lt. Lazio, HTML police | e-mail: jla...@patriot.net
No means no, stop rape. | http://patriot.net/%7Ejlazio/
sci.astro FAQ at http://sciastro.astronomy.net/sci.astro.html

rlold...@amherst.edu

unread,
Oct 18, 2006, 4:06:17 AM10/18/06
to
Phillip Helbig---remove CLOTHES to reply wrote:
>
> Prediction, prior observation, prediction, assumption later verified by
> observation.

By my accounting this list should read:

1. Background Radiation: prediction, but initially off by about 400%
and had to be adjusted to come into agreement with observations.

2. Global expansion: agreed, this was prior knowledge.

3. Abundances of light elements: definitely not predicted! We had good
approximate abundances prior to any BB paradigm. Also, the theoretical
abundances have been repeatedly revised as the observational situation
has changed, especially with helium, deuterium and lithium. Even
today, articles appear which question how well the BB paradigm is able
to retrodict these abundances. For an introduction to this important
issue, see Oldershaw, R.L., American J. of Physics, vol. 56, 1075-1081,
1988.

4. Assumption invoked to simplify mathematics, which has evolved into
near dogma. See the reference above for a discussion of this issue.
If we believe that either the observable universe or the Universe is
"homogeneous", then we do so out of a near-religious faith and a desire
to protect the standard paradigm. There is an ongoing debate over
whether the very large-scale distribution of galaxies is fractal, as on
smaller scales, or whether it goes over into a statistically
homogeneous distribution on large enough scales. The proponents of the
fractal distribution have pointed out that the homogeneity proponents
subtly assume homogeneity from the beginning in their analysis of the
data. A scientist should take a prudent stance with respect to this
controversy. We are at the limits of observability and are definitely
not sure if we are afforded a representative sample, even with respect
to the observable universe.

>There is
> also nothing in principle wrong with the fact that we don't completely
> understand galaxy formation---it just means that there is more work to
> do. New species of animals are being discovered all the time. That
> doesn't mean that zoology is somehow fundamentally flawed or that the
> discovery of new animals requires a radical reformulation of zoology.
> Science is a way of thinking, not a collection of facts.

Ok, but quite simply: is there some prediction or test by which the
new, and heavily reworked, modified, adjusted, Big Bang paradigm can be
falsified, or does the present incarnation of the Big Bang paradigm not
meet Popper's criterion for science?

rlold...@amherst.edu

unread,
Oct 18, 2006, 4:15:43 AM10/18/06
to
So, is your basic argument that the Big Bang paradigm is a
one-size-fits-all model which can accommodate new observational
discoveries by morphing and adding on epicycles without limits? Does
this paradigm make testable predictions which will allow us to probe
its limits of applicability?

I would like you to consider this comparison.

The discrete fractal paradigm subsumes the Big Bang paradigm. It does
not say the BB paradigm is wrong, but rather that it is a good
approximation to what is happening locally in an infinite discrete
fractal cosmos. For a wealth of information on this paradigm (at
various levels) see www.amherst.edu/~rloldershaw .

And here is the key scientific difference between the discrete fractal
paradigm and the BB paradigm. The fractal paradigm predicts,
unequivocally, what the dark matter must be, and thereby subjects
itself to a scientific test of the highest stringency. Can the Big
Bang paradigm match this challenge and come up with a definitive
prediction by which it could be tested in a similar fashion? I suspect
that it cannot come up with a prediction that is prior, unique to the
paradigm, fundamental and nonadjustable. In my view, the discrete
fractal paradigm may not be fully developed yet, but at least it is
science. The Big Bang paradigm is regarded as "right" by the
overwhelming majority of cosmologists, but if it does not make any new
definitive predictions and is excessively "adjustable", is it still
subject to the rules of science?

Joseph Lazio

unread,
Oct 18, 2006, 7:38:41 AM10/18/06
to
>>>>> "re" == rloldershaw@amherst edu <rlold...@amherst.edu> writes:

re> Joseph Lazio wrote:
>> This is a oft-repeated claim, but one that doesn't make a lot of
>> sense.
>>
>> Even fairly basic descriptions of the Big Bang model explain that
>> there are three possibilities (...): 1. The matter density in the
>> Universe is high enough that the expansion eventually slows and
>> reverses (...). Such a universe is termed "closed." 2. The matter
>> density in the Universe is not high enough to reverse the
>> expansion. The Universe continues to expand forever. Such a
>> universe is termed "open." 3. The matter density in the Universe
>> is at the critical value so that the expansion ceases only after an
>> infinite amount of time.
>>
>> The Big Bang model (...) do not predict the matter density of the
>> Universe, regarding that as a parameter to be determined from
>> observation. Moreover, there is no requirement for cases 1--3 that
>> the matter be luminous (...). All that is required is that it
>> interact gravitationally, so dark matter, luminous matter, or both
>> are allowed.
>>
>> A robust analogy is to trajectory of an object. Consider a planet
>> of mass M and radius R (...). We can predict, with considerable
>> confidence from Newton's Law of Universal Gravitation, that the
>> acceleration due to gravity at the surface of such a planet is a_g
>> = GM/R^2. Does an object of initial velocity v fall back to the
>> surface of this planet?

re> So, is your basic argument that the Big Bang paradigm is a
re> one-size-fits-all model which can accommodate new observational
re> discoveries by morphing and adding on epicycles without limits?
re> Does this paradigm make testable predictions which will allow us
re> to probe its limits of applicability?
[...]

I notice that you didn't answer my question: Does an object of initial
velocity v fall back to the surface of this planet?

I fear that this is straying from astronomy into philosophy of
science, but I don't understand your reasoning. By your apparent
logic, Newton's Law of Universal Gravitation does not predict the
trajectory of an arbitrary object near a planet of a specified size
and mass.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 18, 2006, 7:37:18 AM10/18/06
to
In article <mt2.0-32486...@hercules.herts.ac.uk>,
"rlold...@amherst.edu" <rlold...@amherst.edu> writes:

> 1. Background Radiation: prediction, but initially off by about 400%

Not bad for a first guess.

> and had to be adjusted to come into agreement with observations.

This makes it sound like it was "fudged". Of course, theory and
observations are give-and-take and discrepancy might lead one to correct
the theory, but you make it sound like one could arbitrarily fit any
observations.

> 4. Assumption invoked to simplify mathematics, which has evolved into
> near dogma. See the reference above for a discussion of this issue.
> If we believe that either the observable universe or the Universe is
> "homogeneous", then we do so out of a near-religious faith and a desire
> to protect the standard paradigm. There is an ongoing debate over
> whether the very large-scale distribution of galaxies is fractal,

It is ongoing, but not in serious circles. There IS a scale above which
there is large-scale homogeneity.

> Ok, but quite simply: is there some prediction or test by which the
> new, and heavily reworked, modified, adjusted, Big Bang paradigm can be
> falsified, or does the present incarnation of the Big Bang paradigm not
> meet Popper's criterion for science?

Certainly. But you have to define exactly what you mean first. Second,
it is a widespread misconception that disproving one aspect of a theory
also disproves the foundations. That is not necessarily the case.

Joseph Lazio

unread,
Oct 18, 2006, 7:36:31 AM10/18/06
to
>>>>> "re" == rloldershaw@amherst edu <rlold...@amherst.edu> writes:

[Regarding predictions of the Big Bang model]

re> 3. Abundances of light elements: definitely not predicted! We had
re> good approximate abundances prior to any BB paradigm. Also, the
re> theoretical abundances have been repeatedly revised as the
re> observational situation has changed, especially with helium,
re> deuterium and lithium. Even today, articles appear which question
re> how well the BB paradigm is able to retrodict these abundances.

I think this is a somewhat limiting, and not very realistic, approach
to how science is done. (Also, this may be beginning to stray from
astronomy into philosophy of science.)

Theories provide a coherent explanation for facts. We may have had a
reasonably accurate accounting of the light element abundance prior to
the development of the Big Bang model. (I'll confess that I don't
know the history of this particular aspect all that well.) That still
leaves open the question of why the light element abundance is what it is.

It's one thing to measure the light element abundance. Simply knowing
that the abundance of deuterium is some value, the abundance of helium
is another value, and that of lithium is yet another value is nice,
but this is merely a collection of facts.

The Big Bang model predicts that there should be some light elements
in the Universe not manufactured by stars, because for a short while in
the past the Universe had the density and temperature of the interior
of a star and fusion was possible. The Big Bang model also predicts
that the abundances of these light elements are, in some sense,
related and that they are related to the photon-to-baryon ratio.

Worrying about whether the actual abundance values were measured
before or after the development of the Big Bang model itself misses
the point.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 18, 2006, 7:37:59 AM10/18/06
to

> So, is your basic argument that the Big Bang paradigm is a
> one-size-fits-all model which can accommodate new observational
> discoveries by morphing and adding on epicycles without limits?

Newtonian gravity allows a planet at any distance from the sun, unlike
Kepler's wrong geometrical conclusions. If that is "one size fits all",
then so is ANY theory with a free parameter. No epicycles, though.

rlold...@amherst.edu

unread,
Oct 19, 2006, 4:03:17 AM10/19/06
to
Phillip Helbig---remove CLOTHES to reply wrote:


So we have agreement on several important issues, but some remaining
differences, as discussed below.


>> There is an ongoing debate over whether the very large-scale
>> distribution of galaxies is fractal, or goes over into a homogeneous
>> distribution.


>
> It is ongoing, but not in serious circles. There IS a scale above which
> there is large-scale homogeneity.

"Serious" cosmologists can say that as many times as they like, but it
is still a matter of faith, and I use that term literally. What
scientific proof do you offer for your claim of homogeneity? The
microwave background used to be the best supporting evidence, but many
strange anomalies (recently the possibility of a quadrupole anisotropy)
and correlations with the Solar System's ecliptic, etc. have clearly
shown that we probably have much more to learn about the large-scale
structure of the observable universe. As scientists, we need to be sure
that the evidence is unambiguous before we declare an issue closed.


> > Ok, but quite simply: is there some prediction or test by which the
> > new, and heavily reworked, modified, adjusted, Big Bang paradigm can be
> > falsified, or does the present incarnation of the Big Bang paradigm not
> > meet Popper's criterion for science?
>
> Certainly. But you have to define exactly what you mean first. Second,
> it is a widespread misconception that disproving one aspect of a theory
> also disproves the foundations. That is not necessarily the case.

Please show me at least one definitive prediction, by which we might
put the Big Bang paradigm to a rigorous, quantitative scientific test.

carlip...@physics.ucdavis.edu

unread,
Oct 19, 2006, 4:04:49 AM10/19/06
to
rlold...@amherst.edu <rlold...@amherst.edu> wrote:
> Phillip Helbig---remove CLOTHES to reply wrote:

>> Prediction, prior observation, prediction, assumption later verified by
>> observation.

> By my accounting this list should read:

> 1. Background Radiation: prediction, but initially off by about 400%
> and had to be adjusted to come into agreement with observations.

Your 400% seems way off -- Alpher and Herman's 1948 prediction was "about 5K."
Furthermore, you're not quite asking the right question. CMBR temperature
varies with time, so to predict the value "now" you need to know when "now"
is, relative to, say, primordial nucleosynthesis. Alpher and Herman used
(baryonic) matter density as a proxy for this, and got the correct relation;
their error in the exact value came from an imprecise knowledge of the
present matter density.

Remember also that the prediction was not just a temperature, but a spectrum.
Black body spectra are hard to make (since temperatures of different sources
get different red shifts); the observation of not only the temperature but
the spectrum is a very strong confirmation.
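Why a redshifted black body stays a black body can be made explicit: the photon occupation number 1/(exp(h nu / k T) - 1) depends only on the ratio nu/T, and expansion scales every frequency and the temperature by the same factor of 1/(1+z), so the spectrum remains Planckian at T(z) = T0 (1+z). A minimal numerical check (the frequency and redshift values below are illustrative):

```python
import math

h = 6.626e-34  # Planck constant, J s
k = 1.381e-23  # Boltzmann constant, J/K

def occupation(nu, T):
    """Bose-Einstein photon occupation number for a black body."""
    return 1.0 / math.expm1(h * nu / (k * T))

T0 = 2.725   # CMB temperature today, K
z = 3.0
nu = 1.6e11  # ~160 GHz, near the CMB intensity peak today

# Occupation at emission (frequency and temperature both (1+z) higher)
# equals the occupation observed today: the spectral shape is preserved,
# only the temperature label drops by 1/(1+z) as the universe expands.
n_emitted = occupation(nu * (1.0 + z), T0 * (1.0 + z))
n_today = occupation(nu, T0)
print(n_emitted, n_today)
```

This is also the content of the T(z) = T0 (1+z) tests in the observational papers cited below.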

> 2. Global expansion: agreed, this was prior knowledge.

> 3. Abundances of light elements: definitely not predicted! We had good
> approximate abundances prior to any BB paradigm.

Joseph Lazio has addressed this.

You should also add a number of other predictions:

-- red shift dependence of CMBR temperature (for observations, well after the
predictions, see Battistelli et al., astro-ph/0208027; Srianand et al.,
astro-ph/0012222; Molaro et al., astro-ph/0111589)

-- Tolman surface brightness test (predicted by Tolman in 1930; observed by
Lubin and Sandage, astro-ph/0106566)

-- time dilation of supernova light curves (predicted by Wilson, Ap. J. 90
(1939) 634; for observations, see Goldhaber et al., astro-ph/0104382)

-- three (and no more) light neutrinos (predicted by Yang et al., Ap. J. 227
(1979) 697; confirmed in accelerator experiments later -- see, e.g., ALEPH
Collaboration, Phys. Lett. B235 (1990) 399)

Steve Carlip

John (Liberty) Bell

unread,
Oct 19, 2006, 7:57:01 AM10/19/06
to
I think I would like to respond at this point, not because I am
challenging anybody, but because I don't know enough about this
subject, and hope that, by asking some questions (pertinent or
otherwise), the resultant responses will improve my own level of
understanding.

carlip...@physics.ucdavis.edu wrote:
> rlold...@amherst.edu <rlold...@amherst.edu> wrote:
> > Phillip Helbig---remove CLOTHES to reply wrote:
>
> >> Prediction, prior observation, prediction, assumption later verified by
> >> observation.
>
> > By my accounting this list should read:
>
> > 1. Background Radiation: prediction, but initially off by about 400%
> > and had to be adjusted to come into agreement with observations.
>
> Your 400% seems way off -- Alpher and Herman's 1948 prediction was "about 5K."

The figure I seem to remember was a factor of 3 difference, but I
certainly couldn't quote references after all this time. I would guess
from the above that various people had a stab at this calculation, with
differing results. If so, cherry picking the best, after the fact,
would seem to be no more logical than cherry picking the worst.

> Furthermore, you're not quite asking the right question. CMBR temperature
> varies with time, so to predict the value "now" you need to know when "now"
> is,

I don't think anyone could argue with that.

> relative to, say, primordial nucleosynthesis.

Clarification here would be appreciated. I was under the impression
that CMBR is black body radiation at the transition from plasma physics
to the point where protons and electrons combine to form atomic
hydrogen, thus indicating the temperature at which this transition is
believed to take place.

> Remember also that the prediction was not just a temperature, but a spectrum.
> Black body spectra are hard to make (since temperatures of different sources
> get different red shifts);

Hmm. Black body radiation is black body radiation, isn't it? Different
elements give different spectra because of bound electron transitions.
Any plasma should give the same BB spectrum (with the peak intensity
defining the temperature of that plasma).

> the observation of not only the temperature but
> the spectrum is a very strong confirmation.

Surely observed intensity is at least as important?

> You shouls also add a number of other predictions:

> -- red shift dependence of CMBR temperature (for observations, well after the
> predictions, see Battistelli et al., astro-ph/0208027; Srianand et al.,
> astro-ph/0012222; Molaro et al., astro-ph/0111589)

Yes, that certainly confirms global expansion (which I don't think
anybody doubts). However, I don't see how this necessarily nails things
down any more precisely. Take, for example, Oh No's theory (which I am
definitely not claiming to support). In this you should still get such
changing T with changing z despite the timescales being radically
different.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 19, 2006, 7:56:19 AM10/19/06
to
In article <mt2.0-30336...@hercules.herts.ac.uk>,
carlip...@physics.ucdavis.edu writes:

> Remember also that the prediction was not just a temperature, but a spectrum.
> Black body spectra are hard to make (since temperatures of different sources
> get different red shifts); the observation of not only the temperature but
> the spectrum is a very strong confirmation.

Indeed. All the research on CMB inhomogeneities (and there is a lot of
research to do) has obscured two Very Important Facts in the popular
mind: the black-body spectrum, as Steve mentioned, and the fact that the
signal is very homogeneous. The latter doesn't just mean that the
inhomogeneities are small, but is in itself very important.

Oh No

unread,
Oct 19, 2006, 7:55:40 AM10/19/06
to
Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>Phillip Helbig---remove CLOTHES to reply wrote:
>> In article <mt2.0-32486...@hercules.herts.ac.uk>,
>> "rlold...@amherst.edu" <rlold...@amherst.edu> writes:
>
>[...] What
>scientific proof do you offer for your claim of homogeneity? The
>microwave background used to be the best supporting evidence, but many
>strange anomalies (recently the possibility of a quadrupole anisotropy)
>and correlations with the Solar System's ecliptic, etc. have clearly
>shown that we probably have much more to learn about the large-scale
>structure of the observable universe. As scientists, we need to be sure
>that the evidence is unambiguous before we declare an issue closed.

The large scale distribution of galaxy clusters is also observed to be
extremely homogeneous. But actually, what would be really difficult is
justifying the formulation of a theory which did not obey the
cosmological principle.

>> > Ok, but quite simply: is there some prediction or test by which the
>> > new, and heavily reworked, modified, adjusted, Big Bang paradigm can be
>> > falsified, or does the present incarnation of the Big Bang paradigm not
>> > meet Popper's criterion for science?
>>
>> Certainly. But you have to define exactly what you mean first. Second,
>> it is a widespread misconception that disproving one aspect of a theory
>> also disproves the foundations. That is not necessarily the case.
>
>Please show me at least one definitive prediction, by which we might
>put the Big Bang paradigm to a rigorous, quantitative scientific test.

I think, notwithstanding possible anomalies, the microwave background is
such a test. Its overall character is well established and in
accordance with a big bang. Only some of the detail may need attention.

Another is the proton-neutron balance. This can be precisely calculated
as a product of big bang nuclear synthesis, and depends quite critically
on Hubble's constant. Observation and prediction fit extremely well.
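
For readers unfamiliar with the calculation being referred to, here is a
back-of-envelope sketch (standard textbook numbers, not the full kinetic
calculation, which also folds in the expansion rate and hence Hubble's
constant) of how the neutron-proton balance translates into the primordial
helium mass fraction:

```python
import math

# Rough illustration of big bang nucleosynthesis bookkeeping.
# All input numbers are approximate textbook values.
delta_m = 1.293    # neutron-proton mass difference in MeV
T_freeze = 0.8     # approximate weak-interaction freeze-out temperature in MeV

# A Boltzmann factor sets the n/p ratio when the weak reactions freeze out
n_over_p_freeze = math.exp(-delta_m / T_freeze)

# Free neutrons then decay until nucleosynthesis actually starts
tau_n = 880.0      # neutron lifetime in seconds
t_bbn = 180.0      # rough time until the deuterium bottleneck breaks, in s
n_over_p = n_over_p_freeze * math.exp(-t_bbn / tau_n)

# Essentially every surviving neutron ends up bound in helium-4
Y_p = 2 * n_over_p / (1 + n_over_p)
print(f"n/p at freeze-out ~ {n_over_p_freeze:.2f}")
print(f"predicted helium mass fraction Y_p ~ {Y_p:.2f}")
```

Even this crude version lands near the observed primordial helium abundance
of roughly a quarter by mass, which is why the fit is regarded as such a
sharp test.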

But there are simpler and more obvious ones. Without a big bang we would
be caught up in Olbers' paradox.

A big bang is the most natural solution of Einstein's field equation
(the same is actually true in Newtonian gravity). The fact that
scientists, including Einstein himself, did not tend to adopt a big bang
model had to do with theoretical prejudice, and had nothing to do with
whether cosmological expansion was a prediction of the model. It was
only when a prediction of the model was observed by Hubble, that the
model started to be adopted.

Einstein's theory of general relativity has also been put to a number of
rigorous experimental tests, and has passed all of them while
alternative models have failed. One must accept that any model must form
a self consistent mathematical whole. One cannot accept parts of it and
then accept other things which are mathematically inconsistent. It seems
to me that any scientific model must satisfy this criterion even before
one applies observational tests. In that regard, a big bang obeying
general relativity has no serious competitors. One can hardly say this
is not the result of rigorous scientific quantitative testing.

carlip...@physics.ucdavis.edu
Oct 20, 2006, 12:43:11 PM

"John (Liberty) Bell" <john...@accelerators.co.uk> wrote:
[...]

> carlip...@physics.ucdavis.edu wrote:
>> rlold...@amherst.edu <rlold...@amherst.edu> wrote:
>> > Phillip Helbig---remove CLOTHES to reply wrote:

>> >> Prediction, prior observation, prediction, assumption later
>> >> verified by observation.

>> > By my accounting this list should read:

>> > 1. Background Radiation: prediction, but initially off by about 400%
>> > and had to be adjusted to come into agreement with observations.

>> Your 400% seems way off -- Alpher and Herman's 1948 prediction was
>> "about 5K."

> The figure I seem to remember was a factor of 3 difference, but I
> certainly couldn't quote references after all this time. I would guess
> from the above that various people had a stab at this calculation, with
> differing results. If so, cherry picking the best, after the fact,
> would seem to be no more logical than cherry picking the worst.

Alpher and Herman's 1948 Nature paper was, I believe, the first to make a
prediction. There was some variation after that, but it came from lack of
a good observational value for the present density of the universe (and
probably also from the fact that people were making order-of-magnitude
estimates of a quantity they didn't expect to be observable).

>> Furthermore, you're not quite asking the right question. CMBR temperature
>> varies with time, so to predict the value "now" you need to know when "now"
>> is,

> I don't think anyone could argue with that.

>> relative to, say, primordial nucleosynthesis.

> Clarification here would be appreciated. I was under the impression
> that CMBR is black body radiation at the transition from plasma physics
> to the point where protons and electrons combine to form atomic
> hydrogen, thus indicating the temperature at which this transition is
> believed to take place.

Right, but not relevant to the present issue. To compare a past value
to a present one, we need to be able to compare two independent quantities,
one to relate the times and the other to allow a prediction of temperatures.
At recombination, we know the temperature (from the ionization energy of
hydrogen), but don't know anything obvious that would tell us the time
between then and now. At primordial nucleosynthesis, though, we know both
the temperature and the baryon density. So we can start then, predict the
evolution of both temperature and density until recombination (which then
gives us a density at recombination), and then compare the density at
recombination to density now to calibrate the time.
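
To make the bookkeeping concrete, here is a toy numerical sketch of the
calibration described above: temperature scales as (1+z), baryon number
density as (1+z)^3, so one known (temperature, density) pair at an early
epoch plus the present density fixes the present temperature. The input
values below are rough illustrative magnitudes, not the real calculation.

```python
# Toy sketch: temperature redshifts as (1+z), number density as (1+z)^3.
# Input values are rough illustrative magnitudes only.

T_then = 1e9       # K, temperature during primordial nucleosynthesis
n_then = 1.2e25    # baryons per m^3 at that epoch (illustrative)
n_now = 0.25       # baryons per m^3 today (illustrative)

# The density ratio gives the total expansion factor (1+z)^3 ...
one_plus_z = (n_then / n_now) ** (1.0 / 3.0)

# ... and the temperature simply redshifts by that same factor.
T_now = T_then / one_plus_z

print(f"(1+z) since nucleosynthesis ~ {one_plus_z:.3g}")
print(f"predicted present CMB temperature ~ {T_now:.3g} K")
```

With these round numbers the sketch lands near the observed 2.7 K, but only
the scaling logic, not the inputs, should be taken from it.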

>> Remember also that the prediction was not just a temperature, but a
>> spectrum. Black body spectra are hard to make (since temperatures of
>> different sources get different red shifts);

> Hmm. Black body radiation is black body radiation, isn't it?

Black body radiation at a given temperature is all the same. But if you
combine two black body spectra at different temperatures, the result is
not a black body spectrum at some intermediate temperature -- it's not
black body at all. Observation of a uniform black body spectrum means
either that the universe was once a black body at a uniform temperature,
or that some huge series of implausible coincidences somehow combined a
bunch of different spectra to mimic a black body.
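
The point can be checked numerically. In the sketch below (a minimal
illustration with arbitrarily chosen temperatures and frequencies), an
equal mix of 2 K and 4 K Planck spectra is inverted back to a brightness
temperature at several frequencies; if the mixture were itself a black
body, the inferred temperature would be the same everywhere, and it is not.

```python
import math

h, k, c = 6.626e-34, 1.381e-23, 2.998e8  # SI values, approximate

def planck(nu, T):
    """Planck spectral radiance B_nu(T)."""
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

def brightness_temp(nu, B):
    """Invert the Planck law: the single temperature giving radiance B at nu."""
    return h * nu / (k * math.log1p(2 * h * nu**3 / (c**2 * B)))

# Mix equal parts of 2 K and 4 K black bodies and infer a temperature
# from the mixture at each frequency.
temps = []
for nu in (3e10, 3e11, 1e12):  # 30 GHz, 300 GHz, 1 THz
    B_mix = 0.5 * planck(nu, 2.0) + 0.5 * planck(nu, 4.0)
    temps.append(brightness_temp(nu, B_mix))

print(temps)  # the inferred temperatures disagree: not a black body
```

The inferred brightness temperature drifts by most of a kelvin across this
range, whereas for a genuine black body it would be frequency-independent.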

Steve Carlip

John (Liberty) Bell
Oct 20, 2006, 12:43:43 PM

I find myself largely agreeing with Oh No here, except for the
following paragraph.

Oh No wrote:
> A big bang is the most natural solution of Einstein's field equation
> (the same is actually true in Newtonian gravity).

I think you must mean gtr. EFE (as first published) contained a
cosmological constant for the explicit purpose of preventing any such
global geometrical dynamism.

> The fact that
> scientists, including Einstein himself, did not tend to adopt a big bang
> model had to do with theoretical prejudice,

That is unfair. Every generation is a product of its educational
background. In Einstein's case that was in the 19th century, when the
thermodynamics of steam engines was still the 'state-of-the-art' in the
technological application of pure physical principles.

Einstein suppressed global geometrical dynamism within his mathematical
apparatus, in order to produce a general theory of relativity which was
consistent with the then known facts about the universe.

If that is prejudice, then we are certainly still guilty of it now.

Perhaps we would all be a little wiser if we learned from the lessons
of history, by recognising that nothing ever really changes, in that
respect.

rlold...@amherst.edu
Oct 20, 2006, 12:45:22 PM

carlip...@physics.ucdavis.edu wrote:

> Remember also that the prediction was not just a temperature, but a spectrum.
> Black body spectra are hard to make (since temperatures of different sources

> get different red shifts); the observation of not only the temperature but
> the spectrum is a very strong confirmation.

This is a good and important point. It is a crucial fact about nature,
and verifies that the BB paradigm offers a good approximation for what
is going on in the local Hubble Bubble. I regard the prediction of the
black-body spectrum and approximate temperature of the microwave
background as the best evidence for the BB approximation. I do not
accept that this means that it had to be the whole Universe that went
"pop!", or that even within the Hubble Bubble the physics is quite as
idealized as standard cosmology would have it.

Parenthetically, I have always been fascinated with the speculation
that the background radiation might be a result of the Unruh effect in
QFT, which causes an accelerated observer to observe an isotropic
black-body thermal "bath". See Schützhold et al., PRL 97, 121302,
2006. Low acceleration gives a low temperature.
This is a pretty far out idea, but one I am pursuing just for the fun
of it.


>
>
> You should also add a number of other predictions:
>
> -- red shift dependence of CMBR temperature (for observations, well after the
> predictions, see Battistelli et al., astro-ph/0208027; Srianand et al.,
> astro-ph/0012222; Molaro et al., astro-ph/0111589)
>

> -- Tolman surface brightness test (predicted by Tolman in 1930; observed by
> Lubin and Sandage, astro-ph/0106566)
>
> -- time dilation of supernova light curves (predicted by Wilson, Ap. J. 90
> (1939) 634; for observations, see Goldhaber et al., astro-ph/0104382)
>
> -- three (and no more) light neutrinos (predicted by Yang et al., Ap. J. 227
> (1979) 697; confirmed in accelerator experiments later -- see, e.g., ALEPH
> Collaboration, Phys. Lett. B235 (1990) 399)
>

Ok, but again, the first 3 just verify global expansion within the
Hubble Bubble, and do not say anything more. Personally, I have to
admit that I treat many high-energy results with a wait-and-see
approach because of the numerous false "positives", the complexity of
what they are trying to do (Feynman's: 'It's like smashing two clocks
together, and from what falls out...') and subtle but severe pressures
to get the "right" answers.

At any rate, let us see a prediction by the BB paradigm about something
that is currently unknown, but knowable in the foreseeable future. Can
the BB paradigm do this, or not?

Rob

rlold...@amherst.edu
Oct 20, 2006, 12:45:37 PM

Phillip Helbig---remove CLOTHES to reply wrote:

> Indeed. All the research on CMB inhomogeneities (and there is a lot of
> research to do) has obscured two Very Important Facts in the popular
> mind: the black-body spectrum, as Steve mentioned, and the fact that the
> signal is very homogeneous. The latter doesn't just mean that the
> inhomogeneities are small, but is in itself very important.

I agree that within the local Hubble Bubble the CMB is approximately
"homogeneous", but then there are the dipole, quadrupole and octopole
anisotropies. The difference between the Newtonian and GR predictions
for the precession of the perihelion of Mercury was "small" too. The
devil is sometimes in those "small" details, no?
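
For reference, that "small" GR correction for Mercury is straightforward to
put a number on; a quick sketch with standard (approximate) constants and
orbital elements reproduces the famous figure:

```python
import math

# GR's extra perihelion precession of Mercury, ~43 arcsec/century.
# Constants and orbital elements are standard approximate values.
GM_sun = 1.327e20      # solar gravitational parameter, m^3/s^2
c = 2.998e8            # speed of light, m/s
a = 5.79e10            # Mercury semi-major axis, m
e = 0.2056             # Mercury orbital eccentricity
period_days = 87.97    # Mercury orbital period

# Standard GR formula for the perihelion advance per orbit, in radians
dphi_per_orbit = 6 * math.pi * GM_sun / (c**2 * a * (1 - e**2))

# Accumulate over a Julian century and convert to arcseconds
orbits_per_century = 36525.0 / period_days
arcsec = dphi_per_orbit * orbits_per_century * (180 / math.pi) * 3600

print(f"GR perihelion advance ~ {arcsec:.1f} arcsec/century")
```

The result, about 43 arcseconds per century, is exactly the "small detail"
that Newtonian gravity could not account for.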

Also, does the distribution of matter look homogeneous, with all those
very much unpredicted sheets, filaments and voids? I might go along
with a claim of statistical homogeneity, but claims of a more
idealized, literal homogeneity seem to me to be in conflict with
accepted observations.

Rob

rlold...@amherst.edu
Oct 20, 2006, 12:45:52 PM

Phillip Helbig---remove CLOTHES to reply wrote:
> > So, is your basic argument that the Big Bang paradigm is a
> > one-size-fits-all model which can accommodate new observational
> > discoveries by morphing and adding on epicycles without limits?
>
> Newtonian gravity allows a planet at any distance from the sun, unlike
> Kepler's wrong geometrical conclusions. If that is "one size fits all",
> then so is ANY theory with a free parameter. No epicycles, though.

See my reply to Joseph Lazio (#19, I think).

More importantly, why are we talking about Newtonian gravitation? The
discussion is about cosmological paradigms and which ones can (the
discrete fractal paradigm and the teleconnection model, so far) and
which ones cannot (apparently the BB paradigm) make definitive
predictions.

When are you going to stop beating around the bush and present
quantitative predictions by which we can definitively test the BB
paradigm, or admit that it is not falsifiable?

Oh No
Oct 20, 2006, 12:45:03 PM

Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>

>> -- red shift dependence of CMBR temperature (for observations, well after the
>> predictions, see Battistelli et al., astro-ph/0208027; Srianand et al.,
>> astro-ph/0012222; Molaro et al., astro-ph/0111589)
>
>Yes, that certainly confirms global expansion (which I don't think
>anybody doubts). However, I dont see how this necessarily nails things
>down any more precisely. Take, for example, Oh No's theory (which I am
>definitely not claiming to support). In this you should still get such
>changing T with changing z despite the timescales being radically
>different.

In fact my proposal only affects quantum phenomena for which, by
definition, continuous observation is not possible, and for which,
incidentally, there is not an alternative existing theory. It gives
classical general relativity in the classical correspondence.
The cosmic microwave background is continuously observable, so it must
be treated classically and obeys the same redshift relationship as found
in the standard model.
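
The redshift relationship in question is simply T(z) = T0(1+z); as a
concrete one-line sketch (with T0 = 2.725 K, the FIRAS value, assumed):

```python
T0 = 2.725  # present-day CMB temperature in kelvin (COBE/FIRAS)

def cmb_temperature(z):
    """Standard big bang prediction for the CMB temperature at redshift z."""
    return T0 * (1 + z)

# e.g. at z = 2.34, roughly the redshift of the Srianand et al. absorber,
# the predicted temperature is about 9.1 K
print(cmb_temperature(2.34))
```

Any measurement of the temperature in a high-redshift absorber can be
compared directly against this line.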

rlold...@amherst.edu
Oct 20, 2006, 12:45:55 PM

Joseph Lazio wrote:
> >>>>> "re" == rloldershaw@amherst edu <rlold...@amherst.edu> writes:
>>
> I notice that you didn't answer my question: Does an object of initial
> velocity v fall back to the surface of this planet?
>
> I fear that this is straying from astronomy into philosophy of
> science, but I don't understand your reasoning. By your apparent
> logic, Newton's Law of Universal Gravitation does not predict the
> trajectory of an arbitrary object near a planet of a specified size
> and mass.


With all due respect, I do not think you understand my logic or even
the basic issues I am trying to discuss.

Newtonian gravitation was a fine theory that made definitive
predictions. It was tested and found to fit observations exceedingly
well, but not exactly. Einstein discovered the reason for the
discrepancy and gave us a new theory of gravitation, which is even more
wonderful than Newtonian gravitation and has been tested thoroughly.
Some day a further refinement, perhaps one that finally brings in EM
and/or QM, will probably expand, amend or supersede GR.

We need to think a bit more about falsifiability. Without
falsifiability we would be stuck with Newtonian gravitation forever,
oblivious to what is really going on in nature. The philosophy of
science, or call it the science of science if you do not like the word
philosophy, cannot be overlooked without very unhappy consequences.
Need I say more than cosmic strings, magnetic monopoles, landscapes,
anthropic principles, dowsing,...

Rob

rlold...@amherst.edu
Oct 20, 2006, 12:46:19 PM

Joseph Lazio wrote:
> >>>>> "re" == rloldershaw@amherst edu <rlold...@amherst.edu> writes:
>
> [Regarding predictions of the Big Bang model]
>
> re> 3. Abundances of light elements: definitely not predicted! We had
> re> good approximate abundances prior to any BB paradigm. Also, the
> re> theoretical abundances have been repeatedly revised as the
> re> observational situation has changed, especially with helium,
> re> deuterium and lithium. Even today, articles appear which question
> re> how well the BB paradigm is able to retrodict these abundances.
..

>
> Worrying about whether the actual abundance values were measured
> before or after the development of the Big Bang model itself misses
> the point.


No, my friend and colleague, it is not I who is missing the point.

There are very important differences between "model-building" science
like the Big Bang paradigm and theories of principle that are
rigorously falsifiable like General Relativity.

There are also important differences between retrodictions (as in the
case of light element abundances) and true definitive predictions (many
have been generated and vindicated by GR).

If we grant model-building the same status as theories of principle,
and if we continue to confuse retrodictions and definitive predictions,
then science will continue to suffer. We badly need new theories of
principle to guide us. The discrete fractal paradigm has the potential
to do this.

Rob

Oh No
Oct 21, 2006, 8:22:10 AM

Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>
>At any rate, let us see a prediction by the BB paradigm about something
>that is currently unknown, but knowable in the foreseeable future. Can
>the BB paradigm do this, or not?

I don't think it right to talk about the big bang paradigm in this way,
as though it were somehow separate from the general theory of
relativity. The question should therefore be one of experimental tests
of general relativity, and of course all of these have come out
successfully for general relativity, and not so successfully for other
models. One to be done in the next few years is the gravity probe B
experiment, with which it is hoped to measure the frame dragging
prediction in a satellite orbiting the earth.

Experimental tests which general relativity does not fulfil so well
include Pioneer acceleration, MOND, lensing profiles, and galaxy ageing,
and of course unification is a theoretical test which is also likely to
require a modification to gtr (I think the teleconnection). But even if
these do necessitate a modification to gtr, they in no way refute the
necessity for a big bang.

rlold...@amherst.edu
Oct 21, 2006, 8:20:39 AM

Oh No wrote:

> The large scale distribution of galaxy clusters is also observed to be
> extremely homogeneous. But actually, what would be really difficult is
> justifying the formulation of a theory which did not obey the
> cosmological principle.

The Perfect Cosmological Principle is probably a myth. A good
discussion of strong and weak versions of the cosmological principle
can be found in Mandelbrot's The Fractal Geometry of Nature, and
subsequent journal papers by him. Within the fractal paradigm the
degree of homogeneity and adherence to the CP are a function of the
particular scales you are sampling, as is the case on atomic, stellar
and galactic scales. The fractal paradigm says nature does not
mysteriously switch to ideal homogeneity and a perfect CP at just about
the scale that our observational capabilities peter out. Rather, the
fractal paradigm says what we have observed from subatomic to atomic to
stellar to galactic to supercluster scales probably continues to higher
scales. The latter idea seems much more natural to me, and more
consistent with observations.


>
> I think, notwithstanding possible anomalies, the microwave background is
> such a test. Its overall character is well established and in
> accordance with a big bang. Only some of the detail may need attention.

This is why I consider the BB to be a good approximation to the physics
of the local Hubble Bubble, and I think it will remain so even after it
is superseded by some new paradigm.


>
> Another is the proton-neutron balance. This can be precisely calculated
> as a product of big bang nuclear synthesis, and depends quite critically
> on Hubble's constant. Observation and prediction fit extremely well.
>

That looks like a model-building retrodiction to me, and you know my
lack of patience with this horrendous and unscientific error of
treating retrodictions as true predictions.

I am still waiting for a definitive prediction of something by the BB
paradigm that has not been previously decided to a first approximation
- a genuine, and new, and definitive prediction. Something that can in
principle be falsified. If the Big Bang paradigm is so compelling, why
can it not do this?

> But there are simpler and more obvious ones. Without a big bang we would
> be caught up in Olbers' paradox.

This is a red herring, as demonstrated by Charlier, de Vaucouleurs,
Mandelbrot and many others. The fractal paradigm with, or even
without, a local expansionary event does not necessarily run afoul of
Olbers' paradox. Global expansion of the Hubble Bubble is consistent
with several radically different cosmological paradigms.


>
> A big bang is the most natural solution of Einstein's field equation
> (the same is actually true in Newtonian gravity).

> Einstein's theory of general relativity has also been put to a number of
> rigorous experimental tests, and has passed all of them while
> alternative models have failed. One must accept that any model must form
> a self consistent mathematical whole. One cannot accept parts of it and
> then accept other things which are mathematically inconsistent. It seems
> to me that any scientific model must satisfy this criterion even before
> one applies observational tests. In that regard, a big bang obeying
> general relativity has no serious competitors. One can hardly say this
> is not the result of rigorous scientific quantitative testing.
>

I have a lot of confidence in GR as a highly accurate theory of
gravitation. I think the Big Bang paradigm has been a good
approximation that has carried us forward a long way. However it has
been getting a bit obese lately with all the epicycles that have had to
be added to it, such as inflation, CDM, dark energy, contortions needed
to explain the CMB anomalies, endless adjusting to fit observations,
etc.

If the Big Bang paradigm gets the dark matter question wrong, and I
mean wrong in a BIG way, then we will know that it is time to look at
the universe with an open mind and consider radical alternatives that
preserve what is good about the BB and cast out the Ptolemaic
epicycles.

Rob

George Dishman
Oct 21, 2006, 8:23:31 AM

<rlold...@amherst.edu> wrote in message
news:mt2.0-20386...@hercules.herts.ac.uk...
....

> When are you going to stop beating around the bush and present
> quantitative predictions by which we can definitively test the BB
> paradigm, or admit that it is not falsifiable?

Of course it is falsifiable. If there were no CMBR, the
model would be in serious trouble and if the universe
were 99.9% hydrogen then our model of nucleogenesis
would be falsified.

For recent specific predictions, note that the angular
power spectrum of the CMBR was predicted before WMAP
using GR both with a cosmological constant and with
quintessence instead as the nature of dark energy. The
subsequent results are a better fit to the cosmological
constant than to quintessence, so GR made a prediction
and passed the test.

George

Oh No
Oct 21, 2006, 8:21:40 AM

Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>I find myself largely agreeing with Oh No here, except for the
>following paragraph.
>
>Oh No wrote:
>> A big bang is the most natural solution of Einstein's field equation
>> (the same is actually true in Newtonian gravity).
>
>I think you must mean gtr. EFE (as first published) contained a
>cosmological constant for the explicit purpose of preventing any such
>global geometrical dynamism.

Indeed. And that constant's only reason for inclusion in the theory was
to prevent this. As that was not based on known fact, it is what I call
theoretical prejudice.


>
>> The fact that
>> scientists, including Einstein himself, did not tend to adopt a big bang
>> model had to do with theoretical prejudice,
>
>That is unfair. Every generation is a product of its educational
>background. In Einstein's case that was in the 19th century, when the
>thermodynamics of steam engines was still the 'state-of-the-art' in the
>technological application of pure physical principles.
>
>Einstein suppressed global geometrical dynamism within his mathematical
>apparatus, in order to produce a general theory of relativity which was
>consistent with the then known facts about the universe.
>
>If that is prejudice, then we are certainly still guilty of it now.

Indeed, I am sure we are. Newton, in the Scholium to the Principia,
describes absolute space, but acknowledges that it cannot be measured,
or that it can only be measured in approximation. He adopted it from
theoretical prejudice derived from his reading of the scriptures, not
from observable fact. Likewise Einstein adopted a belief in a universe
which proceeds from everlasting to everlasting. That was not known fact
from observation, but religious belief. Likewise now physicists find it
hard to dispense with the notion of a prior space-time manifold, but it
is just a feature of our equations not something which we can observe.
That is again not a known fact, but a theoretical prejudice. I have
formulated my model based on the idea that we should try theories in
which no prior manifold appears. But I have my own theoretical
prejudices too, which I aim to recognise as such. These include
principles like homogeneity, isotropy, and the cosmological principle,
none of which can actually be demonstrated.

More seriously perhaps, it includes a stipulation that nothing should
appear in equations of physics for which there is no underlying
mechanism. So I am not much impressed by the modern concept of quantum
fields as a fundamental element of theory, and nor do I think much of
the cosmological constant. I also deny the infinite in physics, which
includes a denial of the continuum. All that is theoretical prejudice.
But if I can produce a mathematical model of physics which fits all
that, and fits observation too, then I think it has a right to be taken
seriously as a scientific model, and subject to proper scientific
scrutiny and testing, both of a rigorous mathematical and observational
nature.


>
>Perhaps we would all be a little wiser if we learned from the lessons
>of history, by recognising that nothing ever really changes, in that
>respect.
>
>John Bell
>(Change John to Liberty to bypass anti-spam email filter)

Regards

George Dishman
Oct 21, 2006, 8:23:19 AM

<rlold...@amherst.edu> wrote in message
news:mt2.0-20386...@hercules.herts.ac.uk...
....
> There are very important differences between "model-building" science
> like the Big Bang paradigm and theories of principle that are
> rigorously falsifiable like General Relativity.

That is an important point, people sometimes overlook
the distinction between theories and the solutions made
by applying them.

> There are also important differences between retrodictions (as in the
> case of light element abundances) and true definitive predictions (many
> have been generated and vindicated by GR).

The problem with nucleogenesis is that the theory made
a definitive prediction down to one free parameter, the
baryon density. However, there are a number of abundance
ratios that can be measured so if any one is determined
from observation then the remainder become predictions.

http://www.astro.ucla.edu/~wright/BBNS.html

"A single value of the baryon density fits 4 abundances
simultaneously. The fit is good but not perfect."

The page goes on to list the details.

George

Steve Willner
Oct 21, 2006, 8:21:55 AM

In article <mt2.0-20386...@hercules.herts.ac.uk>,

"rlold...@amherst.edu" <rlold...@amherst.edu> writes:
> When are you going to stop beating around the bush and present
> quantitative predictions by which we can definitively test the BB
> paradigm, or admit that it is not falsifiable?

The BB model is certainly falsifiable. Blue shifts for distant
objects would do it, to name only the most obvious thing. For
"future measurements" -- if you think those are somehow more
important than existing ones -- a few years ago, we would have
pointed to the power spectrum of microwave background fluctuations,
which WMAP has now measured. Now I think we might point to the
polarization power spectrum. Can anyone provide an update on what
WMAP has seen and what it is likely to see in the rest of its
mission?

--
Steve Willner Phone 617-495-7123 swil...@cfa.harvard.edu
Cambridge, MA 02138 USA
(Please email your reply if you want to be sure I see it; include a
valid Reply-To address to receive an acknowledgement. Commercial
email may be sent to your ISP.)

Oh No
Oct 22, 2006, 4:30:24 AM

Thus spake "rlold...@amherst.edu" <rlold...@amherst.edu>

>Oh No wrote:
>
>> The large scale distribution of galaxy clusters is also observed to be
>> extremely homogeneous. But actually, what would be really difficult is
>> justifying the formulation of a theory which did not obey the
>> cosmological principle.
>
>The Perfect Cosmological Principle is probably a myth. A good
>discussion of strong and weak versions of the cosmological principle
>can be found in Mandelbrot's The Fractal Geometry of Nature, and
>subsequent journal papers by him. Within the fractal paradigm the
>degree of homogeneity and adherence to the CP are a function of the
>particular scales you are sampling, as is the case on atomic, stellar
>and galactic scales. The fractal paradigm says nature does not
>mysteriously switch to ideal homogeneity and a perfect CP at just about
>the scale that our observational capabilities peter out. Rather, the
>fractal paradigm says what we have observed from subatomic to atomic to
>stellar to galactic to supercluster scales probably continues to higher
>scales. The latter idea seems much more natural to me, and more
>consistent with observations.

I don't think that is what the cosmological principle says. I agree that
perfect homogeneity is not expected - indeed locally the distribution of
matter is not homogeneous, but the cosmological principle does not say
it should be. The cosmological principle simply says that fundamental
local laws of physics are always and everywhere the same. It
does not even say what those laws are. I would expect the fractal
paradigm to obey it. If the laws of physics do obey some fractal
structure, then the laws of that structure should be the same
everywhere.

>> Another is the proton-neutron balance. This can be precisely calculated
>> as a product of big bang nuclear synthesis, and depends quite critically
>> on Hubble's constant. Observation and prediction fit extremely well.
>>
>That looks like a model-building retrodiction to me, and you know my
>lack of patience with this horrendous and unscientific error of
>treating retrodictions as true predictions.

I think if you were to look into it properly, you would find that it
contains elements of both prediction and retrodiction. It is not model
building, in the sense that it does not contain sufficient free
parameters to play with and fit any body of observation. Some parameters
are determined from observation, then others are determined deductively
and fit further observation.


>
>I am still waiting for a definitive prediction of something by the BB
>paradigm that has not been previously decided to a first approximation
>- a genuine, and new, and definitive prediction. Something that can in
>principle be falsified. If the Big Bang paradigm is so compelling, why
>can it not do this?

It does do this, and a number of examples have been produced. All one
means by a big bang is expansion from an initial state in which geometry
breaks down. One does not even have to take the initial singularity
predicted by general relativity as a true property of matter, since a
singularity can be taken as evidence of the breakdown of general relativity
in the initial state.
>

>> A big bang is the most natural solution of Einstein's field equation
>> (the same is actually true in Newtonian gravity).

>> Einstein's theory of general relativity has also been put to a number of
>> rigorous experimental tests, and has passed all of them while
>> alternative models have failed. One must accept that any model must form
>> a self consistent mathematical whole. One cannot accept parts of it and
>> then accept other things which are mathematically inconsistent. It seems
>> to me that any scientific model must satisfy this criterion even before
>> one applies observational tests. In that regard, a big bang obeying
>> general relativity has no serious competitors. One can hardly say this
>> is not the result of rigorous scientific quantitative testing.
>>
>I have a lot of confidence in GR as a highly accurate theory of
>gravitation.

Then it should be taken as highly accurate at least from the time of big
bang nucleosynthesis.

> I think the Big Bang paradigm has been a good
>approximation that has carried us forward a long way. However it has
>been getting a bit obese lately with all the epicycles that have had to
>be added to it, such as inflation, CDM, dark energy, contortions needed
>to explain the CMB anomalies, endless adjusting to fit observations,
>etc.

I agree with you about add-ons like inflation, CDM and dark energy, and
that the CMB may not be fully described by our current model. However
these are not properties of the big bang. They are properties of one big
bang model, the currently favoured one. The teleconnection also yields a
big bang model, but it dispenses with inflation, CDM and dark energy. It
may or may not resolve CMB anomalies - the analysis is beyond me just now.

>If the Big Bang paradigm gets the dark matter question wrong, and I
>mean wrong in a BIG way,

The big bang on its own has nothing to say about dark matter. A big bang
is a theoretical prediction of a class of theories, not the name of the
currently accepted standard model. That is called the Concordance model,
not the big bang.

>then we will know that it is time to look at
>the universe with an open mind and consider radical alternatives that
>preserve what is good about the BB and cast out the Ptolemaic
>epicycles.

Certainly we should be looking at alternative cosmologies, and indeed
theorists do look at alternative cosmologies. Even without studying or
interpreting observational evidence the pursuit of a unified model of
quantum gravity is something of a holy grail, and it occupies a huge
amount of theoretical research. I happen to think most of that is
misguided - I don't think anything can come of string theory for
example, and I think loop quantum gravity is little more than a toy
model, only some of which is useful. I also think the add-ons are an
indication that there is a fundamental fault in physical theory. So long
as we have no unified theory, we know there is a fundamental fault
anyway. But when that fault is fixed and accepted, we will still have a
big bang.

Martin Hardcastle

unread,
Oct 22, 2006, 4:25:17 AM10/22/06
to
In article <mt2.0-30336...@hercules.herts.ac.uk>,

rlold...@amherst.edu <rlold...@amherst.edu> wrote:
>Please show me at least one definitive prediction, by which we might
>put the Big Bang paradigm to a rigorous, quantitative scientific test.

Since the Big Bang model just says that the universe was hotter and
denser in the past, the obvious predictions involve the mean density
of the universe and temperature/energy density of the CBR as a
function of redshift.
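[Moderator's illustrative sketch, not part of Martin's post: the
prediction referred to here is that in an adiabatically expanding big
bang universe the CMBR temperature scales as T(z) = T0 (1 + z), with T0
the present-day value of about 2.725 K.]

```python
# Sketch of the standard big bang prediction for the CMBR temperature
# as a function of redshift: T(z) = T0 * (1 + z).
T0 = 2.725  # K, present-day CMBR temperature (COBE/FIRAS value)

def cmbr_temperature(z):
    """Predicted CMBR temperature (K) at redshift z for adiabatic
    expansion; an in-situ measurement significantly below this value
    would falsify the model."""
    return T0 * (1.0 + z)

print(cmbr_temperature(2.34))  # ~9.1 K at z = 2.34
```

[At z = 2.34, for instance, the predicted temperature is about 9.1 K,
which is the kind of test discussed later in this thread.]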

Martin
--
Martin Hardcastle
School of Physics, Astronomy and Mathematics, University of Hertfordshire, UK
Please replace the xxx.xxx.xxx in the header with herts.ac.uk to mail me

John (Liberty) Bell

unread,
Oct 22, 2006, 4:25:43 AM10/22/06
to
Oh No wrote:

> Without a big bang we would
> be caught up in Olber's paradox.

Wrong.

Prior to Big Bang theory, the GR curvature of spacetime had been
characterised euphemistically as meaning that if you could see out all
the way, what you would see there is the back of your own head. Unless
you believe that even the back of your own head is bright enough to
illuminate the whole night sky, there would thus seem to be no Olbers'
paradox in GR theory, with or without a Big Bang.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 22, 2006, 4:25:59 AM10/22/06
to

You are seriously misunderstanding or misinterpreting something here.
No serious cosmologist has ever claimed more homogeneity than that the
average density in volumes the size of several galaxy clusters is
roughly the same.

Of course, there are inhomogeneities in the CMB, which are directly
related to structure formation. If everything were EXACTLY homogeneous,
you wouldn't be here.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 22, 2006, 4:28:20 AM10/22/06
to

> Phillip Helbig---remove CLOTHES to reply wrote:
> > In article <mt2.0-32486...@hercules.herts.ac.uk>,
> > "rlold...@amherst.edu" <rlold...@amherst.edu> writes:
> >
> > > So, is your basic argument that the Big Bang paradigm is a
> > > one-size-fits-all model which can accommodate new observational
> > > discoveries by morphing and adding on epicycles without limits?
> >
> > Newtonian gravity allows a planet at any distance from the sun, unlike
> > Kepler's wrong geometrical conclusions. If that is "one size fits all",
> > then so is ANY theory with a free parameter. No epicycles, though.
>
> See my reply to Joseph Lazio (#19, I think).
>
> More importantly, why are we talking about Newtonian gravitation? The
> discussion is about cosmological paradigms and which ones can (the
> discrete fractal paradigm and the teleconnection model, so far) and
> which ones cannot (apparently the BB paradigm) make definitive
> predictions.

The same remark applies to GR. The point is that it is wrong to suggest
that a good theory must "predict everything". See John Barrow's
THEORIES OF EVERYTHING for a good explanation of what a theory of
everything cannot and should not predict.

> When are you going to stop beating around the bush and present
> quantitative predictions by which we can definitively test the BB
> paradigm, or admit that it is not falsifiable.

When you clearly define what you mean by that paradigm.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 22, 2006, 1:45:41 PM10/22/06
to
In article <mt2.0-18088...@hercules.herts.ac.uk>, Oh No
<No...@charlesfrancis.wanadoo.co.uk> writes:

Normally, by the "cosmological principle" one means that the universe is
everywhere the same. This means a) that the laws are everywhere the
same and b) that (averaged over a large enough volume), the universe
LOOKS the same to observers there. The "perfect cosmological principle"
extends this to all times as well. (And makes a definite prediction:
the steady-state model. Note that in the steady-state model the LAWS
are the same for all times; it is the additional requirement that the
universe LOOKS THE SAME at all times which leads to the steady-state
model as contrasted with, say, models based on GR.)

Alf P. Steinbach

unread,
Oct 22, 2006, 1:43:42 PM10/22/06
to
* John (Liberty) Bell:

> Oh No wrote:
>
>> Without a big bang we would
>> be caught up in Olber's paradox.
>
> Wrong.
>
> Prior to Big Bang theory, the GR curvature of spacetime had been
> characrterised euphamisticlly as meaning that if you could see out all
> the way, what you would see there is the back of your own head. Unless
> you believe that even the back of your own head is bright enough to
> illuminate the whole night sky, there would thus seem to be no Olber's
> paradox in GR theory, with or without a Big Bang.

Consider, as a gedanken experiment, a solar system size non-expanding
universe with /one/ star. It would get rather hot and bright in there
after a while. Wouldn't it?

The problem with the see-back-of-head argument is that for a
sufficiently large non-expanding universe, you'd instead most likely see
a patch of the surface of a star, or something in turn heated by star's
radiation, in any chosen direction, and for an expanding universe,
there's cooling: it's the ultimate freezer.

But the non-expansion is just one of several assumptions that Olbers'
paradox relies on:

A Practically infinite "steady state" time (back in history), that for
a long enough period of time there have been stars, so that their
light can have reached us.

B A largely homogeneous distribution of galaxies at large enough
scales, not e.g. hierarchical/fractal or (possible in principle, but
contrary to observations) limited to a plane, or...

C No other possible significant escape of radiation (cooling
mechanism), e.g. conversion to matter or into other dimensions
(which is the plane from (B) but with a twist) or via
expansion combined with matter production or whatever mechanism.

Assumption (B) is perhaps the most crucial one in a Big Bang discussion:
given that at all scales so far observed the universe is hierarchical,
it's as far as I understand it only the Big Bang theory itself that
causes one to believe in homogeneity at even larger scales -- which as
an argument for Big Bang based on Olbers' paradox is circular reasoning
(given that BB is true we must have that homogeneity at large scales,
and given that, BB must be true, lest we end up with Olbers' paradox).

--
A: Because it messes up the order in which people normally read text.
Q: Why is it such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?

Oh No

unread,
Oct 22, 2006, 1:44:33 PM10/22/06
to
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>Oh No wrote:
>
>> Without a big bang we would
>> be caught up in Olber's paradox.
>
>Wrong.
>
>Prior to Big Bang theory, the GR curvature of spacetime had been
>characrterised euphamisticlly as meaning that if you could see out all
>the way, what you would see there is the back of your own head. Unless
>you believe that even the back of your own head is bright enough to
>illuminate the whole night sky, there would thus seem to be no Olber's
>paradox in GR theory, with or without a Big Bang.

Fair point. In the Einstein static universe there would be a finite
amount of radiating matter. There are two problems, though: the Einstein
static universe is contradicted by the observation of Hubble expansion,
and it turned out not to be stable, so it is not a valid model of
physics in any case.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 22, 2006, 1:50:00 PM10/22/06
to
In article <mt2.0-18088...@hercules.herts.ac.uk>, "John
(Liberty) Bell" <john...@accelerators.co.uk> writes:

> Oh No wrote:
>
> > Without a big bang we would
> > be caught up in Olber's paradox.
>
> Wrong.
>
> Prior to Big Bang theory, the GR curvature of spacetime had been
> characrterised euphamisticlly as meaning that if you could see out all
> the way, what you would see there is the back of your own head.

Or, looking a bit lower, perhaps Uranus (use American pronunciation
here). :-)

> Unless
> you believe that even the back of your own head is bright enough to
> illuminate the whole night sky, there would thus seem to be no Olber's
> paradox in GR theory, with or without a Big Bang.

This is wrong. You are in good company, though. Pascal Jordan believed
this, and as a result developed his own cosmological model. In such a
universe, every line of sight would eventually hit a star, and after a
time things would heat up to stellar temperatures.

The resolution of Olbers's paradox is that the universe is not yet old
enough for enough light to have reached us.

See Edward Harrison's excellent textbook COSMOLOGY: THE SCIENCE OF THE
UNIVERSE and its chapter "Darkness at Night".
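[A toy numerical sketch of the divergence behind the paradox, added for
illustration (Harrison's chapter gives the full treatment): in a static,
uniform universe each spherical shell contributes the same flux, so the
sky brightness grows without bound unless a finite lookback time
truncates the sum.]

```python
# Each shell at radius r with thickness dr contains n * 4*pi*r^2 * dr
# sources of luminosity L; their flux at the origin is that number
# times L / (4*pi*r^2), i.e. n * L * dr, independent of r.
def sky_flux(n_times_L, r_horizon, dr=1.0):
    """Total flux from shells out to r_horizon (arbitrary units)."""
    flux = 0.0
    r = dr
    while r <= r_horizon:
        flux += n_times_L * dr  # every shell contributes equally
        r += dr
    return flux

# The sum grows linearly with horizon distance -- no convergence
# without a cutoff such as a finite age of the universe:
print(sky_flux(1.0, 100.0), sky_flux(1.0, 200.0))
```

[Doubling the horizon doubles the sky flux; only a finite age, and hence
a finite horizon, keeps the night sky dark.]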

Jonathan Thornburg -- remove -animal to reply

unread,
Oct 22, 2006, 1:45:29 PM10/22/06
to
In article <mt2.0-20386...@hercules.herts.ac.uk>,
"rlold...@amherst.edu" <rlold...@amherst.edu> writes:
> When are you going to stop beating around the bush and present
> quantitative predictions by which we can definitively test the BB
> paradigm, or admit that it is not falsifiable.

Steve Willner pointed out


> The BB model is certainly falsifiable. Blue shifts for distant
> objects would do it, to name only the most obvious thing. [[...]]

Another example-and-a-half of an observation(s) which would falsify
the BB model would be an in-situ measurement of a CMBR temperature
significantly *less* than the present-day value (2.73K) either in
some distant (& hence old) object, or (via some as-yet-unknown
measurement technique) at some time in our own past.

That is,
(1) By careful spectroscopic observations of some distant astronomical
objects, we can infer the relative occupancies of different energy
levels of certain atoms/molecules, and hence infer properties of
the (CMBR) radiation field which (we infer) excited them.
[See, eg, Srianand, Petitjean & Ledoux, Nature 408, 931 (2000),
"The cosmic microwave background radiation temperature at
a redshift of 2.34".]
The BB model predicts that this temperature must be *higher*
than the present-day CMBR temperature; if the observations were to
come out *lower* I don't see how this could be reconciled with the
BB model.
(2) By laboratory measurements of isotope abundances in Uranium ores,
we infer the operation of a natural nuclear reactor
http://en.wikipedia.org/wiki/Natural_nuclear_fission_reactor
around 1.5e9 years ago near Oklo, Gabon. So far as I know the CMBR
temperature at the time didn't significantly affect the Oklo reactor,
so there's no way to use the Oklo data to study the time variation
of the CMBR temperature at our location in the universe.

However, if some (as-yet-undiscovered, or at least unknown-to-me-now)
technique were to be developed to allow this time variation to be
measured, the BB model predicts that the past temperature must be
*higher* than the present-day CMBR temperature; I don't see how a
*lower* measurement could be reconciled with the BB model.

[Just to be clear, the conceptual difference between (1) and (2) is that
(1) is measuring the CMBR temperature in the past at a location in the
universe distant from ours, while (2) is (imagining) measuring the CMBR
temperature in the past at *our* location in the universe.]

ciao,

--
-- "Jonathan Thornburg -- remove -animal to reply" <jth...@aei.mpg-zebra.de>
Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut),
Golm, Germany, "Old Europe" http://www.aei.mpg.de/~jthorn/home.html
"Washing one's hands of the conflict between the powerful and the
powerless means to side with the powerful, not to be neutral."
-- quote by Freire / poster by Oxfam

rlold...@amherst.edu

unread,
Oct 22, 2006, 1:53:59 PM10/22/06
to
George Dishman wrote:
> Of course it is falsifiable. If there were no CMBR, the
> model would be in serious trouble and if the universe
> were 99.9% hydrogen then our model of nucleogenesis
> would be falsified.

I mean falsified in a future definitive test. Predictions must be made
prior to observations in a definitive test. What sense does it make to
say "if there were no CMBR..." and "if the universe were 99.9%
hydrogen"? Think of moving forward. How can we test the BB paradigm
vs the discrete fractal paradigm NOW?


>
> For recent specific predictions, note that the angular
> power spectrum of the CMBR was predicted before WMAP
> using GR both with a cosmological constant and without
> but including quintessence instead as the nature of
> dark energy. the subsequent results are a better fit to
> the cosmological constant than quintessence so GR made
> a prediction and passed the test.

This is mostly model-building, i.e., adding cosmological constants,
dark energy, quintessence, etc. Want to throw in a few magnetic
monopoles? The details of the CMB seem to change regularly, as does the
one-size-fits-all "precision cosmology".

Robert

rlold...@amherst.edu

unread,
Oct 22, 2006, 1:54:43 PM10/22/06
to
George Dishman wrote:
> The problem with nucleogenesis is that the theory made
> a definitive prediction down to one free parameter, the
> baryon density. However, there are a number of abundance
> ratios that can be measured so if any one is determined
> from observation then the remainder become predictions.
>
> http://www.astro.ucla.edu/~wright/BBNS.html
>
> "A single value of the baryon density fits 4 abundances
> simultaneously. The fit is good but not perfect."

I have seen people claim that the hard-won agreement between
nucleosynthesis data and theory nearly proves the standard model must
be totally right. I have also seen other papers that make it seem more
like an approximate "fit", with significant anomalies, even after
hundreds of people over decades have tinkered, and tinkered, and
tinkered.

Successfully predict the true nature of the dark matter and that will
be more impressive. In fact it would be crystal clear.

Robert

rlold...@amherst.edu

unread,
Oct 22, 2006, 1:55:05 PM10/22/06
to
Steve Willner wrote:
> The BB model is certainly falsifiable. Blue shifts for distant
> objects would do it, to name only the most obvious thing. For
> "future measurements" -- if you think those are somehow more
> important than existing ones -- a few years ago, we would have
> pointed to the power spectrum of microwave background fluctuations,
> which WMAP has now measured. Now I think we might point to the
> polarization power spectrum. Can anyone provide an update on what
> WMAP has seen and what it is likely to see in the rest of its
> mission?

I fear that no matter what the properties of the CMB turn out to be:
anomalous quadrupole or octopole properties, too much power on scale x,
too little power on scale y, strong polarization, weak polarization,
etc., the standard "precision cosmology" model is so plastic that it
will be modified to "fit" the data. After a barely decent interval,
proponents will then say, "See, just what we predicted!".

This is the danger of model-building without definitive tests. It is
virtually impossible to falsify and important new ideas get ignored
because the "standard" model is widely regarded as the only reasonable
path forward.

Robert

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 22, 2006, 5:00:27 PM10/22/06
to
In article <mt2.0-18097...@hercules.herts.ac.uk>, "Alf P.
Steinbach" <al...@start.no> writes:

> * John (Liberty) Bell:
> > Oh No wrote:
> >
> >> Without a big bang we would
> >> be caught up in Olber's paradox.
> >
> > Wrong.
> >
> > Prior to Big Bang theory, the GR curvature of spacetime had been
> > characrterised euphamisticlly as meaning that if you could see out all
> > the way, what you would see there is the back of your own head. Unless
> > you believe that even the back of your own head is bright enough to
> > illuminate the whole night sky, there would thus seem to be no Olber's
> > paradox in GR theory, with or without a Big Bang.
>
> Consider, as a gedanken experiment, a solar system size non-expanding
> universe with /one/ star. It would get rather hot and bright in there
> after a while. Wouldn't it?

Indeed.

> The problem with the see-back-of-head argument is that for a
> sufficiently large non-expanding universe, you'd instead most likely see
> a patch of the surface of a star, or something in turn heated by star's
> radiation, in any chosen direction,

Right.

> and for an expanding universe,
> there's cooling: it's the ultimate freezer.

Yes, but this expansion effect is not enough in our universe to solve
the paradox.

> But the non-expansion is just one of several assumptions that Olber's
> paradox relies on:
>
> A Practically infinite "steady state" time (back in history), that for
> a long enough period of time there have been stars, so that their
> light can have reached us.

This is how it is solved in our universe.

> B A largely homogenous distribution of galaxies at large enough
> scales, not e.g. hierarchical/fractal or (just in principle, is
> contrary to observations) limited to a plane, or...

Right---possible solution in principle, observations are against it.

> Assumption (B) is perhaps the most crucial one in a Big Bang discussion:
> given that at all scales so far observed the universe is hierarchical,

This is simply not true. At scales larger than galaxy clusters (still
much smaller than the observable universe), one DOES have homogeneity.

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 22, 2006, 5:00:55 PM10/22/06
to
In article <mt2.0-18097...@hercules.herts.ac.uk>, Oh No
<No...@charlesfrancis.wanadoo.co.uk> writes:

> In the Einstein static universe there would be a finite
> amount of radiating matter. The two problems being that the Einstein
> static universe is contradicted by the observation of Hubble expansion,
> and that it turned out not to be stable, and is not a valid model of
> physics for that reason.

It's ruled out by observation; we agree on that. I think it is
interesting that the Einstein-de Sitter universe is ALSO an unstable
fixed point, and no-one used that as an argument against it. (Though
that argument, in a roundabout way, WAS used to justify inflation,
saying that because of this the universe can't be just near Einstein-de
Sitter, but must be VERY near it. I don't buy this argument, but that
has been discussed here at length.)

Oh No

unread,
Oct 23, 2006, 7:28:36 AM10/23/06
to
Thus spake Phillip Helbig---remove CLOTHES to reply <hel...@astro.multiC
LOTHESvax.de>
>In article <mt2.0-18088...@hercules.herts.ac.uk>, Oh No

Thanks for pointing this out. I confess it strikes me as a little odd.
Really these are two distinct principles, and not only that but they are
principles of quite a different character: a) is a statement about
fundamental physical law, and b) is a statement about matter
distribution, normally summed up as homogeneity and isotropy.

>The "perfect cosmological principle"
>extends this to all times as well. (And makes a definite prediction:
>the steady-state model. Note that in the steady-state model the LAWS
>are the same for all times; it is the additional requirement that the
>universe LOOKS THE SAME at all times which leads to the steady-state
>model as contrasted with, say, models based on GR.)

I am not so interested in principles which I don't think are properties
of nature. The principle I want to express is that the fundamental
behaviour of matter is always and everywhere the same. From this I infer
the principle of general relativity: local laws of physics are the
same irrespective of the observer.

The former principle makes no mention of an observer, so I have not
thought it right to call it an expression of the general principle of
relativity, but as a fundamental principle from which it is possible to
formulate physical law it seems to me it ought to have a name. Any
ideas? Perhaps it is just an expression of the general principle of
relativity.

rlold...@amherst.edu

unread,
Oct 23, 2006, 8:08:47 AM10/23/06
to
Phillip Helbig---remove CLOTHES to reply wrote:

> Of course, there are inhomogeneities in the CMB, which are directly
> related to structure formation. If everything were EXACTLY homogeneous,

> I wouldn't be here.

So, perhaps authors need to be careful with their terminology. When
the unqualified term "homogeneity" is repeated thousands of times, the
less discerning members of our community, as well as the general
public, could actually begin to believe in such over-idealizations.

rlold...@amherst.edu

unread,
Oct 23, 2006, 8:06:47 AM10/23/06
to
Martin Hardcastle wrote:

>
> Since the Big Bang model just says that the universe was hotter and
> denser in the past, the obvious predictions involve the mean density
> of the universe and temperature/energy density of the CBR as a
> function of redshift.
>

I generally agree with these reasonable statements about what has been
demonstrated with the Big Bang model. I also agree that these
conclusions have undergone significant scientific testing, and have
passed those early tests.

Rob

rlold...@amherst.edu

unread,
Oct 23, 2006, 8:17:21 AM10/23/06
to
Phillip Helbig---remove CLOTHES to reply wrote:

> The same remark applies to GR. The point is that it is wrong to suggest
> that a good theory must "predict everything".

Well, then how about you just predict "something".

> When you clearly define by what you mean by that paradigm.

Define the BB paradigm any way you prefer, and then use your model to
make a new prediction that is prior, quantitative, testable and
non-adjustable.

rlold...@amherst.edu

unread,
Oct 23, 2006, 8:20:25 AM10/23/06
to
Phillip Helbig---remove CLOTHES to reply wrote:

> > Assumption (B) is perhaps the most crucial one in a Big Bang discussion:
> > given that at all scales so far observed the universe is hierarchical,
>
> This is simply not true. At scales larger than galaxy clusters (still
> much smaller than the observable universe), one DOES have homogeneity.

Well, there you go again using the unqualified H-word!

Why not say: "have homogeneity at the x level", or "have approximate
homogeneity", or perhaps best: "appear to have statistical homogeneity
over the x to y range of scales"?

Such differences in science are not mere semantics!

rlold...@amherst.edu

unread,
Oct 23, 2006, 8:41:46 AM10/23/06
to
Phillip Helbig---remove CLOTHES to reply wrote:

> Or, looking a bit lower, perhaps Uranus (use American pronunciation
> here). :-)

To paraphrase Pauli: 'Not even funny'.

> The resolution of Olbers's paradox is that the universe is not yet old
> enough for enough light to have reached us.

There is more than one way to avoid Olbers' paradox. Let us not forget
that.

George Dishman

unread,
Oct 24, 2006, 3:50:07 AM10/24/06
to
<rlold...@amherst.edu> wrote in message
news:mt2.0-22445...@hercules.herts.ac.uk...

If N is the number of galaxies within a cube of
side length l:

dN/<N> = 0.5 +/- 0.1 at a box width l = 30h^-1 Mpc

where dN is the standard deviation of N and <N> is
the mean of N.

Saunders et al 1991 and Efstathiou 1991

Quoted from Peebles, "Principles of Physical Cosmology",
eqn 3.24.

I have sometimes wondered how dN/<N> would vary
as a function of l and I guess I should know but
it's too long since I did any statistics. Anyone
care to put me out of my misery?

George

geo...@briar.demon.co.uk

unread,
Oct 24, 2006, 5:02:46 AM10/24/06
to
I think this is getting outside the charter for the
group so I've cross-posted and set follow-ups
to sci.astro.

rlold...@amherst.edu wrote:
> George Dishman wrote:
> > Of course it is falsifiable. If there were no CMBR, the
> > model would be in serious trouble and if the universe
> > were 99.9% hydrogen then our model of nucleogenesis
> > would be falsified.
>
> I mean falsified in a future definitive test.

By that approach if I make a prediction today
and it is borne out by measurement tomorrow
that is great, but the day after it doesn't count
because it is then in the past. The big bang
model has made many predictions that have
been confirmed, some of which have been
mentioned in these threads. The fact that they
were successfully done some time ago doesn't
diminish their significance.

> Predictions must be made
> prior to observations in a definitive test. What sense does it make to
> say "if there were no CMBR..." and "if the universe were 99.9%
> hydrogen"?

The first makes sense because the prediction _was_
made prior to the measurement; others were already
looking when Penzias and Wilson got lucky. The
second makes sense because the prediction is
not based on a free parameter that can be adjusted
to get that value while still meeting other observations.
The term "prediction" doesn't just refer to whether the
value was calculated prior to its measurement but can
also mean that it is derivable without fitting.

> Think of moving forward. How can we test the BB paradigm
> vs the discrete fractal paradigm NOW?

I already answered that, the most obvious test to do
is the angular power spectrum. Where is the fractal
prediction? You complain about the big bang model
not making predictions yet you ignore the fact that
you cannot provide that key prediction from your
theory.

However, let me correct another point. You don't test
one theory against another in science, you test each
against observation. Your theory suggests _all_ dark
matter should be in MACHOs, does it not? If so then
the microlensing evidence is strongly against it.

> > For recent specific predictions, note that the angular
> > power spectrum of the CMBR was predicted before WMAP
> > using GR both with a cosmological constant and without
> > but including quintessence instead as the nature of
> > dark energy. the subsequent results are a better fit to
> > the cosmological constant than quintessence so GR made
> > a prediction and passed the test.
>

> This is mostly model-building, ...

Nope, the predictions were made before the results
were known, not fitted retrospectively. GR passed
the test, you have not yet even provided a prediction.

George

geo...@briar.demon.co.uk

unread,
Oct 24, 2006, 10:04:57 AM10/24/06
to
George Dishman wrote:
....

> If N is the number of galaxies within a cube of
> side length l:
>
> dN/<N> = 0.5 +/- 0.1 at a box width l = 30h^-1 Mpc
>
> where dN is the standard deviation of N and <N> is
> the mean of N.
>
> Saunders et al 1991 and Efstathiou 1991
>
> Quoted from Peebles, "Principles of Physical Cosmology",
> eqn 3.24.
>
> I have sometimes wondered how dN/<N> would vary
> as a function of l and I guess I should know but
> it's too long since I did any statistics. Anyone
> care to put me out of my misery?

I think the answer to that should be

dN/<N> = k / l^(3/2)

for a homogeneous universe, but corrections will be
appreciated if I'm wrong.
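[A sketch, added for illustration, of why that guess holds for pure
Poisson statistics: counts in a box satisfy dN = sqrt(<N>), and
<N> = n*l^3, so dN/<N> = (n*l^3)^(-1/2), i.e. proportional to l^(-3/2).
The number density n below (~0.01 bright galaxies per cubic h^-1 Mpc)
is an assumed, illustrative value, not a fitted one.]

```python
import math

# Pure Poisson counts: dN/<N> = 1/sqrt(<N>) with <N> = n * l**3,
# where n is the mean galaxy number density and l the box side.
def poisson_fluctuation(n, l):
    """Fractional count fluctuation dN/<N> for box side l."""
    return 1.0 / math.sqrt(n * l**3)

# Scaling check: doubling l shrinks dN/<N> by 2**1.5 ~ 2.83,
# i.e. dN/<N> ~ l^(-3/2) for a homogeneous universe.
ratio = poisson_fluctuation(0.01, 30.0) / poisson_fluctuation(0.01, 60.0)
print(ratio)
```

[On that assumed density the Poisson floor at l = 30 h^-1 Mpc would be
roughly 0.06, well below the observed 0.5 quoted above; the excess is
the galaxy clustering signal.]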

I wonder what Rob's fractal theory predicts.

George

John (Liberty) Bell

unread,
Oct 27, 2006, 4:15:29 AM10/27/06
to
Alf P. Steinbach wrote:
> * John (Liberty) Bell:
> > Oh No wrote:
> >
> >> Without a big bang we would
> >> be caught up in Olber's paradox.
> >
> > Wrong.
> >
> > Prior to Big Bang theory, the GR curvature of spacetime had been
> > characrterised euphamisticlly as meaning that if you could see out all
> > the way, what you would see there is the back of your own head. Unless
> > you believe that even the back of your own head is bright enough to
> > illuminate the whole night sky, there would thus seem to be no Olber's
> > paradox in GR theory, with or without a Big Bang.

One important feature of this back of head characterisation is that it
implies a maximum lookback time of one trip across the visible
universe. This feature was reproduced in the Big Bang version of GR,
and in Phillip Helbig's alternative (comedy) steady state
interpretation.

(Main differences being, in the first case the short cut route between
original source and final destination traverses your brain, in the
second case, unknown, and in the third case, to stretch a pun, ur
GUT ;)

> Consider, as a gedanken experiment, a solar system size non-expanding
> universe with /one/ star.

Shall we say, for sake of argument, of ~ 72 AU extent?

> It would get rather hot and bright in there
> after a while. Wouldn't it?

Not necessarily, since the visible universe would then have a maximum
duration of ~ 12 hours.

Although there could arguably be a paradox if we attempt to probe
beyond that point, the same is true in the BB version, at time zero.

> The problem with the see-back-of-head argument is that for a
> sufficiently large non-expanding universe, you'd instead most likely see
> a patch of the surface of a star,

Not likely. If the traced back light beams did not converge again to
the same point you started at, they would most likely converge at a
point in empty space or, still more likely, a fuzzy ill defined volume
of spacetime, the size of a quantum fluctuation. (Sound familiar, in
the context of BBT?)

> and for an expanding universe,
> there's cooling: it's the ultimate freezer.

And the ultimate paradox (See eg start of Ch 44, MTW)

Here's hoping that this follow-up response generates equivalent
interest (and amusement)

John Bell
http://global.accelerators.co.uk

Phillip Helbig---remove CLOTHES to reply

unread,
Oct 27, 2006, 12:08:08 PM10/27/06
to
In article <mt2.0-5784...@hercules.herts.ac.uk>, "John (Liberty)
Bell" <john...@accelerators.co.uk> writes:

> > > Prior to Big Bang theory, the GR curvature of spacetime had been
> > > characrterised euphamisticlly as meaning that if you could see out all
> > > the way, what you would see there is the back of your own head. Unless
> > > you believe that even the back of your own head is bright enough to
> > > illuminate the whole night sky, there would thus seem to be no Olber's
> > > paradox in GR theory, with or without a Big Bang.
>
> One important feature of this back of head characterisation is that it
> implies a maximum lookback time of one trip across the visible
> universe. This feature was reproduced in the Big Bang version of GR,
> and in Philip Helbig's alternative (comedy) steady state
> interpretation.

If I recall correctly, in order for light to be able to circumnavigate
the universe, the cosmological constant has to be sufficiently positive.
Otherwise, the universe will either recollapse before there is time
(zero or negative cosmological constant) or expand too quickly (positive
but not large enough cosmological constant).

rlold...@amherst.edu

unread,
Oct 27, 2006, 2:38:57 PM10/27/06
to
Phillip Helbig---remove CLOTHES to reply wrote:

Sometimes I wonder if Olbers' Paradox has led to more clarification or
more confusion in cosmology. Maybe it is a poorly posed paradox with
several possible solutions, and in the final analysis not very useful.
I could be totally wrong in this largely intuitive opinion, but I throw
it out there anyway.

Of more possible interest is the recent paper with M.S. Turner as one
of the authors in ApJ, 649, 563-569, 2006. The Hubble Bubble appears
to have inflated, decelerated, accelerated and is now in low or
non-accelerating phase. None of this slow-motion, smooth expansion,
which has always reminded me of Kurt Vonnegut's Grand AHHWWOOOOOMMMM!
in Cat's Cradle.

No, it looks like a very much more turbulent global expansion, and one
that is more in keeping with the plasma-like filament/void/sheet
structure seen in SN remnants and in the large-scale distribution of
actual galaxies and galactic clusters. The paper cited above is an
interesting, and possibly important, paper.

Robert L. Oldershaw

John (Liberty) Bell

Oct 29, 2006, 5:05:43 AM
Phillip Helbig---remove CLOTHES to reply wrote:
> In article <mt2.0-5784...@hercules.herts.ac.uk>, "John (Liberty)
> Bell" <john...@accelerators.co.uk> writes:
>
> > > > Prior to Big Bang theory, the GR curvature of spacetime had been
> > > > characterised euphemistically as meaning that if you could see out all
> > > > the way, what you would see there is the back of your own head. Unless
> > > > you believe that even the back of your own head is bright enough to
> > > > illuminate the whole night sky, there would thus seem to be no Olbers'
> > > > paradox in GR theory, with or without a Big Bang.
> >
> > One important feature of this back of head characterisation is that it
> > implies a maximum lookback time of one trip across the visible
> > universe. This feature was reproduced in the Big Bang version of GR,
> > and in Philip Helbig's alternative (comedy) steady state
> > interpretation.
>
> If I recall correctly, in order for light to be able to circumnavigate
> the universe, the cosmological constant has to be sufficiently positive.
> Otherwise, the universe will either recollapse before there is time
> (zero or negative cosmological constant) or expand too quickly (positive
> but not large enough cosmological constant).

I am no expert on the CC, since it had already been discredited, by no
less an authority than Einstein, well before I got out of short
trousers. However, it seems to me, at an intuitive level, that this
should have read: (positive but too large cosmological constant).

If I recall correctly, in the pre-BB model, the CC had to be "just
right" to prevent collapse or expansion, a bit like baby's porridge in
the story of Goldilocks and the three bears. Now that it has been
resurrected, it seems the porridge has to be "just right" again, this
time to produce a credible level of accelerating expansion.

JB

Phillip Helbig---remove CLOTHES to reply

Oct 29, 2006, 1:16:50 PM
In article <mt2.0-14405...@hercules.herts.ac.uk>, "John
(Liberty) Bell" <john...@accelerators.co.uk> writes:

> > If I recall correctly, in order for light to be able to circumnavigate
> > the universe, the cosmological constant has to be sufficiently positive.
> > Otherwise, the universe will either recollapse before there is time
> > (zero or negative cosmological constant) or expand too quickly (positive
> > but not large enough cosmological constant).
>
> I am no expert on the CC, since it had already been discredited, by no
> less an authority than Einstein, well before I got out of short
> trousers. However, it seems to me, at an intuitive level, that this
> should have read: (positive but too large cosmological constant).

Actually, if it is too large it will expand too quickly, as you say,
but also if it is too small the universe will just recollapse as if it
were 0 or negative. It has to be near a critical value for this
circumnavigation to take place.

Discredited? Not really. Einstein thought it unnecessary after it was
discovered that the universe is not static, and perhaps wished he could
have predicted the expansion (or contraction). Mathematically or
physically, though, Einstein did nothing to discredit it.

Although Einstein wrote a lot of stuff, much of which is preserved, it
is interesting that the "biggest blunder" comment is known ONLY through
an anecdote from Gamow (who was something of a jokester and certainly
not above embellishing things), although if it really was the biggest
blunder one could think Einstein would have mentioned this more often.

His later rejection of the cosmological constant is often cited as a
reason to discredit it. However, most of the stuff Einstein worked on
in his last 40 years essentially led to nothing. If he was wrong there,
he could be wrong about the cosmological constant.

> If I recall correctly, in the pre- BB model, the CC had to be "just
> right" to prevent collapse or expansion, a bit like baby's porridge in
> the story of Goldilocks and the three bears.

Yes. However, with the Einstein-de Sitter model, it also has to be
"just right". A model near, but not exactly, at this point will move
farther and farther away, just as is the case with the Einstein
universe. Of course, the Einstein universe is ruled out because it is
static, but this additional argument of instability was, as far as I
know, never used to rule out the Einstein-de Sitter model.

> Now that it has been
> resurrected, it seems the porridge has to be "just right" again, this
> time to produce a credible level of accelerating expansion.

True to some extent, but there is an obvious selection effect here: too
small (i.e. negative) and the universe recollapses quickly, too large
and it expands before structure can form. So this might be just a case
of the trivial weak anthropic principle at work.

Hans Aberg

Oct 30, 2006, 3:37:24 AM
In article <mt2.0-766-...@hercules.herts.ac.uk>,
hel...@astro.multiCLOTHESvax.de (Phillip Helbig---remove CLOTHES to reply)
wrote:

> [The cosmological constant]

> Discredited? Not really. Einstein thought it unnecessary after it was
> discovered that the universe is not static, and perhaps wished he could
> have predicted the expansion (or contraction). Mathematically or
> physically, though, Einstein did nothing to discredit it.

A funny thing with it, though, is that, unlike the other components of the
Einstein-Hilbert equation, it does not come from a Lagrangian.

--
Hans Aberg

Oh No

Oct 30, 2006, 3:37:08 AM
Thus spake Phillip Helbig---remove CLOTHES to reply <hel...@astro.multiC
LOTHESvax.de>

>In article <mt2.0-14405...@hercules.herts.ac.uk>, "John
>(Liberty) Bell" <john...@accelerators.co.uk> writes:
>
>> I am no expert on the CC, since it had already been discredited,

>


>Discredited? Not really. Einstein thought it unnecessary after it was
>discovered that the universe is not static, and perhaps wished he could
>have predicted the expansion (or contraction). Mathematically or
>physically, though, Einstein did nothing to discredit it.

Otoh no one has ever produced a physical rationale with which to justify
it. Of course many don't think that is scientifically necessary, but I
do.


>
>Although Einstein wrote a lot of stuff, much of which is preserved, it
>is interesting that the "biggest blunder" comment is known ONLY through
>an anecdote from Gamow (who was something of a jokester and certainly
>not above embellishing things), although if it really was the biggest
>blunder one could think Einstein would have mentioned this more often.
>
>His later rejection of the cosmological constant is often cited as a
>reason to discredit it. However, most of the stuff Einstein worked on
>in his last 40 years essentially led to nothing. If he was wrong there,
>he could be wrong about the cosmological constant.

I don't think he was wrong there. He was working on teleparallelism.
Einstein used deep philosophical insight as the basis for his theories,
and the fact that most physicists did not follow him is no evidence that
he was wrong. It is not possible physically or philosophically to
justify the affine connection. Unfortunately Einstein had not got
quantum theory on board, and quantum electrodynamics was only fully
developed at the end of his life - and even then with horrible
difficulties. Unification has to be with qed, not with classical
electrodynamics as he was attempting. Nonetheless, if teleparallelism is
an important part of unification, then I think Einstein's work and
insights in his later life will be vindicated, and those who said he
lost the thread will be shown up for what they were.


>
>> If I recall correctly, in the pre- BB model, the CC had to be "just
>> right" to prevent collapse or expansion, a bit like baby's porridge in
>> the story of Goldilocks and the three bears.
>
>Yes. However, with the Einstein-de Sitter model, it also has to be
>"just right". A model near, but not exactly, at this point will move
>farther and farther away, just as is the case with the Einstein
>universe. Of course, the Einstein universe is ruled out because it is
>static, but this additional argument of instability was, as far as I
>know, never used to rule out the Einstein-de Sitter model.

I think Ned Wright mentions it on his cosmology tutorial. But I am not
sure that, on its own, this is enough to rule out the Einstein-de Sitter
model. To do that there has to be an additional causality assumption,
that the past causes the future. This is rather difficult for a space-
time structure which presents all four dimensions together, and also
rather difficult given that we haven't integrated it with quantum
indeterminism.


>
>> Now that it has been
>> resurrected, it seems the porridge has to be "just right" again, this
>> time to produce a credible level of accelerating expansion.


Not really. The level of accelerating expansion is experimentally
determined. Of course one does require a universe of a particular age,
both for BBN to work correctly, and to avoid serious timescale problems.
The rate of expansion does have to be right to create a universe in
which we can exist and observe it.

John (Liberty) Bell

Oct 30, 2006, 3:38:04 AM
Phillip Helbig---remove CLOTHES to reply wrote:
> In article <mt2.0-14405...@hercules.herts.ac.uk>, "John
> (Liberty) Bell" <john...@accelerators.co.uk> writes:

[re. cosmological constant]

> Actually, if it is too large it will expand too quickly, as you say,
> but also if it is too small the universe will just recollapse as if it
> were 0 or negative. It has to be near a critical value for this
> circumnavigation to take place.

Yes, that is precisely right.

> Discredited? Not really. Einstein thought it unnecessary after it was
> discovered that the universe is not static, and perhaps wished he could
> have predicted the expansion (or contraction).

According to MTW he did, originally, and was so distressed by this that
the CC was then invented to prevent this happening. Referring to p 410
- 411 of that weighty tome, GRAVITATION, MTW list reasons why possible
mathematical adaptations of the theory were inappropriate, using words
like "anguish" and "damage", and concluding: "Nevertheless, these
consequences were less painful to Einstein than a dynamic universe".

Thus the originally derived equation G = 8piT
was changed to G + lambda.g = 8piT
(and then promptly changed back again to its original form, following
Hubble's discovery.)

Now that we are back to G + lambda.g = 8piT again, I would be
extremely grateful if someone could clarify, in simple language,
precisely what that now means physically, in terms of Hubble's
constant.

Is the Hubble constant we observe locally now, the same as the Hubble's
constant that an observer living at z = 0.5 should observe locally, and
the same as the Hubble's constant that an observer living at z =1
should observe locally, etc? Or smaller ? Or larger?

> Although Einstein wrote a lot of stuff, much of which is preserved, it
> is interesting that the "biggest blunder" comment is known ONLY through
> an anecdote from Gamow (who was something of a jokester and certainly
> not above embellishing things),

Certainly, that is the only reference provided by MTW too, whereas most
of his quotes are traced back to Schilpp.

> although if it really was the biggest
> blunder one could think Einstein would have mentioned this more often.

On the other hand, one could equally well think he would want to keep
this embarrassment swept under the carpet, in public. (Depends, I guess,
on whether he was a riot at parties for poking fun at himself, or not)

> > Now that it has been
> > resurrected, it seems the porridge has to be "just right" again, this
> > time to produce a credible level of accelerating expansion.
>

> True to some extent, but there is an obvious selection effect here: too
> small (i.e. negative) and the universe recollapses quickly, too large
> and it expands before structure can form. So this might be just a case
> of the trivial weak anthropic principle at work.

Alternatively, you could justify its preservation / resurrection in
cosmology, as
1) the weak anthropic selection effect you describe,
2) an 'adjustment' to accommodate quantum vacuum energy effects as
described for motivation on p 411 of MTW, or
3) a first order correction to allow the EFE to more closely approximate
a yet to be published unified field equation.

Aesthetically speaking I would prefer option 3, since it holds out the
hope that reality can be finally understood via derivation from a
rigorous field equation, instead of reverse engineered from
accumulative astronomical data. However, that is, presumably, a matter
of personal taste.

John Bell

Oh No

Oct 30, 2006, 9:42:15 AM
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>
>Now that we are back to G + lambda.g = 8piT again, I would be
>extremely grateful if someone could clarify, in simple language,
>precisely what that now means physically, in terms of Hubble's
>constant.
>
>Is the Hubble constant we observe locally now, the same as the Hubble's
>constant that an observer living at z = 0.5 should observe locally, and
>the same as the Hubble's constant that an observer living at z =1
>should observe locally, etc? Or smaller ? Or larger?

No. Hubble's constant is a measure of the current rate of expansion. It
may be taken as constant in our era, but is not constant over the life
of the universe. A closed, no lambda model has Hubble's constant
decreasing monotonically from its original value at the big bang, and
turning negative at maximum expansion leading to a big crunch. The
currently favoured concordance (Lambda~0.7) model has Hubble's constant
initially decreasing, but then starting to increase (accelerating
expansion).
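For a concrete feel for how this runs, here is a minimal numerical sketch (mine, not from the thread) of the Hubble parameter against redshift, from the Friedmann equation for a flat matter + Lambda universe with assumed illustrative values H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7:

```python
import math

# Friedmann equation for a flat matter + Lambda universe:
#   H(z) = H0 * sqrt(Omega_m * (1 + z)^3 + Omega_Lambda)
# All parameter values below are assumed for illustration.
H0 = 70.0            # Hubble constant today, km/s/Mpc (assumed)
OMEGA_M = 0.3        # matter density parameter today (assumed)
OMEGA_LAMBDA = 0.7   # lambda density parameter today (assumed)

def hubble(z):
    """Hubble parameter at redshift z, in km/s/Mpc."""
    return H0 * math.sqrt(OMEGA_M * (1.0 + z) ** 3 + OMEGA_LAMBDA)

for z in (0.0, 0.5, 1.0):
    print(f"H(z = {z}) = {hubble(z):5.1f} km/s/Mpc")
```

On these assumptions H(z = 1) comes out around 123 km/s/Mpc: an observer at z = 1 measures a larger local Hubble parameter than we do, so H itself falls with time even while adot grows.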


>
>Alternatively, you could justify its preservation / resurrection in
>cosmology, as
>2) an 'adjustment' to accommodate quantum vacuum energy effects as
>described for motivation on p 411 of MTW,

Yes, but this is poor motivation. These quantum vacuum effects are shown
as unconnected diagrams, which effectively means they do not interact
electrodynamically with the observable universe and play no part in
predictions in qed. If they did have an effect, current theory would
make it infinite, and any sort of calculation correcting that by
introducing a cut-off still gives a figure for energy which is massively
too large.

>3) a first order correction to allow EFE to more closely approximate to
>a yet to be published unified field equation.
>
>Aesthetically speaking I would prefer option 3, since it holds out the
>hope that reality can be finally understood via derivation from a
>rigorous field equation, instead of reverse engineered from
>accumulative astronomical data. However, that is, presumably, a matter
>of personal taste.

And yet you have been fairly hostile to a model which does make a minor
alteration to gtr, not actually to the EFE itself (except in so far as
astronomical data becomes consistent with Lambda=0) but to the
interpretation of cosmological redshift from which cosmological
parameters are calculated.

Phillip Helbig---remove CLOTHES to reply

Oct 31, 2006, 4:19:21 AM
In article <mt2.0-21455...@hercules.herts.ac.uk>, "John
(Liberty) Bell" <john...@accelerators.co.uk> writes:

> > Discredited? Not really. Einstein thought it unnecessary after it was
> > discovered that the universe is not static, and perhaps wished he could
> > have predicted the expansion (or contraction).
>
> According to MTW he did, originally, and was so distressed by this that
> the CC was then invented to prevent this happening. Referring to p 410
> - 411 of that weighty tome, GRAVITATION, MTW list reasons why possible
> mathematical adaptations of the theory were inappropriate, using words
> like "anguish" and "damage", and concluding: "Nevertheless, these
> consequences were less painful to Einstein than a dynamic universe".

Yes, but at the time it was believed that observations indicated that
the universe was static. It was not known that what we now think of as
local stars are not representative of the global kinematics of the
universe. Firm evidence that extragalactic systems existed was still
years in the future.

> Thus the originally derived equation G = 8piT
> was changed to G + lambda.g = 8piT
> (and then promptly changed back again to its original form, following
> Hubble's discovery.)
>
> Now that we are back to G + lambda.g = 8piT again, I would be
> extremely grateful if someone could clarify, in simple language,
> precisely what that now means physically, in terms of Hubble's
> constant.

This is a strange way to phrase the question. Hubble's constant is the
rate of change of the scale factor of the universe divided by the scale
factor, i.e. a sort of normalised expansion speed. The cosmological
constant is CONSTANT because it is constant in time (it is trivially
constant in space at a given time in a homogeneous universe) whereas the
other component determining the dynamics of the universe, the density of
matter or radiation, decreases with expansion and thus with time
(radiation decreasing more quickly, so that it is negligible today but
was dominant earlier on). For practical reasons, however, one often
likes quantities which are more directly observable, and a common one is
the cosmological constant divided by three times the square of the
Hubble constant. (This was even more important before the Hubble
constant was known accurately; since it often cancels out, one can
actually measure other important quantities without knowing it or at
least express them in terms of the Hubble constant.)
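To put a number on that last quantity, here is a sketch (not from the thread) of Omega_Lambda = Lambda c^2 / (3 H0^2), with assumed round values H0 = 70 km/s/Mpc and Lambda of order 1.1e-52 m^-2:

```python
# Omega_Lambda = Lambda * c^2 / (3 * H0^2): the cosmological constant
# divided by three times the squared Hubble constant, in consistent
# units.  Both input values below are assumed for illustration.
C = 2.998e8              # speed of light, m/s
H0 = 70.0e3 / 3.086e22   # 70 km/s/Mpc converted to 1/s (assumed)
LAMBDA = 1.1e-52         # cosmological constant, 1/m^2 (assumed)

omega_lambda = LAMBDA * C ** 2 / (3.0 * H0 ** 2)
print(f"Omega_Lambda ~ {omega_lambda:.2f}")  # a dimensionless number of order 0.7
```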

> Is the Hubble constant we observe locally now, the same as the Hubble's
> constant that an observer living at z = 0.5 should observe locally, and
> the same as the Hubble's constant that an observer living at z =1
> should observe locally, etc? Or smaller ? Or larger?

No. In general, all cosmological parameters---Hubble's constant,
density parameter etc---change with time. (There are special-case
universes where they don't.) The "physical" cosmological constant, nomen
est omen, does not change, but the "observable" one does, since it is
defined in terms of the Hubble constant. In general, the Hubble constant
(the "constant" HERE refers to the fact that it is constant everywhere)
changes with time, whatever the value of the cosmological constant is.

Phillip Helbig---remove CLOTHES to reply

Oct 31, 2006, 4:18:38 AM
In article <mt2.0-21455...@hercules.herts.ac.uk>, Oh No
<No...@charlesfrancis.wanadoo.co.uk> writes:

> >> I am no expert on the CC, since it had already been discredited,
> >
> >Discredited? Not really. Einstein thought it unnecessary after it was
> >discovered that the universe is not static, and perhaps wished he could
> >have predicted the expansion (or contraction). Mathematically or
> >physically, though, Einstein did nothing to discredit it.
>
> Otoh no one has ever produced a physical rationale with which to justify
> it. Of course many don't think that is scientifically necessary, but I
> do.

In particle physics, standard wisdom is that if nature has a degree of
freedom, she uses it. If you claim otherwise, you have found a new
conservation law, quantum number, symmetry or whatever and obviously the
burden of proof is on you to back that up. Applying the same reasoning
would make the cosmological constant the default and place the burden of
proof on those who claim that it is exactly zero.

John (Liberty) Bell

Oct 31, 2006, 4:20:55 AM
Oh No wrote:
> Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
> >
> >Now that we are back to G + lambda.g = 8piT again, I would be
> >extremely grateful if someone could clarify, in simple language,
> >precisely what that now means physically, in terms of Hubble's
> >constant.
> >
> >Is the Hubble constant we observe locally now, the same as the Hubble's
> >constant that an observer living at z = 0.5 should observe locally, and
> >the same as the Hubble's constant that an observer living at z =1
> >should observe locally, etc? Or smaller ? Or larger?
>
> No. Hubble's constant is a measure of the current rate of expansion.

Yes, I assumed everybody knew that, don't they?

> It
> may be taken as constant in our era, but is not constant over the life
> of the universe.

> A closed, no lambda model has Hubble's constant
> decreasing monotonically from its original value at the big bang, and
> turning negative at maximum expansion leading to a big crunch.

Yes, that is obvious too (isn't it?), given a basic understanding of
the classical closed universe version of GR, and my prior comments on
that subject?

> The
> currently favoured concordance (Lambda~0.7) model has Hubble's constant
> initially decreasing,

Thank you. But by how much, and when?

> but then starting to increase (accelerating
> expansion).

Thank you again, but by how much, and when?

(No further comments, as the rest of this response doesn't seem to make
too much sense)

JB

John (Liberty) Bell

Oct 31, 2006, 4:19:59 AM
rlold...@amherst.edu wrote:

> Of more possible interest is the recent paper with M.S. Turner as one
> of the authors in ApJ, 649, 563-569, 2006.

Thanks. This paper has gone a long way towards addressing my request
for additional information on CC predicted changes in acceleration, and
on concordance (or otherwise) with firm astronomical data.

John

John (Liberty) Bell

Oct 31, 2006, 6:15:41 AM
Phillip Helbig---remove CLOTHES to reply wrote:
> In article <mt2.0-21455...@hercules.herts.ac.uk>, "John
> (Liberty) Bell" <john...@accelerators.co.uk> writes:
>
> > > Discredited? Not really. Einstein thought it unnecessary after it was
> > > discovered that the universe is not static, and perhaps wished he could
> > > have predicted the expansion (or contraction).
> >
> > According to MTW he did, originally, and was so distressed by this that
> > the CC was then invented to prevent this happening. Referring to p 410
> > - 411 of that weighty tome, GRAVITATION, MTW list reasons why possible
> > mathematical adaptations of the theory were inappropriate, using words
> > like "anguish" and "damage", and concluding: "Nevertheless, these
> > consequences were less painful to Einstein than a dynamic universe".
>
> Yes, but at the time it was believed that observations indicated that
> the universe was static.

Again, I totally agree. I have already brought this up in criticism of
Oh No's assertion that this was down to prejudice. It was not. It was
down to the natural desire to provide a field equation that was
consistent with scientific knowledge of the time. The point is, the
solution in its pure form did predict such expansion, and was
consequently tinkered with (in somewhat dubious ways) to prevent this.
If Einstein had not done that, he might have been credited with genuinely
predicting the big bang. On the other hand, it may have been more
likely that the thesis would then have been rejected by the referees
(of that time).

> > Thus the originally derived equation G = 8piT
> > was changed to G + lambda.g = 8piT
> > (and then promptly changed back again to its original form, following
> > Hubble's discovery.)
> >

> > Now that we are back to G + lambda.g = 8piT again, I would be
> > extremely grateful if someone could clarify, in simple language,
> > precisely what that now means physically, in terms of Hubble's
> > constant.
>

> This is a strange way to phrase the question. Hubble's constant is the
> rate of change of the scale factor of the universe divided by the scale
> factor,

I hope you don't mind me saying your above definition is as clear as
mud to me.

In my book, Ho is the (almost locally) observed mean rate of separation
of galaxies divided by that spatial separation of galaxies. Hence the
subsequent sentence:

> > Is the Hubble constant we observe locally now, the same as the Hubble's
> > constant that an observer living at z = 0.5 should observe locally, and
> > the same as the Hubble's constant that an observer living at z =1
> > should observe locally, etc? Or smaller ? Or larger?

Obviously, in classical big bang theory, Ho constantly decreases with
time. My original question was, therefore, is the currently preferred
model 'just enough' to prevent that decrease. From Rob's reference, it
is clear the answer is it is more than enough from z = ~ 0.5 - 0.7 to
now (i.e. Ho increases with increasing time), and less than enough
beforehand (i.e. Ho then decreases with increasing time). Whether the
currently preferred model's predictions are supported by hard
astronomical evidence is a possibility that is questioned and analysed
in that paper.

John

Oh No

Oct 31, 2006, 10:43:27 AM
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>Oh No wrote:
>> Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>> >
>> >Now that we are back to G + lambda.g = 8piT again, I would be
>> >extremely grateful if someone could clarify, in simple language,
>> >precisely what that now means physically, in terms of Hubble's
>> >constant.
>> >
>> >Is the Hubble constant we observe locally now, the same as the Hubble's
>> >constant that an observer living at z = 0.5 should observe locally, and
>> >the same as the Hubble's constant that an observer living at z =1
>> >should observe locally, etc? Or smaller ? Or larger?
>>
>> No. Hubble's constant is a measure of the current rate of expansion.
>
>Yes, I assumed everybody knew that, don't they?
>
>> It
>> may be taken as constant in our era, but is not constant over the life
>> of the universe.
>
>> A closed, no lambda model has Hubble's constant
>> decreasing monotonically from its original value at the big bang, and
>> turning negative at maximum expansion leading to a big crunch.
>
>Yes, that is obvious too (isn't it?), given a basic understanding of
>the classical closed universe version of GR, and my prior comments on
>that subject?

You asked the question. I only answered the question you asked. Perhaps
you meant something else. The cosmological constant, Lambda, is taken to
be constant in most models, but they try varying that too, in a theory
called quintessence. They have tried analysing the supernova data to
test for this, but so far results are consistent with a constant Lambda.


>
>> The
>> currently favoured concordance (Lambda~0.7) model has Hubble's constant
>> initially decreasing,
>
>Thank you. But by how much, and when?

>
>> but then starting to increase (accelerating
>> expansion).
>
>Thank you again, but by how much, and when?
>

The equations are not trivial to solve. I have only seen analysis of the
general behaviour, not a quantitative solution, which I think would need
a computer. One measure of the effect is to note that if Hubble's
constant is 80 km/s/Mpc, the Hubble time, 1/H0, is around 12 Gyrs, which
would be the age of the universe assuming constant expansion. In a flat,
no lambda universe the age of the universe is 2/3 of the Hubble time, and
correspondingly less in a closed, no lambda universe. Such a universe
would give us a definite timescale problem for high redshift galaxies.
Accelerating expansion for a flat universe with Lambda=0.7 gives us an
age of around 14 Gyrs.
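The general behaviour is in fact within reach of a few lines of numerical integration. For a flat universe the age is t0 = (1/H0) * the integral from a = 0 to 1 of da / sqrt(Omega_m/a + Omega_Lambda*a^2). A sketch of mine, with assumed parameters (here H0 = 70 km/s/Mpc, not the 80 used above):

```python
import math

# Age of a flat matter + Lambda universe by midpoint-rule integration of
#   t0 = (1/H0) * Int_0^1 da / sqrt(Omega_m/a + Omega_Lambda*a^2).
# All parameter values are assumed for illustration.

def age_gyr(h0, omega_m, omega_lambda, steps=100_000):
    """Age in Gyr, for Hubble constant h0 in km/s/Mpc."""
    hubble_time = 977.8 / h0  # 1/H0 in Gyr when h0 is in km/s/Mpc
    total = 0.0
    for i in range(steps):
        a = (i + 0.5) / steps  # midpoint of each sub-interval in a
        total += 1.0 / math.sqrt(omega_m / a + omega_lambda * a * a)
    return hubble_time * total / steps

print(f"flat, no lambda (Einstein-de Sitter): {age_gyr(70, 1.0, 0.0):.1f} Gyr")
print(f"flat, Lambda = 0.7 (concordance):     {age_gyr(70, 0.3, 0.7):.1f} Gyr")
```

The first case reproduces the 2/3 Hubble time result quoted above; the second comes out at roughly 13.5 Gyr for these assumed parameters.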

John (Liberty) Bell

Oct 31, 2006, 10:43:50 AM
John Bell wrote:

> Obviously, in classical big bang theory, Ho constantly decreases with
> time. My original question was, therefore, is the currently preferred
> model 'just enough' to prevent that decrease.

> From Rob's reference, it
> is clear the answer is it is more than enough from z = ~ 0.5 - 0.7 to
> now (i.e. Ho increases with increasing time), and less than enough
> beforehand (i.e. Ho then decreases with increasing time). Whether the
> currently preferred model's predictions are supported by hard
> astronomical evidence is a possibility that is questioned and analysed
> in that paper.

Unfortunately, closer examination of
http://www.journals.uchicago.edu/ApJ/journal/issues/ApJ/v649n2/64531/64531.html
indicates that this is far less clear cut than I originally thought,
not least because such conclusions depend strongly on the assumption of
a flat universe. The conclusions therefore have no genuine generality
for examining differing conceptual models of the universe.

This also relates back to the reason I asked the question the way that
I did, since I am now again left somewhat confused by what we actually
mean by accelerating expansion. To explain my conceptual problem in a
model independent form, let me proceed purely hypothetically as
follows:

Imagine two typical domains (say early galaxies) flying apart at speed
s in the early universe. If that speed remains approximately constant,
Ho will continue to decrease with increasing time, since separation is
increasing. If, on the other hand, Ho were approximately constant with
time, these domains would have to be accelerating apart at a
substantial clip.

So, what do we actually mean by accelerating expansion? Does this mean:
(a) s is increasing with time
(b) s is increasing so rapidly with time that Ho is no longer
decreasing with time
(c) s is increasing so rapidly with time that Ho is also increasing
with time.

It looks like the answer is probably (a), which still leaves me at
least temporarily in the dark over the answer to my original question.

Can anybody bail me out here?

John

Oh No

Nov 1, 2006, 5:08:17 AM
Thus spake Phillip Helbig---remove CLOTHES to reply <hel...@astro.multiC
LOTHESvax.de>

>In article <mt2.0-21455...@hercules.herts.ac.uk>, Oh No
><No...@charlesfrancis.wanadoo.co.uk> writes:
>
>> >> I am no expert on the CC, since it had already been discredited,
>> >
>> >Discredited? Not really. Einstein thought it unnecessary after it was
>> >discovered that the universe is not static, and perhaps wished he could
>> >have predicted the expansion (or contraction). Mathematically or
>> >physically, though, Einstein did nothing to discredit it.
>>
>> Otoh no one has ever produced a physical rationale with which to justify
>> it. Of course many don't think that is scientifically necessary, but I
>> do.
>
>In particle physics, standard wisdom is that if nature has a degree of
>freedom, she uses it.

Of course that gives much motivation for string theory. But it's not
empirical science.

> If you claim otherwise, you have found a new
>conservation law, quantum number, symmetry or whatever and obviously the
>burden of proof is on you to back that up.

Actually it's dead easy. Introduce a theory with no physical effects.
You have found a symmetry law with respect to each variable of that
theory (lol). Make sure the theory is both mathematically elegant and
extremely difficult and make wild claims about it. Oh, that is string
theory. Sorry, it's been done. :-)

>Applying the same reasoning
>would make the cosmological constant the default and place the burden of
>proof on those who claim that it is exactly zero.

It may not be possible to prove that it is exactly zero, certainly not
possible to prove it empirically. Attempting to prove it theoretically
is jumping the gun. First one needs to establish a grand unified theory,
then discover whether it is still a degree of freedom.

Oh No

Nov 1, 2006, 5:10:20 AM
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>

As Phillip said, Hubble's constant is the rate of change of the scale
factor divided by the scale factor. Over short cosmological distances
you may, if you like, take the scale factor to be a measure of the
separation between galaxies (or rather clusters). The distance between
two galaxies is then r*a(t), where r is a constant for galaxies moving
with the cosmic fluid and a(t) is the scale factor. Then, for small r,
the speed of separation is

s = r * adot(t)

Hubble's constant is H(t) = adot(t)/a(t)

>If, on the other hand, Ho were approximately constant with
>time, these domains would have to be accelerating apart at a
>substantial clip.

H0 = H(t0), where t0 is the current time. I.e., H0 is Hubble's constant
now. You mean H(t).

>So, what do we actually mean by accelerating expansion? Does this mean:
>(a) s is increasing with time
>(b) s is increasing so rapidly with time that Ho is no longer
>decreasing with time
>(c) s is increasing so rapidly with time that Ho is also increasing
>with time.
>
>It looks like the answer is probably (a), which still leaves me at
>least temporarily in the dark over the answer to my original question.
>
>Can anybody bail me out here?
>

It means adot is increasing with time. So that does mean s is increasing
with time.

The acceleration (or deceleration) parameter is

q = - adotdot * a / adot^2

So in a universe with accelerating expansion q is negative. To take this
much further you have to use Friedmann's equation. Then it is found that

q = Omega/2 - Omega_Lambda.

We also have

Omega + Omega_k + Omega_Lambda = 1

According to observation Omega_k ~ 0 (a flat universe), Omega ~ 0.3,
Omega_Lambda ~ 0.7, so we are well in the regime of accelerating
expansion.

To answer your question in something like the form you put it, use

H(t) = adot(t)/a(t)

Hdot = adotdot/a - adot^2/a^2

= -q H^2 - H^2

= -H^2(q + 1)

So H is decreasing for the observed values.

For H to be constant you need q = -1. In a flat universe this means
Omega = 0 (zero density) and Omega_Lambda = 1. If the density is greater
than 0, you can only get (b) or (c) with Omega_Lambda > 1 + Omega/2,
which means Omega_k < 0. That means a closed universe with positive
curvature.

(warning, I do not guarantee that I have not made silly mistakes, but
you should be able to check).
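[The general behaviour described in this post can be checked numerically. Below is a minimal pure-Python sketch, not from the thread itself: it integrates the flat-universe Friedmann relation adot = H0*a*sqrt(Omega/a^3 + Omega_Lambda) with simple Euler steps, using the quoted illustrative values Omega ~ 0.3, Omega_Lambda ~ 0.7 and units in which H0 = 1, and confirms that H(t) = adot/a decreases with time even while the expansion accelerates (case (a)):]

```python
import math

# Flat-universe Friedmann relation: adot = H0 * a * sqrt(Om/a^3 + OL).
# Illustrative values from the thread; units chosen so that H0 = 1
# (i.e. time measured in Hubble times).
Om, OL, H0 = 0.3, 0.7, 1.0

def adot(a):
    return H0 * a * math.sqrt(Om / a**3 + OL)

# Step the scale factor forward from a = 1 (now) with simple Euler steps.
a, dt = 1.0, 1e-4
history = []
for step in range(20000):          # integrate for 2 Hubble times
    H = adot(a) / a                # H(t) = adot/a
    history.append((a, H))
    a += adot(a) * dt

H_start, H_end = history[0][1], history[-1][1]
print("H now  :", H_start)         # 1.0 by construction
print("H later:", H_end)           # smaller: H decreases despite acceleration
print("q now  :", Om/2 - OL)       # -0.55, negative => accelerating
```

[H(t) decreases monotonically towards the de Sitter value sqrt(Omega_Lambda)*H0, which is the "slip of the keyboard" point settled later in the thread.]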

John (Liberty) Bell

Nov 1, 2006, 5:07:07 AM
Oh No wrote:
> Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
> >Oh No wrote:

> >> The
> >> currently favoured concordance (Lambda~0.7) model has Hubble's constant
> >> initially decreasing,

> >Thank you. But by how much, and when?

> >> but then starting to increase (accelerating
> >> expansion).

> >Thank you again, but by how much, and when?

> The equations are not trivial to solve. I have only seen analysis of the
> general behaviour, not a quantitative solution which I think would need
> a computer. One measure of the effect is to note that if Hubble's
> constant is 80 km/s/Mpc, Hubble time, 1/H0 is around 10 Gyrs, which
> would be the age of the universe assuming constant expansion.

> Accelerating expansion for a flat universe with Lamda=0.7 gives us an
> age around 14Gyrs.

I beg to differ. That would give a divergence of ~40% in the age of the
universe. The default setting of the UCLA calculator gives

Ho=71 km/sec/Mpc
OmegaM=0.27
OmegaVac=0.73 (=Lambda=Lamda?)
Age=13.67 Gyr (flat universe)

However, Ho = 71 km/sec/Mpc = 7.1x10^6 / 3.09x10^24 sec^-1.
Hence Ho^-1 = (3.09/7.1)x10^18 sec = (30.9 / 3.115x7.1) x 10^10
years = 13.97 Gyr.

Consequently, taking the word "flat" literally, gives the correct (CC)
value of the age of the universe, via straight line extrapolation, to
within ~ 2% of the UCLA calculator result.

This calculation, at least, is trivial.

Therefore, as amplified in my posting of Oct 31 2006 3:43 pm (GMT), I
now also have some reservations about your statement that Ho is
actually increasing at present, with increasing time.

JB
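[The comparison in the post above can be reproduced directly. The age of a flat matter-plus-Lambda universe has a standard closed form, t0 = 2/(3*H0*sqrt(OL)) * asinh(sqrt(OL/Om)), a textbook result supplied here rather than taken from the thread. A short Python sketch comparing it with the straight-line Hubble time 1/Ho:]

```python
import math

# Age of a flat matter + Lambda universe (standard closed form, supplied
# here for checking; it is not quoted in the thread itself):
#   t0 = 2 / (3 * H0 * sqrt(OL)) * asinh(sqrt(OL / Om))
H0_kms_Mpc = 71.0
Om, OL = 0.27, 0.73

Mpc_km = 3.0857e19            # kilometres per megaparsec
Gyr_s  = 3.156e16             # seconds per gigayear

H0 = H0_kms_Mpc / Mpc_km      # H0 in s^-1
hubble_time = 1.0 / H0 / Gyr_s                       # straight-line extrapolation
age = 2.0 / (3.0 * H0 * math.sqrt(OL)) \
      * math.asinh(math.sqrt(OL / Om)) / Gyr_s       # flat LCDM age

print("Hubble time 1/Ho : %.2f Gyr" % hubble_time)   # ~13.8 Gyr
print("Flat LCDM age    : %.2f Gyr" % age)           # ~13.7 Gyr (UCLA calculator value)
```

[For these particular Omega values the two numbers agree to ~1%, which is the coincidence noted in the follow-up post.]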

geo...@briar.demon.co.uk

Nov 1, 2006, 5:06:40 AM
John (Liberty) Bell wrote:

<snip>

> Imagine two typical domains (say early galaxies) flying apart at speed
> s in the early universe. If that speed remains approximately constant,
> Ho will continue to decrease with increasing time, since separation is

> increasing. ...


>
> So, what do we actually mean by accelerating expansion? Does this mean:
> (a) s is increasing with time
> (b) s is increasing so rapidly with time that Ho is no longer
> decreasing with time
> (c) s is increasing so rapidly with time that Ho is also increasing
> with time.
>
> It looks like the answer is probably (a), which still leaves me at
> least temporarily in the dark over the answer to my original question.
>
> Can anybody bail me out here?

The last I saw, it went like this. In the early universe they
would have been moving apart fast but the speed would
be reducing. As they got farther apart, the 'force of gravity'
becomes smaller while the cosmological constant remains
constant and eventually the latter starts to dominate. When
they were roughly equal, the speed s would have been
constant but after that it starts increasing again, at first
slowly but getting faster with time until in the far future
it becomes exponential. Your speed 's' therefore has a
minimum value which I think was around 6 billion years
ago.

HTH
George

John (Liberty) Bell

Nov 1, 2006, 5:08:31 AM
Oh No wrote:
> Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
> >Oh No wrote:

> >> The
> >> currently favoured concordance (Lambda~0.7) model has Hubble's constant
> >> initially decreasing,

> >Thank you. But by how much, and when?

> >> but then starting to increase (accelerating
> >> expansion).

> >Thank you again, but by how much, and when?

> The equations are not trivial to solve. I have only seen analysis of the
> general behaviour, not a quantitative solution which I think would need
> a computer.

I don't see why that is a problem.
It is obvious from our mode of communication that we all have (at
least) one.

The only problem I have with working out these answers for myself is
that, as is usual with GR, the authors do not disclose the units within
which the acceleration/deceleration parameter q is measured. If any of
you can fill me in on that technical detail, I can probably do the
rest.

John (Liberty) Bell

Nov 1, 2006, 5:54:54 AM

I agree with you in every respect, including your figure, which is
consistent with a transition epoch between z=0.4 and z=0.7. However, as
far as I can tell, this does not adequately address the question of how
Ho varies with time. Hopefully, a pertinent definition of units, in
response to my posting of Wed, Nov 1 2006 10:08 am (GMT), will resolve
that question shortly.

Regards

John

Oh No

Nov 1, 2006, 9:13:47 AM
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>Oh No wrote:

>
>> The equations are not trivial to solve. I have only seen analysis of the
>> general behaviour, not a quantitative solution which I think would need

>> a computer. One measure of the effect is to note that if Hubble's
>> constant is 80 km/s/Mpc, Hubble time, 1/H0 is around 10 Gyrs, which
>> would be the age of the universe assuming constant expansion.
>> Accelerating expansion for a flat universe with Lamda=0.7 gives us an
>> age around 14Gyrs.
>
>I beg to differ.

Not in that you need a computer, it would seem.

> That would give a divergence of ~40% in the age of the
>universe. The default setting of the UCLA calculator gives
>
>Ho=71 km/sec/Mpc
>OmegaM=0.27
>OmegaVac=0.73 (=Lambda=Lamda?)

Yes. It's more correctly spelt with a b, I think.

>Age=13.67 Gyr (flat universe)

That's what I said, ~14Gyr


>
>However, Ho=71 km/sec/Mpc=7.1x10^6 / 3.09x10^24 sec^ -1
>Hence Ho^ -1 = (3.09/7.1)x10 ^ -18 sec = (30.9 / 3.115x7.1) x 10^10
>years =13.97Gyr

Ok, I get 13.8, but no matter.


>
>Consequently, taking the word "flat" literally, gives the correct (CC)
>value of the age of the universe, via straight line extrapolation, to
>within ~ 2% of the UCLA calculator result.

Of course that is a coincidence, and depends on the values of Omega and
Omega_Lambda, and the current age.


>
>This calculation, at least, is trivial.
>
>Therefore, as amplified in my posting of Oct 31 2006 3:43 pm (GMT), I
>now also have some reservations about your statement that Ho is
>actually increasing at present, with increasing time.

If I said that it was a slip of the keyboard. As I have just posted, it
is not true.

John (Liberty) Bell

Nov 1, 2006, 1:38:14 PM
Oh No wrote:
> Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>

> >If, on the other hand, Ho were approximately constant with


> >time, these domains would have to be accelerating apart at a
> >substantial clip.
>
> H0 = H(t0) where t0 is current time. I.e. H0 is Hubble now. you mean
> H(t).

You are correct, relative to us. Nevertheless, if there were an
intelligence located at t, that intelligence would define that H as Ho,
relative to them. (I can see how such usage could lead to confusion,
though)

The rest of your response would require further thought, not least
because it is (necessarily) expressed in ascii, whereas (I believe) I
already have adequate relevant formulae expressed more elegantly and
comprehensibly, using the full flexibility of HTML (including gifs and
jpgs).

I am, therefore, only awaiting a response to my posting of 10:08 am
(GMT), for a clear and unambiguous definition of employed units.

Regards

JB

Phillip Helbig---remove CLOTHES to reply

Nov 1, 2006, 2:51:26 PM
In article <mt2.0-18493...@hercules.herts.ac.uk>, "John
(Liberty) Bell" <john...@accelerators.co.uk> writes:

> > The
> > currently favoured concordance (Lambda~0.7) model has Hubble's constant
> > initially decreasing,
>
> Thank you. But by how much, and when?
>
> > but then starting to increase (accelerating
> > expansion).
>
> Thank you again, but by how much, and when?

How to answer this in words? It's something which needs to be
calculated. Or, check out this web page:

http://www.jb.man.ac.uk/~jpl/cosmo/friedman.html

Phillip Helbig---remove CLOTHES to reply

Nov 1, 2006, 2:52:12 PM
In article <mt2.0-4934...@hercules.herts.ac.uk>, Oh No
<No...@charlesfrancis.wanadoo.co.uk> writes:

> You asked the question. I only answered the question you asked. Perhaps
> you meant something else. The cosmological constant, Lambda, is taken to
> be constant in most models, but they try varying that too in a theory
> called quintessence. They have tried analysing the supernova data to
> test for this, but so far results are consistent with constant Lambda.

If it varies, it's not a cosmological CONSTANT. Yes, a "variable
lambda" or something similar with many other names has been tried, but
there are no observations for which that is a better fit.

Oh No

Nov 1, 2006, 3:33:17 PM
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>Oh No wrote:
>I am, therefore, only awaiting a response to my posting of 10:08 am
>(GMT), for a clear and unambiguous definition of employed units.
>

I take it you mean this.

Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>

>The only problem I have with working out these answers for myself, is
>that, as is usual with GR, the authors do not disclose the units within
>which the acceleration/deceleration parameter q is measured. If any of
>you can fill me in on that technical detail, I can probably do the
>rest.


Units are not employed as such. Mathematicians generally use natural
units in which c=1. q is defined as

q = - adotdot * a / adot^2

so you can work out pretty quickly that it is dimensionless, as is Omega
(being the ratio of actual density to critical density for closure in a
Lambda = 0 cosmology).

It is not normal to quote values for q. Generally values for Omega and
Omega_Lambda are calculated. Then you can use

q = Omega/2 - Omega_Lambda.

to find q if you want to. See my other post.

[Mod. note: in the Lambda=0 cosmologies commonly used in earlier
decades, it was quite usual to quote q_0 = 0 or 0.5 in papers to
indicate one's choice of geometry for the universe. Like the choice of
H_0, it used to vary according to the prejudices and background of the
writer -- mjh]

John (Liberty) Bell

Nov 2, 2006, 8:10:33 AM
Oh No wrote:

> In a flat universe this means
> Omega = 0, zero density and Omega_Lambda=1. If density is greater than
> 0, you can only get b) or c) with Omega_Lambda > 1+Omega/2, which means
> Omega_k < 0. That means a closed universe with positive curvature.

Since the question of curvature and, particularly, positive curvature,
has now arisen again, I would also now appreciate a clear
definition of the following:

A) Omega_k
B) _Positive_ curvature. Is that like the skin of an inflated balloon
viewed from the outside (i.e. convex), or like the skin of an inflated
balloon viewed from the inside (i.e. concave)?

Thanks

JB

Oh No

Nov 2, 2006, 10:46:32 AM
Thus spake "John (Liberty) Bell" <john...@accelerators.co.uk>
>Oh No wrote:
>
>> In a flat universe this means
>> Omega = 0, zero density and Omega_Lambda=1. If density is greater than
>> 0, you can only get b) or c) with Omega_Lambda > 1+Omega/2, which means
>> Omega_k < 0. That means a closed universe with positive curvature.
>
>Since the question of curvature and, particularly, positive curvature,
>has now arisen again, I would also now appreciate a clear
>definition of the following:
>
>A) Omega_k

Omega_k is defined as

Omega_k = -k / (a^2 * H^2)

where k is the curvature, a is the scale factor, and H is Hubble's constant.

>B) _Positive_ curvature. Is that like the skin of an inflated balloon
>viewed from the outside (i.e. convex), or like the skin of an inflated
>balloon viewed from the inside (i.e. concave)?

Yes to both questions. Negative curvature is like a saddle or the mouth
of a trumpet. It's a bit better to think of it in terms of its own
geometry rather than an embedding in higher dimensional space though.
Positive curvature means the circumference of a circle is less than
2*pi*r, or the angles of a triangle sum to more than 180 degrees.
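[The circle statement can be made concrete on an ordinary sphere, where a geodesic circle of radius r has circumference 2*pi*R*sin(r/R). This is a textbook fact, added here purely as an illustration:]

```python
import math

# On a sphere of radius R (positive curvature), a circle of geodesic
# radius r has circumference 2*pi*R*sin(r/R), which is always less than
# the flat-space value 2*pi*r.
R = 1.0
ratios = {}
for r in (0.1, 0.5, 1.0):
    flat   = 2 * math.pi * r
    curved = 2 * math.pi * R * math.sin(r / R)
    ratios[r] = curved / flat
    print("r = %.1f  circumference ratio = %.4f" % (r, ratios[r]))
```

[The ratio is sin(r/R)/(r/R): always below 1, and approaching 1 for circles much smaller than the curvature scale, which is why curvature is invisible locally.]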

Chalky

Nov 2, 2006, 1:40:14 PM
Oh No wrote:

> The acceleration (or deceleration) parameter is
>

> q = - adotdot * a / adot^2

Aaargh! I thus get an infinite acceleration parameter for a static
universe. Is that correct?

Chalky

Oh No

Nov 2, 2006, 4:56:20 PM
Thus spake Chalky <chalk...@bleachboys.co.uk>

lol

In case anyone reading doesn't get the joke, chalky has hit on an
instance of 0/0 which is undefined and gives dodgy results. To do it
properly you would have to find the limit of q as the universe tends to
static. The result would be

q = Omega/2 - Omega_Lambda.

which is quite respectable.

Steve Willner

Nov 2, 2006, 4:56:42 PM
In article <mt2.0-4404...@hercules.herts.ac.uk>,
George Dishman <geo...@briar.demon.co.uk> writes:
> If N is the number of galaxies within a cube of
> side length l:
>
> dN/<N> = 0.5 +/- 0.1 at a box width l = 30h^-1 Mpc
>
> where dN is the standard deviation of N and <N> is
> the mean of N.
>
> Saunders et al 1991 and Efstathiou 1991

> I have sometimes wondered how dN/<N> would vary
> as a function of l and I guess I should know but
> it's too long since I did any statistics.

I've recently had to look into this subject. What you are looking
for is related to the "correlation function," which is the _excess_
probability of finding a galaxy at a specified distance from a known
galaxy. The correlation function is positive at small distances --
galaxies are clustered -- and negative at large distances in order to
keep its integral finite.

There are two ways of determining the correlation function: use large
surveys such as Sloan or 2dF to measure it directly, or use CDM
models to evolve the known CMB fluctuations to the epoch of interest.
Neither method is entirely satisfactory. For small distances,
though, it probably suffices to use a power law
cf \propto (d/7 Mpc)^-1.8.
Papers by Zehavi et al. (Sloan) or Norberg et al. (2dF) will have
more details.

--
Steve Willner Phone 617-495-7123 swil...@cfa.harvard.edu
Cambridge, MA 02138 USA
(Please email your reply if you want to be sure I see it; include a
valid Reply-To address to receive an acknowledgement. Commercial
email may be sent to your ISP.)
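[As a rough illustration of the quoted power law, with the proportionality constant set to 1 at 7 Mpc (an assumption made here purely for illustration; the actual amplitude comes from the cited surveys):]

```python
# Small-scale galaxy correlation function quoted above:
#   xi(d) \propto (d / 7 Mpc)^-1.8
# Taking the amplitude as 1 (so xi = 1 at 7 Mpc, an illustrative
# normalisation), xi gives the *excess* probability of finding a
# neighbouring galaxy at separation d.
def xi(d_Mpc):
    return (d_Mpc / 7.0) ** -1.8

for d in (1, 7, 14, 50):
    # A galaxy is (1 + xi) times more likely than random to have a
    # neighbour at separation d.
    print("d = %3d Mpc  xi = %.3f" % (d, xi(d)))
```

[The steep fall-off with distance is why clustering dominates the counts-in-cells variance at the 30 h^-1 Mpc box size George quotes.]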

Phillip Helbig---remove CLOTHES to reply

Nov 3, 2006, 3:37:21 AM
In article <mt2.0-27668...@hercules.herts.ac.uk>, Oh No
<No...@charlesfrancis.wanadoo.co.uk> writes:

> Units are not employed as such. Mathematicians generally use natural
> units in which c=1. q is defined as
>
> q = - adotdot * a / adot^2
>
> so you can work out pretty quickly that it is dimensionless, as is Omega
> (being the ratio of actual density to critical density for closure in a
> Lambda = 0 cosmology).
>
> It is not normal to quote values for q. Generally values for Omega and
> Omega_Lambda are calculated. Then you can use
>
> q = Omega/2 - Omega_Lambda.
>
> to find q if you want to. See my other post.
>
> [Mod. note: in the Lambda=0 cosmologies commonly used in earlier
> decades, it was quite usual to quote q_0 = 0 or 0.5 in papers to
> indicate one's choice of geometry for the universe. Like the choice of
> H_0, it used to vary according to the prejudices and background of the
> writer -- mjh]

The main reason that q was used in the past was that observational data
were only at relatively low redshift. In this region, q appears as the
first significant term in a series expansion (common then since few
closed solutions were known). These days, most tests measure some
combination of cosmological parameters, but it is rarely q. It is also
rarely Omega or lambda, but these have the advantage of being "physical"
rather than "derived". On the other hand, q tells one right away
whether the universe is currently accelerating or decelerating (but, by
itself, does NOT say anything about the geometry of the universe nor
whether or not it will expand forever). Of course, with closed
solutions and computers, no-one needs a series expansion anymore (which
doesn't make much sense at high redshift anyway).

Chalky

Nov 3, 2006, 3:37:28 AM
Oh No wrote:

> The acceleration (or deceleration) parameter is
>

> q = - adotdot * a / adot^2

This also seems to give infinity (positive this time) at the point of
maximum expansion of a closed Friedmann universe

> q = Omega/2 - Omega_Lambda.

So which Omega is infinite at this point?

> We also have
> Omega + Omega_k + Omega Lambda = 1

This appears to make Omega_k infinite too.

Is that correct?


Chalky

geo...@briar.demon.co.uk

Nov 3, 2006, 5:04:48 AM
Steve Willner wrote:
> In article <mt2.0-4404...@hercules.herts.ac.uk>,
> George Dishman <geo...@briar.demon.co.uk> writes:

....

> > I have sometimes wondered how dN/<N> would vary
> > as a function of l and I guess I should know but
> > it's too long since I did any statistics.
>
> I've recently had to look into this subject. What you are looking
> for is related to the "correlation function," which is the _excess_
> probability of finding a galaxy at a specified distance from a known
> galaxy. The correlation function is positive at small distances --
> galaxies are clustered -- and negative at large distances in order to
> keep its integral finite.
>
> There are two ways of determining the correlation function: use large
> surveys such as Sloan or 2dF to measure it directly, or use CDM
> models to evolve the known CMB fluctuations to the epoch of interest.
> Neither method is entirely satisfactory. For small distances,
> though, it probably suffices to use a power law
> cf \propto (d/7 Mpc)^-1.8 .
> Papers by Zehavi et al. (Sloan) or Norberg et al. (2dF) will have
> more details.

Thanks Steve, it is obviously much more complex
than I realised. Lots more for me to learn :-)

George

Chalky

Nov 3, 2006, 6:48:08 AM
Oh No wrote:
> Thus spake Chalky <chalk...@bleachboys.co.uk>

> >Oh No wrote:
> >
> >> The acceleration (or deceleration) parameter is
> >>
> >> q = - adotdot * a / adot^2
> >
> >Aaargh! I thus get an infinite acceleration parameter for a static
> >universe. Is that correct?

> In case anyone reading doesn't get the joke, chalky has hit on an


> instance of 0/0 which is undefined and gives dodgy results.

You do not seem to have grasped the nature of the problem. Multiplying
by 0 can be coped with by any calculator. It is dividing by 0 which
gives an overflow error, indicative of infinity. Perhaps I can explain
the maths more clearly at the point of maximum expansion of a closed
universe. Here adotdot is negative, and a is positive, so the numerator
is negative and finite, giving positive q. Now, zero squared is zero,
so you end up with infinity again, but this time positive, when you
divide by zero.

Clearly, in this case, a small move either side of this midway point
will still result in a negligible adot but a still substantial adotdot
* a. Consequently your defined acceleration parameter definitely tends
towards infinity as the maximum distance from the Big Bang is
approached.

Hence my already posted questions about your infinite Omegas in your
formulae.


Chalky

Oh No

Nov 3, 2006, 10:38:43 AM
Thus spake Chalky <chalk...@bleachboys.co.uk>
>Oh No wrote:
>> Thus spake Chalky <chalk...@bleachboys.co.uk>
>> >Oh No wrote:
>> >
>> >> The acceleration (or deceleration) parameter is
>> >>
>> >> q = - adotdot * a / adot^2
>> >
>> >Aaargh! I thus get an infinite acceleration parameter for a static
>> >universe. Is that correct?
>
>> In case anyone reading doesn't get the joke, chalky has hit on an
>> instance of 0/0 which is undefined and gives dodgy results.
>
>You do not seem to have grasped the nature of the problem. Multiplying
>by 0 can be coped with by any calculator. It is dividing by 0 which
>gives an overflow error, indicative of infinity. Perhaps I can explain
>the maths more clearly at the point of maximum expansion of a closed
>universe.

That's a different question.

>Here adotdot is negative, and a is positive, so the numerator
>is negative and finite, giving positive q. Now, zero squared is zero,
>so you end up with infinity again, but this time positive, when you
>divide by zero.
>
>Clearly, in this case, a small move either side of this midway point
>will still result in a negligible adot but a still substantial adotdot
>* a. Consequently your defined acceleration parameter definitely tends
>towards infinity as the maximum distance from the Big Bang is
>approached.
>
>Hence my already posted questions about your infinite Omegas in your
>formulae.
>

It would perhaps have helped if I had written

q0 = - adotdot(t0) * a(t0) / adot^2(t0)

The point of the acceleration parameter is that if you write a(t) as a
series expansion

a(t) = a(t0) + adot(t0)*(t - t0) + 1/2*adotdot(t0)*(t-t0)^2 + ...

then you have

a(t)/a(t0) = 1 + H0*(t - t0) - 1/2 * q0 * H0^2 *(t-t0)^2 + ...

Then at the point of maximum expansion Hubble's constant, H0, is zero,
so for the expansion to remain defined q0 would clearly need to be
infinite.

Also, by definition,

Omega = 8*pi*G*rho_0 / (3*H0^2) [1]
Omega_Lambda = Lambda / (3*H0^2)

where rho_0 is the current density of the universe.

So if we were trying to define these constants at the time of maximum
expansion, they too would be infinite. Fortunately we are not :-)

[1] Omega is defined as the ratio of the actual density rho_0 of the
universe to the critical density for closure, rho_c. rho_c depends on
the rate of expansion. At maximum expansion the critical density cannot
be calculated from the current rate of expansion, so it is natural that
this should return an infinite result.
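[The dimensionless definitions above can be sanity-checked with the closed-form scale factor of a flat matter-plus-Lambda universe, a(t) proportional to sinh^(2/3)(1.5*sqrt(Omega_Lambda)*H0*t). That closed form is a standard solution supplied here rather than taken from the thread. Finite differences then recover q0 = Omega/2 - Omega_Lambda:]

```python
import math

# Flat matter + Lambda universe: a(t) = A * sinh(B*t)^(2/3),
# with B = 1.5*sqrt(OL)*H0 and A fixed so that a(t0) = 1 now.
Om, OL, H0 = 0.27, 0.73, 1.0              # H0 = 1: time in Hubble times
B  = 1.5 * math.sqrt(OL) * H0
t0 = math.asinh(math.sqrt(OL / Om)) / B   # present age (makes a(t0) = 1)
A  = math.sinh(B * t0) ** (-2.0 / 3.0)

def a(t):
    return A * math.sinh(B * t) ** (2.0 / 3.0)

# Central finite differences for adot and adotdot at t0.
h = 1e-5
adot    = (a(t0 + h) - a(t0 - h)) / (2 * h)
adotdot = (a(t0 + h) - 2 * a(t0) + a(t0 - h)) / h**2

q0_numeric = -adotdot * a(t0) / adot**2   # definition of q
q0_formula = Om / 2 - OL                  # = -0.595 for these values
print(q0_numeric)
print(q0_formula)
```

[Both routes give q0 ~ -0.595, negative as expected for accelerating expansion, and adotdot is positive, which ties the series-expansion definition back to Friedmann's equation.]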
