Entropy of early universe


Alan Grayson

Sep 14, 2019, 9:12:34 AM
to Everything List
If the early universe, say before the emergence of the CMBR, consisted of a random collection of electrons and photons, wouldn't this correspond to a high, not low entropy? Wouldn't it be analogous to gas with many possible states? Yet cosmologists seem hard pressed to explain an initial or early state assuming the entropy is low. AG

Jason Resch

Sep 14, 2019, 10:11:56 AM
to Everything List
Layzer says the expansion of the universe creates an increasing difference between the current entropy and the maximum possible entropy: https://www.informationphilosopher.com/solutions/scientists/layzer/growth_of_order/

Thereby introducing room for entropy to increase further, and giving the appearance of low-entropy initial conditions.

Jason

On Sat, Sep 14, 2019, 8:12 AM Alan Grayson <agrays...@gmail.com> wrote:
If the early universe, say before the emergence of the CMBR, consisted of a random collection of electrons and photons, wouldn't this correspond to a high, not low entropy? Wouldn't it be analogous to gas with many possible states? Yet cosmologists seem hard pressed to explain an initial or early state assuming the entropy is low. AG


Alan Grayson

Sep 14, 2019, 11:07:28 AM
to Everything List


On Saturday, September 14, 2019 at 7:12:34 AM UTC-6, Alan Grayson wrote:
If the early universe, say before the emergence of the CMBR, consisted of a random collection of electrons and photons, wouldn't this correspond to a high, not low entropy? Wouldn't it be analogous to gas with many possible states? Yet cosmologists seem hard pressed to explain an initial or early state assuming the entropy is low. AG

Here's an easier question: when Boltzmann defined entropy as S = k * log N, why the log; why not just k*N? AG

Lawrence Crowell

Sep 14, 2019, 2:09:26 PM
to Everything List
On Saturday, September 14, 2019 at 8:12:34 AM UTC-5, Alan Grayson wrote:
If the early universe, say before the emergence of the CMBR, consisted of a random collection of electrons and photons, wouldn't this correspond to a high, not low entropy? Wouldn't it be analogous to gas with many possible states? Yet cosmologists seem hard pressed to explain an initial or early state assuming the entropy is low. AG

The inflationary manifold has a cosmological constant Λ ≈ (1/cℓ_p)^2, where c is a dimensionless number c < 1 with a value of around 10^{-3}. This means Λ ≈ 10^{64} m^{-2}, which is to be compared to today's value of Λ ≈ 10^{-52} m^{-2}. What is still studied is the process by which this false vacuum, or false cosmological constant, transitioned to the value today. If we take the scenario that inflation started at 10^{-36} sec, this would mean there was a transition of sorts from Λ ≈ 10^{64} m^{-2} → 10^{56} m^{-2}, and in one scenario this slow-rolls to a lower value and then transitions again to the low value known today.

The amount of information in a region bounded by a cosmological horizon with area A is S ≈ kA/(cℓ_p)^2 = k/c^2 by the Bousso bound. This means a tiny region bounded by this cosmological horizon, only about 10^{-32} m across, had a total entropy of S ≈ k×10^{6} for k = 1.38×10^{-23} J/K. After this transition the entropy was some 8 orders of magnitude larger. So as a result the entropy of the earliest universe was quite small.
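[As a sanity check on the numbers above, the estimate S ≈ k/c^2 can be evaluated directly. This is a small numerical sketch added here, not part of the original post; c = 10^{-3} is the dimensionless factor quoted earlier in the message.]

```python
import math

k = 1.38e-23   # Boltzmann constant, J/K
c = 1e-3       # dimensionless factor from the inflationary Lambda, per the post

# Bousso-bound entropy estimate for the horizon-bounded patch
S = k / c**2

print(S)       # ~1.38e-17 J/K
print(S / k)   # ~1e6, i.e. S is about k * 10^6 as stated
```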

LC

Jason Resch

Sep 14, 2019, 2:36:06 PM
to Everything List
I don't know the relationship between heat and information; I think it is relevant to the Bekenstein bound and black hole information, and also the Landauer limit. But there's another definition of entropy in information theory: https://en.m.wikipedia.org/wiki/Entropy_(information_theory)

The information-theoretic definition of entropy is measured in bits (binary digits). The reason for the logarithm is that it takes log2(N) bits to represent N states. There's nothing special about the base; you can also say it takes log10(N) decimal digits to encode a number N.
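[The point about bases can be checked directly. A short sketch, added by the editor; N = 1000 is an arbitrary illustrative state count.]

```python
import math

N = 1000  # number of equally likely states (arbitrary example)

bits = math.log2(N)              # binary digits needed to label N states
decimal_digits = math.log10(N)   # decimal digits needed for the same N

print(bits)            # ~9.97, so 10 bits suffice for 1000 states
print(decimal_digits)  # 3.0, so 3 decimal digits suffice

# Changing base only rescales entropy by a constant factor:
# log2(N) = log10(N) / log10(2)
assert math.isclose(bits, decimal_digits / math.log10(2))
```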

Jason

Alan Grayson

Sep 15, 2019, 1:13:27 AM
to Everything List


On Saturday, September 14, 2019 at 7:12:34 AM UTC-6, Alan Grayson wrote:
If the early universe, say before the emergence of the CMBR, consisted of a random collection of electrons and photons, wouldn't this correspond to a high, not low entropy? Wouldn't it be analogous to gas with many possible states? Yet cosmologists seem hard pressed to explain an initial or early state assuming the entropy is low. AG

When I was an undergraduate I took a course in Classical Thermodynamics and recall being satisfied that entropy was well-defined. I never took a course in Classical Statistical Mechanics, but I've seen Boltzmann's equation for S and wonder how N, the number of possible states, is defined. If we have a gas enclosed in a container, we can divide it into occupation cells of fixed volume to calculate S. But why can't we double the number of cells by reducing their volume by half? How then is S well defined in the case of Classical Statistical Mechanics? TIA, AG

Lawrence Crowell

Sep 15, 2019, 7:36:13 AM
to Everything List
There is the classical definition dS = δQ/T for reversible processes. Yet in general entropy is a rather subjective and slippery concept. With the Boltzmann formula S = k log(Ω), for Ω the volume of phase space, any uncertainty in Ω results in only tiny errors because of the logarithm.
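[The insensitivity to uncertainty in Ω is easy to see numerically: even a factor-of-two error in the phase-space volume shifts S = k log Ω by only k ln 2. A sketch added here; the value of Ω is arbitrary and purely illustrative.]

```python
import math

k = 1.38e-23   # Boltzmann constant, J/K
omega = 1e30   # arbitrary phase-space volume (illustrative only)

S = k * math.log(omega)            # entropy from the nominal volume
S_doubled = k * math.log(2 * omega)  # entropy after a 100% error in omega

# A factor-of-two (100%) error in omega changes S by only ~1%
print((S_doubled - S) / S)
```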

LC

John Clark

Sep 16, 2019, 3:02:53 AM
to everyth...@googlegroups.com
On Sat, Sep 14, 2019 at 11:07 AM Alan Grayson <agrays...@gmail.com> wrote:

> Here's an easier question: when Boltzmann defined entropy as S = k * log N, why the log; why not just k*N? AG

Because if you define entropy with a log in there then it is additive for independent sources; the entropy of a coin toss is 1 bit, so the entropy of 10 coin tosses is 10 bits.
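[That additivity is exactly what the log buys: independent systems multiply their state counts, and the logarithm turns that product into a sum. A small check added here, using coin tosses as in the reply above.]

```python
import math

n1 = 2     # state count of 1 coin toss
n2 = 1024  # state count of 10 coin tosses (2^10)

# With the log, entropies add: log(n1 * n2) = log(n1) + log(n2)
assert math.isclose(math.log2(n1 * n2), math.log2(n1) + math.log2(n2))

# Without the log (i.e. S = k*N), the joint "entropy" would be
# n1 * n2 = 2048 rather than n1 + n2 -- it multiplies instead of adding.
print(math.log2(n1))       # 1.0 bit for one toss
print(math.log2(n1 * n2))  # 11.0 bits for eleven tosses
```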

John K Clark
 

Russell Standish

Sep 16, 2019, 3:23:09 AM
to Everything List
It actually isn't. The point bothered me too. The number of states is
basically V/h, where V is the volume of phase space occupied by the
system and h a cell size. Therefore, the entropy is

k log V - k log h

For a large range of values of h, the second term is just a negligible
constant offset to the total entropy. However, as h → 0, the entropy blows
up. And that's what classical statistical mechanics tells you.

Enter quantum mechanics. Heisenberg's uncertainty relation tells us
that ΔxΔp ≥ ℏ, so in the above entropy formula, h is constrained to be
larger than ℏ³. Quantum mechanics saves classical statistical physics'
bacon. Nothing blows up.
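[The divergence as h → 0 can be seen with a toy calculation. A sketch added by the editor; V and the sequence of cell sizes are arbitrary illustrative values.]

```python
import math

k = 1.38e-23   # Boltzmann constant, J/K
V = 1.0        # occupied phase-space volume (arbitrary units)

# S = k log V - k log h: shrink the cell size h and the entropy diverges
entropies = []
for h in [1e-3, 1e-10, 1e-34, 1e-100]:
    S = k * math.log(V) - k * math.log(h)
    entropies.append(S)
    print(h, S)   # S grows without bound as h -> 0
```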

--

----------------------------------------------------------------------------
Dr Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Senior Research Fellow hpc...@hpcoders.com.au
Economics, Kingston University http://www.hpcoders.com.au
----------------------------------------------------------------------------

Alan Grayson

Sep 16, 2019, 4:34:25 AM
to Everything List


On Monday, September 16, 2019 at 1:23:09 AM UTC-6, Russell Standish wrote:
On Sat, Sep 14, 2019 at 10:13:27PM -0700, Alan Grayson wrote:
>
>
> On Saturday, September 14, 2019 at 7:12:34 AM UTC-6, Alan Grayson wrote:
>
>     If the early universe, say before the emergence of the CMBR, consisted of a
>     random collection of electrons and photons, wouldn't this correspond to a
>     high, not low entropy? Wouldn't it be analogous to gas with many possible
>     states? Yet cosmologists seem hard pressed to explain an initial or early
>     state assuming the entropy is low. AG
>
>
> When I was an undergraduate I took a course in Classical Thermodynamics and
> recall being satisfied that entropy was well-defined. I never took a course in
> Classical Statistical Mechanics, but I've seen Boltzmann's equation for S and
> wonder how N, the number of possible states is defined. If we have a gas
> enclosed in a container, we can divide it into occupation cells of fixed volume
> to calcuate S. But why can't we double the number of cells by reducing their
> volume by half? How then is S well defined in the case of Classical Statistical
> Mechanics? TIA, AG

It actually isn't. The point bothered me too. The number of states is
basically V/h, where V is the volume of phase space occupied by the
system and h a cell size. Therefore, the entropy is

k log V - k log h

For a large range of values of h, the second term is just a negligible
constant offset to the total entropy. However, as h → 0, the entropy blows
up. And that's what classical statistical mechanics tells you.

How could the second term be negligible for large values of h? AG 

Lawrence Crowell

Sep 16, 2019, 6:35:27 AM
to Everything List
Think of the case where you have binary strings of length n. How many possible binary strings are there of that length? There are N = 2^n. The Boltzmann log(N) just measures the size of the macrostate, which comprises 2^n possible microstates. This is where the entropy S = kn comes from, since the units of Planck area on the horizon count microstates. We have

S = k ln(N) = k ln(2^n) = k n ln(2).

With the black hole horizon, or any horizon, this linear chain is replaced by a two-dimensional table or matrix. The same argument carries over.
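[The identity k ln(2^n) = k n ln 2 can be verified for a concrete string length. A trivial numerical check added here; n = 64 is an arbitrary example.]

```python
import math

k = 1.38e-23   # Boltzmann constant, J/K
n = 64         # binary string length (e.g. 64 horizon cells)

N = 2**n                        # number of microstates
S_direct = k * math.log(N)      # S = k ln(N)
S_linear = k * n * math.log(2)  # S = k n ln(2)

# The two forms agree, and the entropy is linear in n
assert math.isclose(S_direct, S_linear)
print(S_direct)
```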

LC