I was finally able to identify and resolve my confusion about Hubble's Law. First, let's use a geometric model to establish that the recessional velocity of distant galaxies increases as the universe expands. For convenience, assume the universe is spherically shaped and uniformly expanding, and consider two galaxies at distances of one and ten billion light years from our own. As r, the radius of the universe, increases linearly, so will the separation distances of these remote galaxies, since the arc distances to these galaxies, if they are placed, e.g., on the equator, will also increase linearly. So in some unit of time, if say the rate of increase is 10%, the closer galaxy will recede by 10% of 1 billion light years, or 100 million light years, whereas the more distant galaxy will recede 1 billion light years in the same time. So clearly, in an expanding universe, more distant galaxies recede faster than nearer galaxies.
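A minimal numerical sketch of this proportionality (the 10% growth rate and the two distances are just the example values above, and 70 km/s/Mpc is the conventional rough value of Hubble's constant):

# Uniform expansion: every distance grows by the same fraction, so the distance
# a galaxy recedes in a given interval is proportional to how far away it is.
growth = 0.10                      # assume a uniform 10% expansion per unit of time
for d_gly in (1.0, 10.0):          # initial distances in billions of light years
    print(f"galaxy at {d_gly:4.1f} Gly recedes by {growth * d_gly:.1f} Gly")

# The same proportionality is Hubble's Law, v = H0 * d: each additional
# megaparsec of distance adds roughly 70 km/s of recession velocity.
H0 = 70.0                          # km/s per Mpc (approximate present-day value)
for d_mpc in (1.0, 100.0, 1000.0):
    print(f"d = {d_mpc:6.1f} Mpc  ->  v ~ {H0 * d_mpc:8.1f} km/s")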
Let's now consider the light emitted from these galaxies. The light reaching us left those galaxies 1 and 10 billion years ago respectively. If their red shifts represented their recessional velocities when the light was emitted, it would imply that in the early universe those galaxies were receding very rapidly: the farther away in time they are, that is, the more distant they are, the more rapidly they must be receding.
On 8/7/2025 10:17 PM, Alan Grayson wrote:
...the farther away in time they are, that is, the more distant they are, the more rapidly they must be receding.

Why not phrase this as the equally true statement, "The more distant they are, the more rapidly we must be receding," which is then consistent with your first paragraph?
Anyway, I'm glad you resolved it to your own satisfaction.
Brent
On Friday, August 8, 2025 at 2:10:19 PM UTC-6 Brent Meeker wrote:
Anyway, I'm glad you resolved it to your own satisfaction.
Brent
Thanks for your kind thought, but unfortunately I am still confused. I think the geometric model is conclusive; the more distant a galaxy is, the more rapid is its recessional velocity, which is Hubble's Law. Moreover, considering the red shifts of two galaxies at different distances, from the pov of time moving forward there is a slowing of recessional velocity due to gravity (ignoring the speed-up discovered in 1998). But my problem arises when I consider time flowing backward, where in remote times the recessional velocity inferred from the red shift is huge. Clark seems to be of two minds on this; he has stated that in very early times, after the manifestation of the CMB of course, the galaxies were very close and receding from each other slowly; and once recently he stated the opposite. What, IYO, is going on in the very early universe wrt recessional velocities, and why? TY, AG
But this contradicts the geometric model, from which we inferred the opposite: that in early times, those galaxies were receding with decreasing velocity as their separation distances from us were decreasing. So what the hell is going on?
The answer is that although the light from those galaxies was emitted in the distant past, the expansion of the universe stretched those emissions as they propagated in our direction. That is, the red shifts we observe were produced by the expansion of the universe, and therefore represent the current red shifts of those receding galaxies.
AG
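In standard cosmology the point can be stated as: the observed redshift measures the total expansion between emission and observation, 1 + z = a_observed / a_emitted, where a is the scale factor; it is not simply the galaxy's speed at the moment of emission. A minimal sketch with illustrative scale-factor values:

# Cosmological redshift as accumulated expansion: 1 + z = a_obs / a_emit.
def redshift(a_emit, a_obs=1.0):
    """Redshift of light emitted when the scale factor was a_emit."""
    return a_obs / a_emit - 1.0

# Light emitted when the universe was half, or a tenth, of its present size:
for a_emit in (0.5, 0.1):
    print(f"a_emit = {a_emit:4.2f}  ->  z = {redshift(a_emit):.1f}")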
On 8/8/2025 8:38 PM, Alan Grayson wrote:
...What, IYO, is going on in the very early universe wrt recessional velocities, and why? TY, AG

Which galaxies?
Galaxies that were close to each other were receding slowly.
Now that they're far from each other they are receding more rapidly.
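A toy illustration of this for a single pair of galaxies, assuming a constant Hubble parameter (roughly the dark-energy-dominated limit, not the full expansion history described below): the separation grows exponentially, and the recession rate grows with it.

import math

H = 0.07    # per billion years; roughly 70 km/s/Mpc expressed in 1/Gyr (illustrative)
d0 = 1.0    # initial separation in billions of light years

for t in (0.0, 5.0, 10.0):          # time in billions of years
    d = d0 * math.exp(H * t)        # separation at time t
    v = H * d                       # recession rate in Gly/Gyr, i.e. as a fraction of c
    print(f"t = {t:4.1f} Gyr: separation = {d:5.2f} Gly, recession rate = {v:5.3f} c")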
On Friday, August 8, 2025 at 10:24:01 PM UTC-6 Brent Meeker wrote:
Galaxies that were close to each other were receding slowly. Now that they're far from each other they are receding more rapidly.

So if we go back in time, the recession is slowing. Doesn't this contradict Hubble's Law, which IIUC says the opposite: that for each additional megaparsec of distance into the past, the recession velocity increases by about 70 km/sec? AG

This is a little different from Hubble's idea that obtained up until the '90s. The early model was that the Big Bang provided an impetus and the galaxies flew apart while gradually slowing due to gravity.
So the furthest galaxies were furthest because they had been furthest and the universe had just uniformly expanded, rapidly at first but gradually slowing due to gravity. But now it seems that the expansion has not been uniform: following an initial rapid expansion there was a period of things just coasting apart, followed by the current period of increasing expansion rate of space. It's a matter of fitting models to the observations, and observations have been reaching back further and further. Keep in mind that "expansion rate of the universe" doesn't usually mean the recession rate of galaxies; it means a fitted Hubble parameter...like this:
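For example, a flat Lambda-CDM fit expresses the Hubble parameter as a function of redshift; a minimal sketch with illustrative parameter values (H0 = 70 km/s/Mpc, Omega_m = 0.3), not necessarily the particular fit referred to above:

import math

H0 = 70.0                  # km/s/Mpc, illustrative
Omega_m = 0.3              # matter fraction today
Omega_de = 1.0 - Omega_m   # dark-energy fraction (flat universe, radiation neglected)

def hubble(z):
    """Fitted Hubble parameter at redshift z, in km/s/Mpc."""
    return H0 * math.sqrt(Omega_m * (1.0 + z) ** 3 + Omega_de)

for z in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(f"z = {z:3.1f}:  H(z) ~ {hubble(z):7.1f} km/s/Mpc")

# Note: H(z) was larger in the past even though the expansion of any given
# distance is currently accelerating -- "expansion rate" here means this fitted
# H, not the recession speed of a particular galaxy.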