I think this post (derived from another one) deserves its own thread, so we can analyze it more deeply.
---------------------------------------------------------------------------------------------------------------
Roberts, that's not the link to the original 2015 paper by Mudrak (which I downloaded last Saturday); it is
an analysis of Mudrak's paper and its consequences, probably from the same year, 2015.
I've tried to find a free download link, but I wasn't able to.
The equations Mudrak used are:
Δf/f = −μ/(c²·Rs) + μ/(c²·Re) + J2·μ/(c²·Re) , for the gravitational shift
Δf/f = −Vs²/(2·c²) + (a·Ωe)²/(2·c²) , for the SR time dilation
where
μ = G·ME = 3.986004418×10^14 m³/s²
c = 299792458 m/s
Rs = 29600000 m (Galileo satellite orbit radius)
Re = 6378136.55 m (Earth radius)
J2 = 0.0010826267 (Oblate Earth gravity coefficient)
Vs = 3669.6 m/s (Galileo satellite average velocity)
a = 6383678 m (Undisclosed coefficient; a·Ωe ≈ Earth's equatorial rotation speed)
Ωe = 7.2921151467×10⁻⁵ rad/s (mean angular rotation rate of the Earth)
When these values are put into the formulae, the result is:
Δf/f(Galileo Total) = 5.4627E-10 − 7.3709E-11 , which produces a daily time drift of
Δt(Galileo Total)/day = 47.1982 μsec − 6.3685 μsec = 40.8298 μsec
If the GPS coefficients are applied instead:
Rs = 26936715 m (GPS satellite orbit radius)
Vs = 3874 m/s (GPS satellite average velocity)
the result is:
Δt(GPS Total)/day = 45.9177 μsec − 7.2141 μsec = 38.7036 μsec
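As a sanity check on the arithmetic above, both formulae can be evaluated directly. The sketch below is mine, not from the paper; the Galileo orbit radius (~29,600 km) and GPS average velocity (~3,874 m/s) are assumed nominal values, and a = 6383678 m is my reconstruction of the undisclosed coefficient.

```python
# Sketch: evaluate Mudrak's frequency-shift formulae and the resulting daily
# clock drift. Orbit radius / GPS velocity below are assumed nominal values;
# "a" is my reconstruction of the undisclosed coefficient, not from the paper.

MU = 3.986004418e14        # G*ME, m^3/s^2
C = 299792458.0            # speed of light, m/s
RE = 6378136.55            # Earth radius, m
J2 = 0.0010826267          # oblate-Earth gravity coefficient
A = 6383678.0              # "a" coefficient, m (assumed; a*Omega_e ~ 465 m/s)
OMEGA_E = 7.2921151467e-5  # Earth mean rotation rate, rad/s
SECONDS_PER_DAY = 86400.0

def gravitational_shift(rs):
    """Gravitational frequency shift Δf/f for a satellite at orbit radius rs."""
    return -MU / (C**2 * rs) + MU / (C**2 * RE) + J2 * MU / (C**2 * RE)

def sr_time_dilation(vs):
    """Special-relativistic Δf/f for orbital speed vs, offset by the ground
    observer's rotational term (a*Omega_e)^2 / (2c^2)."""
    return -vs**2 / (2 * C**2) + (A * OMEGA_E)**2 / (2 * C**2)

def daily_drift_us(rs, vs):
    """Net daily clock drift in microseconds."""
    total = gravitational_shift(rs) + sr_time_dilation(vs)
    return total * SECONDS_PER_DAY * 1e6

print(f"Galileo: {daily_drift_us(29600000.0, 3669.6):.2f} us/day")  # ≈ 40.83
print(f"GPS:     {daily_drift_us(26936715.0, 3874.0):.2f} us/day")  # ≈ 38.81
```

The Galileo result lands within a few nanoseconds per day of the 40.8298 μsec figure quoted above; the GPS result comes out near 38.8 μsec/day, close to the quoted 38.7036 μsec.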
Still, the "a" factor is not explained in the paper, and I don't understand its origin. If you drop the
second-order Ωe part, the SR formula changes by less than 2%, and it reduces to the first-order
approximation of the Lorentz transform:
Δt/t = −Vs²/(2·c²)
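To put a number on how much the (a·Ωe)² term actually matters, here is a quick check (again my own sketch; the value of a is an assumption, not from the paper):

```python
# How much does dropping the second-order (a*Omega_e)^2 term change the
# SR time-dilation estimate? Constants from the post; "a" is an assumed value.
C = 299792458.0            # speed of light, m/s
VS = 3669.6                # Galileo average orbital speed, m/s
A = 6383678.0              # "a" coefficient, m (my reconstruction)
OMEGA_E = 7.2921151467e-5  # Earth mean rotation rate, rad/s

full = -VS**2 / (2 * C**2) + (A * OMEGA_E)**2 / (2 * C**2)
first_order = -VS**2 / (2 * C**2)   # the first-order Lorentz approximation only

rel_diff = abs(full - first_order) / abs(full)
print(f"full SR term: {full:.5e}")
print(f"first-order:  {first_order:.5e}")
print(f"relative difference: {rel_diff:.2%}")   # ≈ 1.6% for Galileo's Vs
```

With the higher GPS velocity (~3,874 m/s) the same ratio comes out just under 1.5%, so the approximation is slightly better there.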
Excerpt from the 2015 Mudrak paper:
------------------------------------------------------------------------------------------------------------------------
4. Relativistic Corrections Implemented in Galileo
The relativistic effects analysed so far would introduce significant errors in the ranging and consequently
positioning accuracies of a GNSS system, unless they are carefully taken into account in the system design
and user algorithms.
In GPS and GLONASS the systematic relativistic net effect (i.e. the combination of the gravitational
frequency shift and the time dilation due to the orbital motion of the satellite) is compensated by adequately
offsetting the onboard clocks before launch, while time-varying effects are corrected at user receiver level.
It is anticipated that THIS IS NOT the approach in the Galileo system. As explained in detail in the following
paragraphs, the correction of relativistic errors is performed at user receiver level, on the basis of information
broadcasted through the navigation message.
------------------------------------------------------------------------------------------------------------------------
So, by 2015, Mudrak, De Simone, and Lisi were writing the paper "Relativistic Corrections in the European
GNSS Galileo" and were critical of the decision not to implement a pre-launch tuning solution like the one
used in GPS. As far as I could find, this criticism was ignored: Galileo doesn't use fixed pre-launch
corrections, and leaves the compensation to the manufacturers of Galileo receivers (for industrial,
scientific, or commercial use).
Do the manufacturers actually implement such corrections? That is what really matters to find out, because
in the GPS world the truth about this is buried under terabytes of data accumulated since 1985 (and some
of it is still confidential). That could be to protect IP, or to hide the truth.
*******************************************************************
More data and thoughts:
------------------------------------------------------------------------------------------------------------
1) In 1945, Isidor Rabi first publicly suggested that atomic beam magnetic
resonance might be used as the basis of a clock.
2) First functional atomic clock: an ammonia absorption-line device
at 23870.1 MHz, built in 1949 at the U.S. National Bureau of Standards
(NBS, now NIST). It was less accurate than existing quartz clocks, but
it served to demonstrate the concept.
3) In 1955 Louis Essen and Jack Parry produced the first practical cesium
atomic frequency standard.
Einstein, with the help of friends in Berlin and Princeton, invested the
last 30+ years of his life trying to unify gravitational theory with
quantum mechanics.
And yet not a single scientist between 1913 and 1975 questioned
the influence of gravity on the emission of spectral lines? Maybe, between
1925 and 1945, the physics community wasn't fully aware of
hyperfine transitions. After all, they had enough problems with the
fine-structure transitions that plagued the QM world and made Schrödinger
abandon his own work by the mid-1930s, tired of such complications.
But there is no excuse, since 1945, for any physicist not to have
thought about atomic clocks and height, particularly after 30 years
of General Relativity being around.
After all, Einstein had anticipated gravitational time dilation as early as
1907, as a consequence of special relativity applied to accelerated frames
of reference.
So, where is the truth in this GNSS (GPS, GLONASS, Galileo, etc.)
problem with relativity? True or false?