Hi everyone,
I noticed that the values of a.MinRTT for NDT5 measurements seem too low.
For example, 45.237.70.122 is an IP behind a satellite link that can be pinged at around 500 ms from São Paulo.
Looking at the raw data, it seems that a.MinRTT is equal to raw.S2C.TCPInfo.MinRTT divided by 1,000,000, which gives an (impossible) value of 0.599 ms:
SELECT date, raw.ClientIP, a.MinRTT, raw.S2C.MinRTT, raw.S2C.TCPInfo.MinRTT
FROM `measurement-lab`.ndt.ndt5
WHERE date = '2023-11-01' AND raw.ClientIP = '45.237.70.122'
[
  {
    "date": "2023-11-01",
    "ClientIP": "45.237.70.122",
    "MinRTT": "0.599",
    "MinRTT_1": "716000000",
    "MinRTT_2": "599000"
  }
]
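If the parser is really scaling raw.S2C.TCPInfo.MinRTT by 1,000,000, the implied divisor should sit near 1e6 across a whole day, not just for this one client. A quick sketch to check this, using the same table and columns as above (the implied_divisor alias is just mine):

-- Estimate the divisor that a.MinRTT appears to apply to raw.S2C.TCPInfo.MinRTT.
-- If the parser divides by 1,000,000 the quantiles should cluster near 1e6;
-- if it divided by 1,000 they would cluster near 1e3.
SELECT
  APPROX_QUANTILES(raw.S2C.TCPInfo.MinRTT / a.MinRTT, 4) AS implied_divisor
FROM `measurement-lab`.ndt.ndt5
WHERE date = '2023-11-01'
  AND a.MinRTT > 0
  AND raw.S2C.TCPInfo.MinRTT > 0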
However, we get much more plausible values if we divide by 1,000 instead. Here are the quantiles over the whole dataset for a single day:
SELECT
APPROX_QUANTILES(a.MinRTT, 10) a_min_rtt,
APPROX_QUANTILES(raw.S2C.TCPInfo.MinRTT / 1000, 10) raw_min_rtt
FROM `measurement-lab`.ndt.ndt5
WHERE date = '2023-01-01'
[
  {
    "a_min_rtt": [
      "-1.0",
      "-1.0",
      "-1.0",
      "-1.0",
      "-1.0",
      "0.0010680000000000002",
      "0.008627000000000001",
      "0.019338",
      "0.041866999999999994",
      "0.11881699999999999",
      "4294.967295"
    ],
    "raw_min_rtt": [
      "0.07",
      "5.294",
      "8.57",
      "12.981",
      "19.206",
      "27.687",
      "41.666",
      "79.015",
      "119.876",
      "143.79",
      "4294967.295"
    ]
  }
]
The values divided by 1,000 look far more plausible than the a.MinRTT values (raw.S2C.TCPInfo.MinRTT / 1,000,000).
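As one more cross-check, dividing by 1,000 also reproduces the expected latency for the satellite client from the first query (the two column aliases below are only for illustration):

-- Compare both candidate scalings for the known ~500 ms satellite path.
SELECT
  raw.S2C.TCPInfo.MinRTT / 1000 AS min_rtt_ms_div_1e3,      -- 599 ms: plausible
  raw.S2C.TCPInfo.MinRTT / 1000000 AS min_rtt_ms_div_1e6    -- 0.599 ms: implausible
FROM `measurement-lab`.ndt.ndt5
WHERE date = '2023-11-01' AND raw.ClientIP = '45.237.70.122'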
Best,
Maxime