Why does LinkCapacityEstimator still use a plain EWMA, rather than an EWMA of the mean plus a deviation term?

lan muming

Nov 25, 2025, 10:11:14 AM
to discuss-webrtc
My question: why doesn't the link capacity estimate use an EWMA of the mean combined with a mean-deviation term? I have annotated the relevant code below; my proposed changes are in the commented-out lines:
void LinkCapacityEstimator::Update(DataRate capacity_sample, double alpha /*, double h */) {
  double sample_kbps = capacity_sample.kbps();
  if (!estimate_kbps_.has_value()) {
    // aver_estimate_kbps = sample_kbps;
    // var_estimate_kbps = sample_kbps;
    estimate_kbps_ = sample_kbps;
  } else {
    // aver_estimate_kbps = (1 - alpha) * aver_estimate_kbps + alpha * sample_kbps;
    // var_estimate_kbps = (1 - h) * var_estimate_kbps + h * std::abs(estimate_kbps_.value() - aver_estimate_kbps);
    // estimate_kbps_ = aver_estimate_kbps + 4 * var_estimate_kbps;
    estimate_kbps_ = (1 - alpha) * estimate_kbps_.value() + alpha * sample_kbps;
  }
  // Estimate the variance of the link capacity estimate and normalize the
  // variance with the link capacity estimate.
  const double norm = std::max(estimate_kbps_.value(), 1.0);
  double error_kbps = estimate_kbps_.value() - sample_kbps;
  deviation_kbps_ =
      (1 - alpha) * deviation_kbps_ + alpha * error_kbps * error_kbps / norm;
  // 0.4 ~= 14 kbit/s at 500 kbit/s
  // 2.5f ~= 35 kbit/s at 500 kbit/s
  deviation_kbps_ = rtc::SafeClamp(deviation_kbps_, 0.4f, 2.5f);
}