Dear all,
Can anyone explain the rationale behind the objective function in loss-based BWE v2 (LossBasedBweV2)? The relevant code is below:
LossBasedBweV2::GetObjective(
    const ChannelParameters& channel_parameters) const {
  ...
  if (config_->use_byte_loss_rate) {
    objective +=
        temporal_weight *
        ((ToKiloBytes(observation.lost_size) * std::log(loss_probability)) +
         (ToKiloBytes(observation.size - observation.lost_size) *
          std::log(1.0 - loss_probability)));
    objective +=
        temporal_weight * high_bandwidth_bias * ToKiloBytes(observation.size);
  } else {
    ...
  }
  ...
}
The objective looks to me like a negated cross-entropy (i.e. a binomial log-likelihood over lost vs. delivered bytes), scaled by a temporal weight, plus a bias term. To maximize the objective, won't we simply drive loss_probability to 0 or 1?
Can anyone shed light on this?
Thanks.