I am developing a program in which the server side reads a video file and transfers it to the client side, which writes it to disk.
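For reference, here is a minimal sketch of the kind of transfer I mean, assuming a plain blocking TCP socket; the host, port, chunk size, and paths are placeholders, and my actual implementation differs in the details.

    # Minimal sketch only: server streams a file over a TCP socket,
    # client writes whatever it receives. All names and values are placeholders.
    import socket

    HOST, PORT = "0.0.0.0", 5000      # placeholder address
    CHUNK = 64 * 1024                 # placeholder buffer size

    def serve_file(path):
        """Accept one connection and send the file in fixed-size chunks."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn, open(path, "rb") as f:
                while True:
                    chunk = f.read(CHUNK)
                    if not chunk:
                        break
                    conn.sendall(chunk)   # blocks until the kernel accepts the data

    def receive_file(server_ip, path):
        """Connect, read until the server closes the connection, write to disk."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect((server_ip, PORT))
            with open(path, "wb") as f:
                while True:
                    data = cli.recv(CHUNK)
                    if not data:
                        break
                    f.write(data)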
When I introduce network delay or packet loss, the total communication time increases significantly, and the results seem abnormal compared to what I would expect from standard TCP.
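To illustrate what I mean by introducing delay and loss: something along the lines of Linux tc/netem, e.g. 100 ms of delay and 1% random loss. The tool, interface name, and values below are only examples, not necessarily my exact setup.

    # Example only: add 100 ms delay and 1% random loss on eth0, then remove it.
    sudo tc qdisc add dev eth0 root netem delay 100ms loss 1%
    sudo tc qdisc del dev eth0 root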
In my Wireshark captures, I noticed that in some cases it takes about 1 second before a retransmission starts. I have not been able to work out why the added latency triggers this particular behavior.
Could this be an issue in my implementation, or is it known behavior in this kind of environment? Any advice would be appreciated.