Measuring service calls response time in Fiddler and C# application


Alex Under

Sep 30, 2013, 8:29:54 AM9/30/13
to httpf...@googlegroups.com, Olexander Valetsky
I'm looking for a way to measure service call response time in Fiddler so that it includes all stages of the process (creating the request, sending it, getting the response). Like this:

    var start = DateTime.Now;

    // client is an auto-generated C# SoapHttpClientProtocol proxy for the service
    var response = client.GetWebMethod();

    var finish = DateTime.Now;
    var elapsed = (finish - start).TotalMilliseconds;


The documentation proposes using the difference between the ClientDoneRequest and ClientDoneResponse timers:

    var elapsed = (oSession.Timers.ClientDoneResponse - oSession.Timers.ClientDoneRequest).TotalMilliseconds;


The results I'm getting differ by around 100%, and Fiddler's values are surprisingly two times smaller (I expected the opposite, since the proxy should add some overhead for passing requests through).
It seems what I'm really looking for is ClientDoneResponse - ClientBeginRequest here, but the values of both of these timers (ClientBeginRequest and ClientDoneRequest) are exactly equal in my case. Any ideas how to get at least approximately close numbers in Fiddler? Thanks in advance.

EricLaw

Oct 2, 2013, 2:04:51 AM10/2/13
to httpf...@googlegroups.com, Olexander Valetsky

This question was also asked here: http://stackoverflow.com/questions/19094272/measuring-wcf-service-calls-response-time-in-fiddler-and-c-sharp-application

In your code, you're using DateTime.Now, which is limited to the Windows clock resolution (15.7ms). For higher precision, you should use the Stopwatch class instead.

ClientDoneResponse - ClientBeginRequest measures the time between the client sending the first TCP/IP packet of the request to Fiddler and Fiddler sending the final TCP/IP packet of the response back to the client.
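A minimal sketch of the timing code rewritten around Stopwatch, as suggested above (the client.GetWebMethod() call stands in for the auto-generated proxy method from the question):

    using System.Diagnostics;

    // Stopwatch uses the high-resolution performance counter,
    // so it is not limited to the ~15ms Windows clock tick
    // that constrains DateTime.Now.
    var sw = Stopwatch.StartNew();

    // client is the auto-generated SoapHttpClientProtocol proxy
    var response = client.GetWebMethod();

    sw.Stop();
    var elapsedMs = sw.Elapsed.TotalMilliseconds;

Note that this still measures the full client-side round trip (proxy serialization, transfer through Fiddler, and deserialization), so it should read somewhat higher than Fiddler's ClientDoneResponse - ClientBeginRequest span, not lower.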

