I've been working on some tools to extract response times from TCP
traffic, and will be developing push-button ability ;-) to do USL
modeling on that. One outstanding item is a nagging curiosity about the
accuracy of the timings I get from tcpdump. It reports timestamps with
microsecond precision, but the accuracy can't really match that: the
timestamps reflect when the kernel's capture layer saw the packet, not
when the application actually sent or received the data, so kernel TCP
buffering and scheduling blur things. I'm not sure how much I should
care about this.
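To make the idea concrete, here's a toy sketch of what I mean by extracting response times: pair each inbound "request" packet with the next outbound packet on the same connection and take the time delta. The event tuples are hypothetical pre-parsed capture output (timestamp in seconds, connection id, direction), not real tcpdump records:

```python
# Toy sketch: derive response times from pre-parsed packet events.
# Each event is (timestamp_seconds, connection_id, direction), where
# direction "in" is a request and anything else is traffic back out.
# These events are made up for illustration, not real tcpdump output.

def response_times(events):
    """Yield (conn, response_time) for each request/response pair."""
    pending = {}  # conn -> timestamp of last unanswered request
    for ts, conn, direction in events:
        if direction == "in":            # request arrives
            pending[conn] = ts
        elif conn in pending:            # first reply after a request
            yield conn, ts - pending.pop(conn)

events = [
    (0.000100, "c1", "in"),
    (0.000350, "c1", "out"),
    (0.000400, "c2", "in"),
    (0.001900, "c2", "out"),
]
for conn, rt in response_times(events):
    print(conn, round(rt * 1e6), "us")
```

The real tool obviously has to reassemble TCP streams and decide what counts as a "request", which is where the hard part lives; this just shows the timing arithmetic.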
I'm really itching to learn what Neil presented at Hotsos this year,
tying the USL and response times together. I couldn't be there and
I've been pining over that. Or maybe that's whining. Does anyone,
such as the fabled DrQ himself, have some materials I could study to
learn about this? If so, maybe I'll see some more possibilities for
this toolset.
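I don't know what Neil's actual approach was, but the back-of-the-envelope connection I can derive myself goes through Little's Law: fit the usual USL form for throughput X(N), then back out response time as R(N) = N / X(N) (zero think time). A minimal sketch, with made-up parameter values:

```python
# The standard USL form for throughput at concurrency N, plus Little's
# Law to back out a response time. Parameter values are made up.

def usl_throughput(n, lam, sigma, kappa):
    """Gunther's Universal Scalability Law:
    X(N) = lambda*N / (1 + sigma*(N-1) + kappa*N*(N-1))."""
    return lam * n / (1 + sigma * (n - 1) + kappa * n * (n - 1))

def response_time(n, lam, sigma, kappa):
    """Little's Law with zero think time: R(N) = N / X(N)."""
    return n / usl_throughput(n, lam, sigma, kappa)

lam, sigma, kappa = 1000.0, 0.02, 0.0005   # hypothetical fit results
for n in (1, 8, 32, 64):
    x = usl_throughput(n, lam, sigma, kappa)
    r = response_time(n, lam, sigma, kappa)
    print(n, round(x, 1), round(r * 1000, 3), "ms")
```

If that's roughly the shape of what was presented, then the response times I'm pulling out of packet captures plug straight into the fit. If it's something cleverer, all the more reason I'd like the materials.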
I'm trying to learn R, too. So far I can about half-read other
people's code, but I'm a far cry from knowing enough to write any
myself.
Baron