in reply to Timing Web Page Requests
Most pages of interest aren't trivial, which means that after loading the base page, you need to parse it to find style sheets, frames, images, etc., so that you can load the additional components. That parsing isn't a trivial exercise either, particularly if the images are loaded by way of JavaScript (say, for rollovers) or via a style sheet. Simulating a browser can take a lot of work; you can sink a lot of time into parsing JavaScript and CSS.
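Here's a minimal sketch of the parse-and-refetch approach, using only the Python standard library; the URL is a placeholder. It only sees resources referenced directly in the static HTML (img/script src, link href), so anything pulled in by JavaScript or CSS is invisible to it, which is exactly the problem described above.

```python
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class ResourceFinder(HTMLParser):
    """Collect URLs of sub-resources referenced in static HTML."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script", "frame", "iframe") and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):   # style sheets, favicons
            self.resources.append(attrs["href"])

def time_page(url):
    start = time.monotonic()
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    base_elapsed = time.monotonic() - start

    # Re-fetch every sub-resource found in the base page.
    finder = ResourceFinder()
    finder.feed(html)
    for ref in finder.resources:
        urllib.request.urlopen(urljoin(url, ref)).read()
    return base_elapsed, time.monotonic() - start, finder.resources

base, total, found = time_page("http://example.com/")
print(f"base page: {base:.3f}s; with {len(found)} sub-resources: {total:.3f}s")
```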
To further complicate things, there's caching and session "keep-alive" behavior to emulate.
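As a rough sketch of what that emulation looks like, the following uses Python's http.client to hold one TCP connection open (the keep-alive part) and then revalidate a "cached" image with a conditional GET, the way a caching browser would; the hostname and path are placeholders, and an HTTP/1.1 server is assumed.

```python
import http.client
from email.utils import formatdate

conn = http.client.HTTPConnection("example.com")  # one connection, reused

# First request: fetch the resource and remember its validator.
conn.request("GET", "/logo.gif")
resp = conn.getresponse()
resp.read()                          # must drain the body to reuse the socket
last_modified = resp.getheader("Last-Modified")

# Revalidation: a conditional GET against the cached copy.
conn.request("GET", "/logo.gif",
             headers={"If-Modified-Since": last_modified or formatdate()})
resp = conn.getresponse()
resp.read()
print(resp.status)   # 304 Not Modified means the cached copy is still good
conn.close()
```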
And if you want a really accurate read on how long it takes to load a page, you have to pull out the big guns and analyze packet traces. That's probably more than you need, but it's insightful to do at least once.
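One way to pull timings out of a trace, as a sketch: this assumes you've already recorded the exchange with something like tcpdump -w trace.pcap and have the third-party scapy library installed; the file name and port filter are placeholders.

```python
from scapy.all import rdpcap, TCP

# Keep only the TCP packets that belong to the HTTP conversation.
packets = [p for p in rdpcap("trace.pcap")
           if p.haslayer(TCP) and 80 in (p[TCP].sport, p[TCP].dport)]

first, last = packets[0].time, packets[-1].time
wire_bytes = sum(len(p) for p in packets)   # includes all header overhead
print(f"{len(packets)} packets, {wire_bytes} bytes on the wire, "
      f"{last - first:.3f}s from first SYN to last segment")
```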
So, how accurate a timing do you need?
Update: And if you're counting bytes, don't forget to count both the HTTP headers and the TCP/IP headers. It's surprisingly expensive, in terms of packets and data exchanged, to verify that a .GIF you have cached locally hasn't changed.
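Back-of-the-envelope arithmetic for that cached .GIF, as a sketch: a conditional GET that comes back 304 still costs a full request/response exchange. The header text below is illustrative, not measured.

```python
# Typical conditional GET and its 304 reply (illustrative contents).
request = (b"GET /logo.gif HTTP/1.1\r\n"
           b"Host: example.com\r\n"
           b"If-Modified-Since: Sat, 01 Feb 2003 00:00:00 GMT\r\n"
           b"\r\n")
response = (b"HTTP/1.1 304 Not Modified\r\n"
            b"Date: Sat, 01 Feb 2003 00:00:01 GMT\r\n"
            b"\r\n")

IP_TCP_HEADERS = 40   # 20-byte IP header + 20-byte TCP header, no options
packets = 2           # one data packet each way, ignoring bare ACKs

wire_bytes = len(request) + len(response) + packets * IP_TCP_HEADERS
print(f"~{wire_bytes} bytes just to learn the cached .GIF hasn't changed")
```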