
Measuring web client performance

Today I was asked about the best way to measure client-side performance for heavily used web applications. Since I spend most of my time in Java/J2EE, I mostly end up using tools written for Java. I rely on two primary tools for client-side performance measurement: IBM's Page Detailer (available at AlphaWorks) and the Charles Web Debugging Toolkit (available at SourceForge).

Of the two, IBM's Page Detailer does what almost no other tool does: it provides accurate samples of "page weight," that is, how heavy a page is with respect to the number of resources it contains. Any web page consists of the HTML itself (one object) plus numerous other objects such as JavaScript, CSS, and other external resources. Every one of those external resources requires a connection to the web server, and if you count up those extra "hits" you'll see how they slow your pages down. Page Detailer, which works as a proxy server, is an excellent tool to help you measure, and therefore reduce, the size and number of your pages' external resources.

The Charles Web Debugger is a great way to simulate, using software-based throttling, how your end users will perceive your application's performance. It also works as a proxy server, intercepting your calls and providing all kinds of statistics about page size, number of connections, speed, and so on.

If you need to measure client-side performance (and who doesn't!), you should definitely give both the IBM Page Detailer and the Charles Web Debugger a spin.
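To make the "page weight" idea concrete, here is a rough sketch, not part of either tool, that fetches a page and counts the external resources its HTML references. Each match is one extra request the browser has to make. The class name, the deliberately naive regex, and the example URL are all my own illustration; a real tool parses the HTML properly instead of pattern-matching it.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PageWeight {
    // Naive pattern for script src, link href, and img src attributes.
    private static final Pattern RESOURCE = Pattern.compile(
            "<(?:script[^>]*\\ssrc|link[^>]*\\shref|img[^>]*\\ssrc)\\s*=\\s*[\"']([^\"']+)[\"']",
            Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) throws Exception {
        String url = args.length > 0 ? args[0] : "https://example.com/";
        StringBuilder html = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(url).openStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                html.append(line).append('\n');
            }
        }
        // Every hit below is one more round trip to the web server.
        Matcher m = RESOURCE.matcher(html);
        int hits = 0;
        while (m.find()) {
            System.out.println("external resource: " + m.group(1));
            hits++;
        }
        System.out.println("HTML size: " + html.length() + " chars");
        System.out.println("Extra requests implied: " + hits);
    }
}
```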
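And since both tools work as proxies, pointing a Java client at them is easy: the JVM's standard http.proxyHost and http.proxyPort system properties route its HTTP traffic through whatever proxy you name. The sketch below assumes Charles listening on localhost:8888, which is its default port as far as I recall; adjust if yours differs.

```java
import java.io.InputStream;
import java.net.URL;

public class ProxiedFetch {
    public static void main(String[] args) throws Exception {
        // Equivalent to: java -Dhttp.proxyHost=localhost -Dhttp.proxyPort=8888 ...
        System.setProperty("http.proxyHost", "localhost");
        System.setProperty("http.proxyPort", "8888");

        // This request now flows through the proxy, which can throttle it
        // and record per-request size and timing statistics.
        URL url = new URL(args.length > 0 ? args[0] : "http://example.com/");
        long start = System.currentTimeMillis();
        long bytes = 0;
        try (InputStream in = url.openStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                bytes += n;
            }
        }
        System.out.println(bytes + " bytes in "
                + (System.currentTimeMillis() - start) + " ms via proxy");
    }
}
```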
