WEB Advent 2009 / Performance for Web Apps

When it comes to programming, there are two things that I enjoy doing the most: making things faster and improving security. The two sometimes come into conflict, because security often requires validation overhead, and performance often leads to cutting corners. Balancing the two is often loads of “fun.”

That said, when it comes to security, things are relatively straightforward. Filter your input, escape your output, and don’t trust your users. On the performance side, things are not nearly as straightforward, especially when it comes to web apps.

The basic approach is fairly simple; just profile your app, determine the slow areas, optimize them, and voila, things are faster. Alas, even when this approach is followed, the results don’t necessarily eliminate the “your site is slow” feedback.

Browser bottleneck

For any app, the measure of speed is ultimately all about user perception. In a traditional environment where binaries are deployed on a user’s machine, you control the app, so optimizing the interaction is part of the optimization process. If a report screen takes too long to load, your profiling efforts will identify that, and you speed up the rendering function.

For web apps, things are a little different, because the UI is controlled by a web browser — a third-party app that you have no control over — and it is responsible for rendering your data for your users. Even if you output your content in one hundredth of a second, by the time the content is downloaded, parsed, executed, and rendered, a few seconds may have passed, and the user experience is diminished. This is something most web app optimization attempts miss, since they focus almost exclusively on how quickly content is generated.

In fact, for web apps, it is often better to start your optimization efforts from the user side of things rather than the server.

Making things better

Making the user experience better — and by better I mean faster — is not too complicated.

As with any optimization, the first step is to understand the nature of the problem. Good tools include Firebug for Firefox or the built-in Developer Tools in Google Chrome. Using either, you can understand how big a page is and what additional resources (CSS, JavaScript, images, &c.) it loads — in what order and how quickly. This will give you a base to work with and identify the areas of greatest interest. What follows are some easy improvements, ordered by ease of implementation.

Load time optimizations

- Enable compression using something like mod_gzip. It’ll speed up page loading by significantly reducing the size of text content.
- Send caching headers (Expires, Last-Modified, ETag, &c.) for static content (images, CSS, JavaScript, &c.) to reduce redundant requests.
- Optimize images; in most cases, you can reduce the size of images by up to 30% with lossy compression that isn’t visually noticeable, color reductions, and the like.
- Compress JavaScript using tools such as JSMin, and compress CSS using YUI Compressor.
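On Apache, the first two tips might look roughly like this — a sketch assuming the mod_deflate and mod_expires modules are enabled (mod_gzip, mentioned above, uses its own directives), with MIME types and lifetimes you’d adjust for your site:

```apache
# Compress text content before sending it (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Far-future Expires headers for static resources (mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png              "access plus 1 month"
    ExpiresByType text/css               "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```

If you do send far-future Expires headers, remember to rename (or version) files when they change, or returning visitors will keep using the cached copy.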

Check your cookies

While broadband connections can download hundreds of kilobytes per second, uploading is another matter entirely; most DSL and cable connections can only upload 20 to 30 kilobytes per second. If your page contains 10 resources, and the size of all the cookies is a mere 1 kilobyte, sending the requests alone can still take over half a second. Keep in mind there are also 400 to 500 bytes of headers per request. Therefore, it is important that you reduce your cookie sizes to just the session identifier. A slightly more complex solution is to move your static content to another domain, so your cookies are only sent to the pages that need them instead of all of the static resources as well.
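The arithmetic behind that estimate can be sketched as follows — the byte counts are the rough figures from above, not measurements:

```javascript
// Rough upload-time estimate for a page whose requests carry cookies.
const requests = 10;                  // resources on the page
const cookieBytes = 1024;             // ~1 KB of cookies sent with every request
const headerBytes = 450;              // ~400-500 bytes of other request headers
const uploadBytesPerSec = 25 * 1024;  // ~25 KB/s typical DSL/cable upstream

const totalUploadBytes = requests * (cookieBytes + headerBytes);
const seconds = totalUploadBytes / uploadBytesPerSec;

console.log(seconds.toFixed(2)); // roughly 0.58 seconds just to send the requests
```

Note that the cookies alone account for over two thirds of those bytes, which is why trimming them down to a session identifier pays off so quickly.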

Reduce request count

Merge your images; rather than sending a dozen 1 kilobyte images, combine them into a single image, and use CSS to show parts of the image. If you are loading more than 2 JavaScript or CSS files per page, consider merging them into a single file. This will make things faster, but don’t merge too much, otherwise you’ll introduce parsing overhead.
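The merged-image technique (a CSS sprite) is used roughly like this — the file name, class names, and offsets are made up for illustration:

```css
/* One merged image (sprite.png, hypothetical) holds all the icons;
   each class shows a different 16x16 region of it. */
.icon      { background: url(sprite.png) no-repeat; width: 16px; height: 16px; }
.icon-home { background-position: 0 0; }
.icon-mail { background-position: -16px 0; } /* second icon, 16px to the right */
```

The browser downloads sprite.png once, and every icon afterward costs no extra request.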

Improve JavaScript loading

After you’ve merged your common JavaScript code into a single file, you may still have other JavaScript files that you load only on the specific pages that need them. Unfortunately, browsers don’t load JavaScript files in parallel like they do with images; to prevent dependency breakage, most browsers load them sequentially. If you have multiple JavaScript files in the <head> section, this can substantially slow down page loading, because the body won’t be rendered until all of them are loaded. To avoid this, you can use PHP to output a small JavaScript snippet inside <head> that registers a window.onload callback, which in turn writes the <script> tags that load the remaining JavaScript files. This effectively defers the loading and parsing of that JavaScript until after the page has loaded.
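A minimal sketch of that snippet — the function and file names are placeholders, not a standard API; a PHP template would echo this into <head> along with a call like deferScripts(['reports.js']) listing that page’s files:

```javascript
// Defers download and parsing of page-specific scripts until the page has
// rendered: instead of static <script src> tags in <head>, the scripts are
// appended to the document only once window.onload fires.
function deferScripts(files) {
  window.onload = function () {
    var head = document.getElementsByTagName('head')[0];
    for (var i = 0; i < files.length; i++) {
      var script = document.createElement('script');
      script.type = 'text/javascript';
      script.src = files[i];
      head.appendChild(script); // the browser starts fetching the file here
    }
  };
}
```

If you already use window.onload for something else, attach via an event listener instead of assignment so the handlers don’t clobber each other.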

These small optimizations can make a big difference. I hope they’re helpful!