10 Ways to Improve Your Web Page Performance

Dec 22 2008 by Jacob Gube

There are a million and one ways to boost your website’s performance. The methods vary and some are more involved than others. The three main areas that you can work on are: hardware (your web server), server-side scripting optimization (PHP, Python, Java), and front-end performance (the meat of the web page).

This article primarily focuses on front-end performance since it’s the easiest to work on and provides you the most bang for your buck.

Why focus on front-end performance?

The front-end (i.e. your HTML, CSS, JavaScript, and images) is the most accessible part of your website. If you’re on a shared web hosting plan, you might not have root (or root-like) access to the server and therefore can’t tweak and adjust server settings. And even if you do have the right permissions, web server and database engineering require specialized knowledge before they yield any immediate benefits.

It’s also cheap. Most of the front-end optimization discussed here can be done at no cost other than your time. Not only is it inexpensive, but it’s the best use of your time because front-end performance is responsible for a very large part of a website’s response time.

With this in mind, here are a few simple ways to improve the speed of your website.

1. Profile your web pages to find the culprits

Firebug screenshot.

It’s helpful to profile your web page to find components that you don’t need or components that can be optimized. Profiling a web page usually involves a tool such as Firebug to determine what components (i.e. images, CSS files, HTML documents, and JavaScript files) are being requested by the user, how long the component takes to load, and how big it is. A general rule of thumb is that you should keep your page components as small as possible (under 25KB is a good target).

Firebug’s Net tab (shown above) can help you hunt down huge files that bog down your website. In the above example, you can see that it gives you a breakdown of all the components required to render a web page, including what each one is, where it is, how big it is, and how long it took to load.

There are many tools on the web to help you profile your web page – check out this guide for a few more tools that you can use.

2. Save images in the right format to reduce their file size

JPEG vs GIF screenshot.

If you have a lot of images, it’s essential to learn about the optimal format for each image. There are three common web image file formats: JPEG, GIF, and PNG. In general, you should use JPEG for realistic photos with smooth gradients and color tones. You should use GIF or PNG for images that have solid colors (such as charts and logos).

GIF and PNG are similar, but PNG typically produces a smaller file size. Read Coding Horror’s weigh-in on using PNGs over GIFs.

3. Minify your CSS and JavaScript documents to save a few bytes

Minification is the process of removing unneeded characters (such as tabs, spaces, and source code comments) from the source code to reduce its file size. For example:

This chunk of CSS:

.some-class {
  color: #ffffff;
  line-height: 20px;
  font-size: 9px;
}

can be converted to:

.some-class{color:#fff;line-height:20px;font-size:9px;}

…and it’ll work just fine.

And don’t worry – you won’t have to reformat your code manually. There’s a plethora of free tools at your disposal for minifying your CSS and JavaScript files. For CSS, you can find a bunch of easy-to-use tools in this CSS optimization tools list. For JavaScript, some popular minification options are JSMin, the YUI Compressor, and JavaScript Code Improver. A good minifying application gives you the ability to reverse the minification for when you’re in development. Alternatively, you can use an in-browser tool like Firebug to see the formatted version of your code.
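
To see what minification actually does to JavaScript, here’s a small before-and-after sketch (the function is made up, and the exact output varies by tool):

// Before minification: whitespace and comments aid readability.
function addTax(price) {
  // hypothetical 8% sales tax rate
  var taxRate = 0.08;
  return price * (1 + taxRate);
}

A minifier strips the comments and whitespace (and a tool like the YUI Compressor will also shorten local variable names), producing something like:

function addTax(a){var b=0.08;return a*(1+b)}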

4. Combine CSS and JavaScript files to reduce HTTP requests

For every component that’s needed to render a web page, an HTTP request is created to the server. So, if you have five CSS files for a web page, you would need at least five separate HTTP GET requests for that particular web page. By combining files, you reduce the HTTP request overhead required to generate a web page.

Check out Niels Leenheer’s article on how you can combine CSS and JS files using PHP (which can be adapted to other languages). SitePoint discusses a similar method of bundling your CSS and JavaScript; they were able to shave 1.6 seconds off their response time – a 76% reduction from the original.

Otherwise, you can combine your CSS and JavaScript files using good old copy-and-pasting (works like a charm).
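
If you’d like to see the combining idea in code, here’s a minimal PHP sketch along the lines of the articles above (the file names are hypothetical, and a production version should also send caching headers):

<?php
// combine.php - serve several stylesheets as a single response,
// cutting several HTTP requests down to one.
header('Content-Type: text/css');

$files = array('reset.css', 'layout.css', 'typography.css');
foreach ($files as $file) {
  readfile($file); // stream each file straight to the output
  echo "\n";
}
?>

Your pages would then reference just one stylesheet: <link rel="stylesheet" type="text/css" href="combine.php" />.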

5. Use CSS sprites to reduce HTTP requests

CSS Sprites on Digg.

A CSS sprite is a combination of smaller images into one big image. To display the correct image, you adjust the background-position CSS property. Combining multiple images this way reduces HTTP requests.

For example, on Digg (shown above), you can see individual icons for user interaction. To reduce server requests, Digg combined several icons in one big image and then used CSS to position them appropriately.

You can do this manually, but there’s a web-based tool called CSS Sprite Generator that gives you the option of uploading images to be combined into one CSS sprite, and then outputs the CSS code (the background-position rules) to render the images.
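
As a rough sketch (the file name, icon sizes, and class names below are hypothetical), the CSS for a sprite containing four 40×16-pixel icons laid side by side could look like this:

.icon {
  background: url(sprite.png) no-repeat;
  width: 40px;
  height: 16px;
}

/* Shift the sprite so only the wanted slice shows through. */
.icon-comment { background-position: 0 0; }
.icon-share { background-position: -40px 0; }
.icon-bury { background-position: -80px 0; }
.icon-digg { background-position: -120px 0; }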

6. Use server-side compression to reduce file sizes

This can be tricky if you’re on a shared web host that doesn’t already offer server-side compression, but to fully optimize the serving of page components, they should be compressed. Compressing page objects is similar to zipping up a large file that you send through email: you (the web server) zip up a large family picture (the page component) and email it to your friend (the browser); they in turn unpack your ZIP file to see the picture. Popular compression methods are Deflate and gzip.

If you run your own dedicated server or have a VPS, you’re in luck: if compression isn’t already enabled, installing a module to handle it is a cinch. Check out this guide on how to install mod_gzip on Apache.
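
On Apache 2, the bundled mod_deflate module does the same job as mod_gzip. As a minimal sketch, assuming the module is enabled, a few lines in your .htaccess file will compress the most common text-based components:

<IfModule mod_deflate.c>
  # Compress text-based components before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
</IfModule>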

7. Avoid inline CSS and JavaScript

By default, external CSS and JavaScript files are cached by the user’s browser. When a user navigates away from the landing page, they will already have your stylesheets and JavaScript files, which saves them from having to download your styles and scripts again. If you embed a lot of CSS and JavaScript directly in your HTML documents, you won’t be taking advantage of the web browser’s caching features.
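
In practice, that means swapping <style> and <script> blocks in your HTML for external references along these lines (the file names are hypothetical):

<!-- Cacheable across pages, unlike inline styles and scripts -->
<link rel="stylesheet" type="text/css" href="/css/global.css" />
<script type="text/javascript" src="/js/global.js"></script>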

8. Offload site assets and features

Amazon S3Fox for Six Revisions.

Offloading some of your site assets and features to third-party web services greatly reduces the work of your web server. The principle is that you share the burden of serving page components with another server.

You can use Feedburner to handle your RSS feeds, Flickr to serve your images (be aware of the implications of offloading your images though), and the Google AJAX Libraries API to serve popular JavaScript frameworks/libraries like MooTools, jQuery, and Dojo.
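
To illustrate, loading jQuery from the Google AJAX Libraries API is a one-line change (the version number below is just an illustration – check Google’s documentation for the currently hosted versions):

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>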

For example, on Six Revisions I use Amazon’s Simple Storage Service (Amazon S3 for short) to handle the images you see on this page, as well as Feedburner to handle RSS feeds. This allows my own server to handle just the serving of HTML, CSS, and CSS background images. Not only are these solutions cost-effective, but they drastically reduce the response times of web pages.

9. Use Cuzillion to plan out an optimal web page structure

Cuzillion screenshot.

Cuzillion is a web-based application created by Steve Souders (a front-end engineer at Google, formerly Chief Performance Yahoo!) that helps you experiment with different configurations of a web page’s structure to find the optimal one. If you already have a web page design, you can use Cuzillion to simulate your web page’s structure and then tweak it to see if you can improve performance by moving things around.

Watch InsideRIA’s video interview with Steve Souders discussing how Cuzillion works, and read the Help guide to get started quickly.

10. Monitor web server performance and create benchmarks regularly

The web server is the brains of the operation – it’s responsible for receiving HTTP requests and sending back responses, and it serves all of your web page components. If your web server isn’t performing well, you won’t get the maximum benefit of your optimization efforts.

It’s essential to check your web server regularly for performance issues. If you have root-like access and can install things on the server, check out ab – the Apache HTTP server benchmarking tool – or httperf from HP.
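
As a quick sketch of how ab is used (substitute your own URL; the request counts are arbitrary):

# 100 requests total, 10 at a time; look at the "Requests per second"
# and "Time per request" lines in the resulting report.
ab -n 100 -c 10 http://www.example.com/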

If you don’t have access to your web server (or have no clue what I’m talking about), you’ll want to use a remote tool like Fiddler or HTTPWatch to analyze and monitor HTTP traffic. Both will point out trouble spots for you to take a look at.

Benchmarking before and after making major changes will also give you some insight on the effects of your changes. If your web server can’t handle the traffic your website generates, it’s time for an upgrade or server migration.

Got more?

Like I said, there are a million and one ways to improve web page performance. If you’ve got a bit more input on the points above or have additional tips you’d like to discuss, please share them with us in the comments.

51 Comments

mnop

December 22nd, 2008

Thanks for the nice tips. I will use these on my blog for sure. By the way, the link to the “CSS optimization tools list” seems broken.

bench

December 22nd, 2008

Hey, this is a very timely article; I was looking for something like this while trying to optimize my site this weekend. May I add that people should try to use as few JavaScript files as possible?

I have a question regarding using Amazon S3 to host my images. I have been hosting my images on Flickr with a pro account (no limit on images, as opposed to the free account). If I use S3, I get charged by bandwidth, which is fine by me, but if someone else starts linking to an image I will be paying through the nose. Any way to prevent that?

You have quite a bit of traffic on this site (maybe an understatement); how much does your Amazon S3 bandwidth bill come out to be?

Thanks for the info!

insic

December 22nd, 2008

Nice tips. #8 is a good idea, I might try this.

krishna

December 22nd, 2008

Very good article. Tweeted this one :)

Thibauld

December 22nd, 2008

Hi,

This is a great post, thanks for this! In case you’re interested, I also blogged on web application development methodology, focusing less on the technical part than on the general process needed before coding. I thought you might find it an interesting read… Anyway, thanks for this post, I’ll be sure to check your blog regularly from now on!
Cheers,

Jacob Gube

December 22nd, 2008

@mnop: Oops, someone must have switched my coffee to decaf, thanks for letting me know. Fixed.

@Thibauld: Thanks, and that was an excellent piece!

Jim Gaudet

December 22nd, 2008

Funny, you hit the FP of Digg quite often, but no one comments on your kick ass post? Anyway, maybe no one sees the importance of speeding up their websites.

Great resources. I use HTTPWatch myself. I have to redesign my blog, but my website loads pretty fast and is valid.

Jacob Gube

December 22nd, 2008

@bench: Hotlinking is an issue with Amazon S3; there’s no official way to prevent it. But there’s a way for you to authenticate the source URL to prevent hotlinking of images. This thread should help you get started with your research.

About Six Revisions’ S3 bill: considering I post tons of images per post, with pages sometimes getting up to 500KB (and being requested a ton of times), the bill is huge – but not more than it would be if I bought and maintained dedicated servers to host static files.

Anyone else got any input on Amazon S3 hotlinking, please do share it – I’d love to start a discussion on that.

Dan

December 22nd, 2008

Don’t forget about YSlow (https://addons.mozilla.org/en-US/firefox/addon/5369) as an add-on to Firebug/FF. It was also created by Steve Souders and covers many of these optimization techniques.

Jacob Gube

December 22nd, 2008

@Dan: You mentioned something that should’ve been included up there (I did link to another article that discusses YSlow, but I should have still mentioned it). Yep, YSlow’s a remnant of Steve Souders’ work at Yahoo! He also started the Best Practices list that’s the basis for YSlow.

Russell Heimlich

December 22nd, 2008

You can use .htaccess to prevent hotlinking of images and stealing of bandwidth -> http://altlab.com/htaccess_tutorial.html
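
For reference, the core of that technique is a few mod_rewrite rules in .htaccess – a minimal sketch, assuming Apache with mod_rewrite enabled and example.com standing in for your own domain:

RewriteEngine On
# Let requests with no referer through (direct visits, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# Refuse image requests referred from anywhere but your own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]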

Bryon Sutherland

December 22nd, 2008

You left out a big one: setting content expiration on the web server. If you set the ‘lifetime’ of some of your common images (like navigation images) to a day or more, you can reduce HTTP requests dramatically.

khurram

December 22nd, 2008

For ASP.NET applications in particular, the following techniques can be used.
http://www.codeproject.com/KB/aspnet/PeformanceAspnet.aspx

Jacob Gube

December 22nd, 2008

@Russell Heimlich: Great guide but I’m not too sure how applicable it is to Amazon S3. I’ve honestly never played around too much with Amazon S3.

@Bryon Sutherland: I realized that I forgot to mention this right after hitting the publish button (and I also realized that someone would eventually talk about it in the comments). You’re right, making sure you serve files that are static (like images) with (far) future Expires headers saves a lot of resources for primed caches. Thanks for the reminder! The one thing I don’t like about it is that you have to version your scripts so that you serve updated versions whenever you make changes (and then update the references to the new script name). Easy solution: Don’t set it for your JS/CSS files (though they stand to benefit the most with regards to future expires headers).
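
For anyone who wants to try this, here is a minimal far-future Expires sketch for Apache (assuming mod_expires is enabled), limited to images per the caveat above:

<IfModule mod_expires.c>
  ExpiresActive On
  # Static images rarely change, so let browsers cache them for a year
  ExpiresByType image/gif "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
</IfModule>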

@khurram: Thanks for the cool guide for ASP.NET.

SC

December 23rd, 2008

Great article. It’s also worth looking at Google’s website optimiser when you’re looking at the site design to see how your users act when shown different variants of your webpage

Simon Vallee

December 23rd, 2008

Great post. Over and above just hosting images on S3, Amazon has a fully fledged content distribution network as well now.

Brian

December 23rd, 2008

I would add a fourth “main area” to focus on for performance: your database. Optimized application code and streamlined websites will still slow to a crawl if your database is poorly indexed or if your queries are poorly formed.

Bob (Buffone)

December 23rd, 2008

Rockstarapps.com has an Eclipse-based tool that makes #3, 4, 6, and 7 as easy as right-clicking on a group of files or selecting the tags in the HTML file. You can find more information at http://rockstarapps.com/pmwiki/pmwiki.php?n=JsLex.JsLex.

John Rockefeller

December 23rd, 2008

A good post. If more people read this, the web would be much faster. One note about #8… By having your content come from multiple domains, you will be doing a DNS resolution for each of those domains, which does slow you down a bit. But, offloading your images and other larger files to higher-capacity servers more than makes up for this, usually… Unless you’ve got a slow DNS service with your ISP…

A neat feature of Google Chrome is that it caches DNS resolutions locally. So once it resolves a domain once it saves it for future use. Pretty neat!

Lee Munroe

December 23rd, 2008

Nice tips! Never really thought about the CSS sprites tip. Definitely worth going through this checklist for bigger sites. Cheers

Jacob Gube

December 24th, 2008

@SC: I’ve always wanted to try that, but I never get a chance to play around with it. For example, I want to see the effectiveness of having a “subscribe to RSS” link at the bottom of each post. Right now, I just track the click-through and compare it to the main RSS subscription button on top.

@Simon Vallee: Amazon’s doing a lot to push affordable content delivery and storage. S3’s enough for me, I just need Six Revisions to be stable when there’s burst traffic without sacrificing images in posts or the ability to comment.

@Brian: I think you’re absolutely right. One of the main issues I have with WordPress is that it doesn’t do enough when it comes to database optimization and database access. I’ve hacked and slashed a few core files to reduce the amount of resources WP requires.

@John Rockefeller: Yeah, there’s the trade-off to offloading: DNS resolution. Six Revisions gets penalized on YSlow because of all the re-directs it does. With Gravatars in the comments section and offloaded images, there can be as many as 300 objects sourced from another URL in just one article. And this is where experience and benchmarking come in handy. In the case of Six Revisions, offloading site features works perfectly. I had to learn the hard way: crashing and burning, then re-tweaking until you get the set-up just right. Five servers and a lot of research and live testing later, I finally got to that magical sweet spot (“site setup nirvana” as I call it), and now everything just runs itself. Front-end is key, at least in my situation. Database and web server optimization are much more important in large-scale rich internet apps where a user hits the server a ton of times per page and user action. On SR, you’ll only access the database when you leave a comment (since these pages are cached).

Farid Hadi

December 27th, 2008

Great topic to write about, Jacob. I posted an article on the same topic back in August.

It’s probably good to let people know that when a browser (all but IE8) is downloading an external JavaScript file, it won’t download anything else until it’s done. So it’s a good idea to keep your JS files as few and as small as possible.

Felicity

December 27th, 2008

Some great tips here, and something for me as a front-end developer to bear in mind.
It doesn’t matter if my beautifully commented CSS & JS works perfectly in the prototype stage if it falls over on the live server because it is too heavy.

WebdesignerDepot

January 7th, 2009

I loved #5 and implemented it on my site after reading it here. I first saw this being used by Apple.com on their menu and always meant to implement it. Thanks for this one Jacob :)

Jacob Gube

January 8th, 2009

@WebdesignerDepot: Hey Walter, how’s it going? Glad you found this article useful – it’s just a collection of things I learned while trying to optimize Six Revisions.

ROW

January 12th, 2009

Those were some good points.

Regarding #8, I don’t understand how Feedburner can help reduce the load on my web server vis-à-vis my web host. Probably I don’t understand RSS too well. My understanding was that RSS is a way to pull and syndicate data from your blog/site and make it compatible with feed readers. How it affects the server, I’m unsure.

Would you mind explaining in a bit more detail?

Thanks!

Jacob Gube

January 12th, 2009

@ROW: Two things:
1) If you host your own RSS feed, every time someone reads it through their RSS reader, it hits your server. If you used Feedburner, it would hit their server.

2) Hosting your own RSS feed requires server-side processing (PHP, for example) when you update content – which takes up system resources that you can outsource elsewhere (Feedburner).

Hope that helps a bit, I should have explained that a bit more in the article, but I was just using it as an example.

Amy

January 12th, 2009

Thanks for the tip on the Sprite generator. Balance between appearance and functionality is always an issue and this was really a helpful tool in that regard.

Dave

January 12th, 2009

Jacob,
Good Info. I learned something from #7. I had never thought about/realized the caching difference with separate files.

ROW

January 12th, 2009

Thanks, Jacob, for the nice explanation. Yeah, I can understand a detailed explanation was not required in the post.

By the way, I would suggest you use a “Subscribe to Comments” WordPress plugin, so that a commenter doesn’t have to come back to the site to check for a response.

Thanks!

Jacob Gube

January 13th, 2009

@Dave: Yes. People always say, “never use inline JS or CSS”. The reason they usually give is that it’s a best practice for separating the structure (HTML) and presentational layer (JS/CSS). But me being the pragmatic developer – “because they said so” isn’t good enough. I would put all my styles and JS inline if it meant that I’d have a better-performing page and that I’d have an easier time maintaining a project. But because of practical and quantifiable reasons: (a) the fact that inline styles/JS makes it hard to maintain a website and (b) that there’s a slight hit on the performance of primed caches vs. external JS/CSS, you’ll find my JS/CSS in external files.

@ROW: I’ve been working really hard on coming up with solutions that will make the user experience on Six Revisions an enjoyable and easy one. I’m specifically working on better findability and a better way to hold discussions in the comments section. I want Six Revisions to be a two-way street, where people discuss and reply to each other, as well as keep track of ongoing discussions.

I’ve considered the “subscribe to comments” feature for a while now; the problem I see with existing solutions is that they barrage your inbox with a ton of emails (you get an email per comment made). Therefore the solution I’m envisioning is one where you can selectively keep track of certain conversations and limit yourself to that particular conversation only. So what needs to happen is a complete restructuring of the comments section to a more “forum-style” section. Parent comments will give you the option to subscribe to them for replies. You can check off as many parent comments as you want to subscribe to, or just subscribe to every comment update.

As I see it, this is something I’ll have to develop myself.

In short, I welcome any suggestions on how I can make Six Revisions a more “conversation-friendly” place to hang out in. Feel free to send me your responses via email: jacob sixrevisions.com

Stuart Colville

January 20th, 2009

Hi Jacob, the CSS Sprite Generator is here: http://spritegen.website-performance.org/ assuming that’s the one you’re talking about.

stamper chik

January 23rd, 2009

re: #2, the program used for compression may also affect the size and quality of the file.

I’ve used Photoshop and Illustrator in many permutations to create a small, good-looking .gif, but haven’t found anything that works as well as Macromedia Fireworks (now Adobe Fireworks), with export settings Optimize > GIF WebSnap 128 > Web Adaptive .gif.

I was able to cut graphics files down to 25% of what they were with an indistinguishable quality difference (except perhaps to some old-school Win95 users at garbage resolutions).

Jyoti

March 1st, 2009

Thanks for the info on the scripts.

Albert Peterson

April 14th, 2009

Thank you for the information.

Yogi

May 28th, 2009

It’s good. Thanks for posting.

jay

June 24th, 2009

It’s good. Thanks a lot and keep it up.

Daniel

December 24th, 2009

Hi Jacob, great information.

See also Tips for Improving Website Performance

Thanks

verizon saga

April 19th, 2010

These are great tips to increase web performance.

Google also factors page performance into web rankings in the SERPs.

Craig

September 9th, 2010

I definitely need to try out the sprite generator to make some reductions. Thanks for all the tips!

Düzce

September 24th, 2010

It’s good. Thanks a lot and keep it up.

kaifi Ansari

January 6th, 2011

Thanks for sharing this kind of knowledge.

Vinit Pratap Singh

March 28th, 2011

It works, thanks!

Ajit

March 28th, 2011

Very nice article.

William

April 1st, 2011

Hey, I like the “CSS sprites” stuff. I’d never heard of that, although I have used CSS positioning of a background image for rollover effects, and CSS sprites look like a variation of that. But the idea of combining various little images into a single image and then just showing parts here and there to reduce HTTP requests… cool, cool, cool!

John

June 28th, 2011

Hi everyone, thanks for the information, and that’s a nice website.

Diogo Melo

July 20th, 2011

Sometimes you reference objects in CSS or JS files. You can “pre-load” those objects to reduce the time needed to render your page. It’s better explained here: http://diogomelo.net/blog/11/improve-page-load-time-performance-pre-loading-images
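
A common form of that technique is a couple of lines of JavaScript (the file name here is hypothetical):

// Create an Image object so the browser fetches and caches the file
// before the page actually needs it (a hover-state background, say).
var img = new Image();
img.src = '/images/hover-state.png';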

Alex Silva

October 5th, 2011

The information contained here is very interesting. I am a newbie in SEO. I can try some of your tips, but the others seem to be only for pros. I don’t know anything about code.
