There are lots of excellent tools to help web developers optimise their websites.

Here are two simple things you have no excuse for overlooking on your next project.

HTML, XML, JavaScript and CSS files

One of the easiest ways to speed up a website (often to a surprising degree) is to turn on compression of text-based content through facilities like Apache's mod_deflate or nginx's gzip module.

Here's the Apache configuration file I normally use:

# Compress the common text-based content types
AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript text/xml application/x-javascript application/javascript
# Netscape 4.x only copes with compressed HTML
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Don't compress anything for MSIE 6
BrowserMatch "MSIE 6" no-gzip dont-vary
# Netscape 4.06-4.08 can't handle compression at all
BrowserMatch ^Mozilla/4\.0[678] no-gzip
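
If you are on nginx instead, the rough equivalent using its gzip module looks something like the sketch below. The directives are the standard ones; the minimum length is only an illustrative value, and nginx always compresses text/html when gzip is on, so it doesn't need listing in gzip_types:

# Enable on-the-fly compression for text-based responses
gzip on;
gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/javascript;
# Skip very small responses where compression doesn't pay off
gzip_min_length 256;
# Let caches know the response varies by Accept-Encoding
gzip_vary on;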

Images

As far as images go, the following tools will reduce file sizes through lossless compression (i.e. with no visual changes at all):

  • gifsicle -O2 -b image.gif
  • jpegoptim -p --strip-all image.jpg
  • optipng -o7 -q image.png

(An alternative to optipng is pngcrush.)

Note that the --strip-all argument to jpegoptim will remove any EXIF or comment tags that may be present.
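
If there is a whole tree of images to process, a small find loop along these lines batches the work. This is just a sketch: it assumes the three tools above are installed and on your PATH, and the filename patterns only cover the obvious extensions:

find . -name '*.gif' -exec gifsicle -O2 -b {} \;
find . -name '*.jpg' -exec jpegoptim -p --strip-all {} \;
find . -name '*.png' -exec optipng -o7 -q {} \;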

I've wondered before: why not make this work the other way around, keeping compressed images on the website's hard drive and expanding them only when a browser can't handle it?

I'm sure this just shows my lack of intelligence in this area, though...
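
(If the idea is to keep only pre-compressed copies on disk and expand them just for clients that can't accept gzip, nginx can be pointed in that direction with its gzip_static and gunzip modules. The sketch below assumes both modules are compiled in and that a .gz sibling exists for each file; the filename is only an example.)

# Always serve style.css.gz (etc.) straight from disk when it exists
gzip_static always;
# ...and decompress on the fly only for clients that don't advertise gzip support
gunzip on;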

Comment by jimcooncat

Try jpegtran instead of jpegoptim:

jpegtran -progressive -optimize -copy none orig.jpg > jpegtran.jpg

Comment by paulox

jpegtran++

$ sudo aptitude install libjpeg-progs

Comment by don

Another item to check (Firebug or the Web Developer Extension is useful here) is the HTTP Expires: header. Setting a long "Expires" time helps minimize traffic from repeat visits: more info
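
For example, with Apache's mod_expires (assuming the module is enabled) lifetimes can be set per content type. The types and durations below are only illustrative; pick values that match how often each kind of file really changes:

ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
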
Comment by Don Marti