Speeding Up My Website


One of the biggest goals with my website redesign was to improve something no one sees -- website speed. Sure, I'm using different colors, fonts, JavaScript techniques, and images, but I'm most proud of the speed increases I made. Let me explain the ways I've improved my website's performance -- maybe you can learn something from what I implemented.

Image Optimization - PNG Crush

Unlike content pages, which vary in size, images are a fixed download, so I set out to optimize them. I used PNGCrush to compress my images without any loss of quality, knocking off about 120KB of useless image weight.
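
If you want to crush a whole directory at once, here's a rough PHP sketch that shells out to pngcrush -- not my exact workflow, and it assumes the pngcrush binary is installed and on your PATH; the paths below are placeholders:

<?php
// Rough sketch: run pngcrush over every PNG in a directory and keep the
// smaller file. Assumes pngcrush is installed and on the PATH; the
// directory path is a placeholder.
$source = __DIR__ . '/wp-content/uploads';
$saved  = 0;

$pngs = glob($source . '/*.png');
if (!$pngs) {
    exit("No PNG files found\n");
}

foreach ($pngs as $png) {
    $tmp = $png . '.crushed';
    // -rem alla strips unneeded chunks, -brute tries every compression method
    exec(sprintf('pngcrush -rem alla -brute %s %s',
        escapeshellarg($png), escapeshellarg($tmp)));

    if (is_file($tmp) && filesize($tmp) < filesize($png)) {
        $saved += filesize($png) - filesize($tmp);
        rename($tmp, $png);   // keep the smaller version
    } elseif (is_file($tmp)) {
        unlink($tmp);         // crushed copy wasn't smaller, discard it
    }
}

echo 'Saved ' . round($saved / 1024) . "KB\n";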

Plugin Removal

I wasn't happy about doing it, but I removed the WP-Polls plugin. Removing it means I no longer load both jQuery and MooTools on every page, and it also spares a few server requests (for the plugin's CSS and JS files). I saved approximately 75KB with this change alone.

JavaScript Compression and Consolidation

I've placed all of my JavaScript into one file: MooTools More classes and my custom plugins were all compressed and jammed together. This means fewer requests to the server, and compression alone saved about 40KB.
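
The consolidation part is easy to sketch in PHP: concatenate the files and write a gzipped copy for the server to hand out. The file names below are hypothetical and this isn't my actual build step -- a real one would also run the result through a JavaScript minifier first:

<?php
// Minimal sketch: concatenate several JavaScript files into one, then
// write a gzipped copy alongside it for servers that can serve
// precompressed assets. File names here are hypothetical.
$scripts = array(
    'js/mootools-more.js',
    'js/custom-plugins.js',
);

$combined = '';
foreach ($scripts as $script) {
    $combined .= file_get_contents($script) . ";\n"; // guard against missing semicolons
}

file_put_contents('js/site.js', $combined);
file_put_contents('js/site.js.gz', gzencode($combined, 9));

echo 'Combined: ' . round(strlen($combined) / 1024) . 'KB, ';
echo 'gzipped: ' . round(filesize('js/site.js.gz') / 1024) . "KB\n";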

Google AJAX API

Instead of bogging my virtual server down by serving up the same MooTools Core JavaScript file, I've started letting Google carry that burden via their AJAX Libraries API. What's great about using their API is that if you've been to another website that uses it, the file is already cached in your browser and my site loads faster.
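
In a WordPress theme, the switch can look something like this sketch in functions.php -- the script handle and the version in the URL are assumptions, not necessarily what I use:

<?php
// Sketch for a theme's functions.php: serve MooTools Core from Google's
// AJAX Libraries API instead of a local copy. The handle name and the
// version in the URL are assumptions -- adjust to match your setup.
function use_google_hosted_mootools() {
    wp_deregister_script('mootools'); // drop any locally registered copy
    wp_register_script(
        'mootools',
        'http://ajax.googleapis.com/ajax/libs/mootools/1.2.4/mootools-yui-compressed.js',
        array(),
        '1.2.4'
    );
    wp_enqueue_script('mootools');
}
add_action('wp_enqueue_scripts', 'use_google_hosted_mootools');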

.htaccess ETags and Compression

I took full advantage of Eric Wendelin's guest post, Improve Your YSlow Grade Using .htaccess, and used .htaccess to add Expires headers and file compression, decreasing load times and enabling client-side caching.
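
If your host won't let you edit .htaccess, a rough PHP-side equivalent for PHP-served pages looks like the sketch below. This is an alternative approach, not the directives from Eric's post, and it leaves the ETag side to the web server:

<?php
// Sketch: far-future Expires header plus gzip output compression for a
// PHP-served page, for hosts where .htaccess isn't an option. Apache's
// mod_expires / mod_deflate are still the better route when available.
ob_start('ob_gzhandler'); // gzip the output if the browser supports it

header('Cache-Control: public, max-age=604800');                          // one week
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT');

echo 'Hello, cached and compressed world!';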

PHP Caching

You'll notice that my new design displays my RSS subscriber count, Twitter follower count, and latest tweet. I've added caching so that the RSS subscriber and Twitter follower counts are only checked once per day and my latest tweet once per hour. These caching strategies save me remote requests on every page load.
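
The idea is simply to stash each remote response in a file and only re-fetch once it's stale. Here's a minimal sketch -- the helper name, cache paths, and endpoints are placeholders, not my actual code:

<?php
// Minimal file-based cache for remote requests: re-fetch only when the
// cached copy is older than $ttl seconds. Helper name, cache files, and
// the endpoints below are placeholders for illustration.
function cached_remote_get($url, $cacheFile, $ttl) {
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);   // still fresh: use the cache
    }

    $response = file_get_contents($url);        // stale or missing: hit the remote API
    if ($response !== false) {
        file_put_contents($cacheFile, $response);
        return $response;
    }

    // Remote request failed: fall back to the stale copy if we have one
    return is_file($cacheFile) ? file_get_contents($cacheFile) : '';
}

// Latest tweet: cache for an hour; follower/subscriber counts: for a day
$tweet  = cached_remote_get('http://example.com/latest-tweet.json', '/tmp/tweet.cache', 3600);
$counts = cached_remote_get('http://example.com/follower-count.json', '/tmp/counts.cache', 86400);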

Database Query Caching

I've implemented WordPress' DB Cache plugin, which caches query results and has proven more effective for me than WP-Cache and WP-SuperCache. I was having a lot of problems with WP-Cache, so this plugin was a godsend -- the site feels faster and I don't run into the problems I had with other caching methods.
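
If you only need to cache a query or two and a full plugin feels like overkill, the same idea can be sketched by hand with the WordPress Transients API. The query and cache key below are made up for illustration; DB Cache applies this idea generically to every query:

<?php
// Sketch: cache an expensive query result with the WordPress Transients
// API. The query and transient key are made up for illustration.
function get_popular_posts() {
    $posts = get_transient('popular_posts_cache');

    if ($posts === false) {               // nothing cached, or it expired
        global $wpdb;
        $posts = $wpdb->get_results(
            "SELECT ID, post_title FROM {$wpdb->posts}
             WHERE post_status = 'publish'
             ORDER BY comment_count DESC
             LIMIT 5"
        );
        set_transient('popular_posts_cache', $posts, 3600); // cache for an hour
    }

    return $posts;
}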

Room for Improvement

My website is not yet perfect -- there are a few things I can still improve:

  • CSS Sprites -- Since I've just launched the redesign, I didn't want to go into spriting yet in case I need to overhaul things. Sprites will save me quite a few server requests.
  • Media Temple Issues -- Unfortunately I get a "Cannot Establish Connection" error every once in a while, so I need to find out what's causing it. I never got those warnings on Dreamhost hosting.
  • BuySellAds -- I need to twist BSA's arm into Gzipping their resources to speed up downloading of those files. BSA implementing Gzipping would help out thousands of websites.

Have any other ideas for optimization? Let me know!

Discussion

  1. Jeremy

    Hi David,
    It’s not about optimization but a detail, actually… when you click on a link or a button (to share on a social network, for example), the outline appears, and to me it looks much better without it.

    In all my work I remove it with the line ” * {outline:none;} ” in the CSS.

    But maybe you don’t do that because I suppose it’s not recommended, due to the loss of accessibility…

    Anyway your blog is just great, thanks for sharing your knowledge ! :)

  2. @Jeremy: Ummmmmm, thanks for the suggestion, but removing the outline is an accessibility issue so I won’t do it.

  3. Darkimmortal

    Memcache, Nginx+FastCGI, InnoDB and tweak your my.ini.

  4. Jeremy

    Sure, you’re right to improve accessibility.

    By the way, for optimization, what do you think about removing line breaks from the CSS?

  5. I am surprised by your results with pngcrush. I usually see a difference of only 3 or 4%, which isn’t enough to be worth the trouble…

  6. @Cedric Dugas: PNGCrush was an absolute beast.

  7. Everyone: Daniel Tamas sent me a link to an outstanding little image optimization app that will crush your images for you:

    http://pornel.net/imageoptim/en

  8. I’ve had far better png compression results from punypng in comparison to smush.it and pngcrush, well worth a look: http://www.gracepointafterfive.com/punypng

  9. You could probably hack out several of the ‘social media’ badges by creating your own server-side solution.

    These might be the easiest ones to start with:

    /links/scripts/prototype.js.h-1594678674.pack 12 KB (53 KB uncompressed)
    /links/combined.js.h1853910243.pack 11 KB (37 KB uncompressed)
    /links/dwr/interface/LinkManager.js 3 KB
    /links/widgets/zoneit.js 818 bytes
    /feeds/json/url/data?url=http://davidwalsh.name/speeding-website&callback=delic 182 bytes

    (I used FF webdev toolbar to get this)

    You might also throw some of those small icons in a sprite (digg, delicious scriptstyle etc)

    I’m mostly a designer so that’s the first thing I noticed when I saw your redesign. I think you did an excellent job with the branding. The site still retains a lot of the ‘feel’ your last version had but with a fresh coat of paint.

  10. Cerium

    Just some ideas to speed up the website again:

    Compress the CSS (like the JS).
    Put the script tags at the end of the body (JS is not the most important thing, even on a blog that talks a lot about it; content matters more), so the scripts are downloaded after the content and the content loads faster.
    Remove comments from the HTML (I’m thinking of the ASCII one and the RDF one). Not a big optimisation, but it represents some bytes!

  11. I also implemented Google’s AJAX API hosting recently across all my sites… it saves quite a bit of bandwidth…

    Thanks for the ETags link, I’ve always wanted to know how to do it…

  12. Darkimmortal

    About a quarter of the roughly 4.5s page load time is the HTML itself (1.1s).

    Switch to/implement the changes I mentioned above and you should be able to take that down to 200ms (mostly opcode cache overhead and MySQL connection and query time).

    You’re barking up the wrong tree with the script and image changes – nothing short of removing a lot of them or cutting down on requests will make a significant difference. Download times (sizes) are not the issue here – it is the request overhead, for which your webserver (consider moving to Nginx+FastCGI) and the fact that quite a few cookies are being sent with each request are to blame. You should move your resources to another domain which has no cookies set on it.

  13. As suggested, Memcache. Or use an opcode cache such as APC or XCache. Then once that is in place, you can cache your DB results for things like comments or blog lists. I “think” XCache has a WP plugin that hooks into the major areas; however, we use APC on our server and in our framework. It’s amazing how much load you can save when you aren’t hitting the DB every time the page loads.

  14. @Ryan: Here is some more reading on APC’s functions for storing variables (or in this case a DB object containing the result set)

    http://us.php.net/manual/en/function.apc-store.php

  15. PHPExpress has been tested to be faster than XCache, eAccelerator and APC:

    http://forum.nusphere.com/viewtopic.php?t=4453

  16. Thanks for the mention of WordPress’ DB Cache. I’ve been thinking of implementing caching in my blog eventually, so I’ll give this one a look.

  17. @Darkimmortal: I’d have to see some more tests run. Based on that benchmark it is only .02 faster than APC with the Zend extension, and there are no other details about the machine specs or ini settings. These days most of the opcode caches are pretty much the same… I just prefer to run the one that Rasmus himself works on.

    And nowhere do I see where you can store runtime variables in the cache, which was the point of my comment and is where you will see other speedups in your apps. The point was not just to run an opcode cache (because everyone should), but to use one that allows you to store objects and variables in the cache.

  18. I noticed my blog slowing down recently too. I wasn’t so daring as to compress the JavaScript, so instead I compressed my theme’s style.css. It went from 13KB to 8KB, I believe.

    WordPress should have some mechanism for automatically compressing and caching CSS and JavaScript.

  19. I think Google AJAX API (or any other library hosters) has its advantages and disadvantages. If you don’t update your local scripts according to the new versions of the library, your site may malfunction when the hoster replaces the older version of the library with the new one.

    It’s not a problem for a site like this. But if you are working on a client project then you probably would like to host the scripts within the server where the client’s application is hosted.

  20. @Ryan Rampersad: I’d be shocked if that made much difference. WordPress should definitely have a mechanism to compress and consolidate JS and CSS though.

    @Can Berkol: Agreed.

  21. @Nick Pack: +1 to punypng. I get about a 15% improvement from PNGCrush (!)

    @Walsh: You can certainly use PHP to combine and compress all the JS and CSS, but I find it causes more trouble than it’s worth.

  22. Hello David,
    I’m just in the middle of doing the same chores as you mentioned in this post.
    For consolidating JavaScript and CSS files I’ve just started testing this plugin:
    Plugin Name: PHP Speedy WP
    Plugin URI: http://aciddrop.com

    It caches the resulting consolidated files and offers some decent options, such as exclusion.

  23. I use the wordpress plugins “Autoptimize” and “cSprites for WordPress” on my blog and they both have worked well. Thanks for the tips.

  24. @Eric

    Curious to know what troubles you’ve seen when compressing and delivering compressed versions of CSS and JS via PHP. We’ve been using a modified version of the script called Combine (http://rakaz.nl/code/combine) for a while across a number of users and browsers with no issues.

    It basically grabs all your JS files, combines them, Gzips it and caches the combined file on the server until the script sees a change in the files then it refreshes the cache. Works like a champ.

    I think the issue with a WP solution is that all plugins hook their CSS and JS differently.

  25. @Ryan: That’s one of the things. The other is that I try to maximize the use of client-side caching and the PHP solution prevents that in almost all cases. (Though if someone has a solution for that it’d rock)

  26. Messing with CSS, JS and images is only going to help dial-up users tbh.

  27. @Eric
    That Combine script makes use of ETag headers to use locally cached versions if nothing has changed. We had to tweak it and add an Expires: 0 header to get it to work properly on our server. All modern browsers (including IE6) support ETags, and when used right they work perfectly. There is a slight overhead in checking with the server whether anything has changed, but it’s very minimal, especially if you don’t have the overhead of frameworks, etc. If nothing has changed it sends back a quick 304 response and the browser uses the cached version. We’re talking small ms times.

    And sure… using locally cached versions with a 2-day expire time will be faster, but then you have to deal with how to invalidate everything when you modify your CSS file slightly. ETag headers resolve all that, and for the life of me I can’t figure out why they aren’t used more.

    @Darkimmortal
    suuuuuuuure it does….

  28. @David Walsh: Thaya Kareeson (Omninoggin.com) has a WP plugin that can combine/compress JavaScript.

    “I need to twist BSA’s arm into Gzipping their resources to speed up downloading of those files. BSA implementing Gzipping would help out thousands of websites.”

    Please do so. It does seem to take awhile to load sometimes. Gzip’ing it would be a big help. :)

  29. @Ryan

    I’m talking from real-world experience. As long as your resources are under 300KB or so and are cacheable, then the server, cache and database are what you should be looking at. In the case of this blog especially, more than 1.2s goes to the actual page, which could be cut down to under 500ms.

    Stuff like GZipping, CSS sprites, concatenated and minified scripts are still useful though in extreme circumstances or if you target users with slow connections.

  30. Updating a design is not in itself a reason to enhance your website speed. What was the original reason? Bandwidth cost? Loading time? Just for fun? An eco badge? Please tell us more in the introduction.

    You should mention earlier in your post that you’re talking about WordPress and about a personal blog. PHP is great, but for maximum performance, again for a blog, from my point of view there are no secrets: buy your own host, use a CMS which publishes static HTML files, use dynamic languages only for specific things (e.g. contact form data), and work closely with a team of engineers to customize the whole setup.

    I think WordPress won’t provide JS or CSS compression tools -- not because they don’t want to, but for market / business reasons. It’s a kind of blog for everyone ‘from 7 to 77 years old’. If tomorrow WordPress required a hundred things -- at least gzip, curl, ImageMagick or some other extras -- I don’t think it would be welcomed. WordPress uses add-ons, so the community’s answer will probably be an add-on for servers which support gzip, for example.

    If you like the Yahoo! tools, you should take a look at “SpriteMe” from Steve Souders – http://spriteme.org. This Google tech talk, http://www.youtube.com/watch?v=pNfRL-TwzZY, introduces Browserscope & SpriteMe. You will like it.

    I read you on the *Nuts sites, but there is a lack of technical reviewers there. The audience ranges from beginners to experts. Please ask your trusted colleagues / friends to review your articles. Surely next time you will update lots of things you didn’t think about. If you already do that, well, keep it going :)

    Last feedback: most of the time you like linking internally to your own posts. Why? SEO? Please provide clear links to each tool’s original website (pngcrush, etc…), and additional links to tutorials are welcome too, as well as a few lines about the license, please: you’re talking about open source. It seems your reputation is growing on the internet, so keep your blog the smart way :D Nobody said evangelizing in a public place is easy.

    Cheers and thanks ! ! !
