google.load(): Utilize Google’s AJAX Libraries API


The problem: loads of websites around the internet use the exact same JavaScript file. The file is a whopping 100KB in size. Since a copy of this file resides on each website's server, it gets downloaded and cached separately for each individual website. That's a lot of load time for the same file.

The answer: Google's AJAX Libraries API. Google hosts these frequently used files, including the newest (and legacy) versions of jQuery, MooTools, YUI, Dojo, Prototype, and more. Why use Google's AJAX Libraries API? Benefits include:

  • Google's servers can serve the file faster than your shared hosting server.
  • Since the file is always being pulled from the same place, the more sites that use that file, the more likely that file is already in the user's cache. Thus, your website loads faster.
  • You save bandwidth.

Here's how you implement google.load().

The JavaScript

	//Google's loader script must be included on the page first:
	//<script src="http://www.google.com/jsapi"></script>
	
	//get the latest moo
	google.load('mootools', '1.2.1');
	
	//other examples
	google.load('jquery', '1.3.1');
	google.load('jqueryui', '1.5.3');
	google.load('prototype', '1.6.0.3');
	google.load('scriptaculous', '1.8.2');
	google.load('dojo', '1.2.3');
	google.load('swfobject', '2.1');
	google.load('yui', '2.6.0');

That's all there is to it. Pass Google the library and version you desire and Google does the rest.
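
Since google.load() fetches the library asynchronously, code that depends on it should run inside google.setOnLoadCallback(), which is part of the same loader API. Here's a minimal page sketch (the element ID "myElement" and the style change are just illustrations, not part of the API):

	<script src="http://www.google.com/jsapi"></script>
	<script>
		//request MooTools from Google's servers
		google.load('mootools', '1.2.1');
		
		//run this function once the library has finished loading
		google.setOnLoadCallback(function() {
			window.addEvent('domready', function() {
				$('myElement').setStyle('color', '#090');
			});
		});
	</script>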

Discussion

  1. I’ve loaded individual files from Google before for this same reason, but I didn’t realise that they had an API to do all the work! I’ll start rolling this out from now on!

  2. I’ve been doing this with jQuery for a while, purely because a screencast on NetTuts showed me where to get it from.

    I didn’t know Google hosted these other JS libraries, but I’ll use them now!

  3. Aaron Dixon

    For MooTools, does this load both Core and More? I am guessing that it does, but I can’t find where it explicitly states it.

  4. Aaron Dixon

    I will follow up on my question. google.load('mootools','1.2.1'); only loads MooTools Core. You still need to reference More on your own. I am guessing that is how it should be, since More contains mostly plugin-type stuff?

  5. @Aaron Dixon: Agreed.

  6. Nice! I have started using this as well. Which do you prefer: using google.load() or linking directly to the JS file?

  7. I actually prefer to link to the JS file, although I can’t give you a solid reason why.

  8. Talv

    When this was announced, Google didn’t have one running over SSL, meaning you’d get a “some parts of this page are unencrypted” message in certain browsers. Any idea if this is fixed now, or is it only an issue for those of us who have projects running over HTTP? And I too prefer the direct link!

  9. Well, as much as I love being able to fetch the libs from Google’s servers, I still wish they would let us upload our own JS files and call them with google.load();

  10. Fabian Beiner

    I’m still not sure which choice is the better one: either use the libraries from Google’s server, or host them myself and include them through some “shrink.php” script, which shrinks the file size by compressing it with gzip. (For example: shrink.php?files=mootools-1.2.1-core,mootools-1.2-more,default, which is around 68KB uncompressed, gets compressed down to about 20KB total.) What do you think?

  11. Sending them through a “shrink” functionality every time would not only fail in relieving all of the items above, but it would hurt response time as it would put more stress on the server. Avoid that at all costs.

  12. @Brenelz and @david: I have debated both options myself — should I use google.load() or just link straight to the file? — and I have come to realise that it is better to just link straight to the file. The main reason is that if you use google.load() you need to include the Google API JavaScript (1 connection), then run a function to load another file (another connection), and then run your scripts. That’s more or less the process, but it means that you will have a delay on your ‘domready’ functions.
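
    For comparison, the direct-link approach described above is a single script tag pointing straight at the hosted file; the path below follows Google's documented URL pattern for its hosted libraries:

	<script src="http://ajax.googleapis.com/ajax/libs/mootools/1.2.1/mootools-yui-compressed.js"></script>

    One request, no loader script, and your ‘domready’ functions fire as soon as the DOM and the script are ready.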

  13. Great points Anton!

  14. @david and @Fabian I’m not sure what Fabian is using but the shrink ‘service’ (it’s really more of a php script) I’m using caches the combined files as well as minifying them. As a result yes the first request puts a little extra strain on the server but everything afterwards is serving just one file from disk like any other include. Obviously the disadvantage (in context) is the lack of Google’s outstanding CDN.

    @Anton the primary reason to use the API is if you want the latest version by calling google.load(‘mootools’) regardless of what version it is. If you’re specifying a version in the load function it does seem silly though I agree. Also the generic plus one about having to include google’s api script.

    Also a couple side notes.

    Your underwear is showing – give a body background to put your pants on (underwear trick presented by Jeffrey Zeldman)
    Your author comments appear to be missing the little talk bubble and I can’t tell if it is intentional or not but it looks like not.
    Your ‘latest tweet’ thing in the footer has a cursor: pointer even though there is no tweet to link to (server currently down I guess)

  15. One last thing I guess, your preview box isn’t quite accurate because ordered lists are getting butchered out. :)

  16. Hartmut

    Hi David,

    so why don’t you use this technique?!

    Greets
    Hartmut

  17. Tom

    I’ve been thinking of using this before, but haven’t implemented it yet. One of the reasons is that you have to load MooTools More from your own server anyhow. So my technique to date has been to join both “core” and “more” into one file and minify it. This works OK, as that file doesn’t really ever change.

    But I would start using google.load (or rather link directly to the files on their servers) if it were possible to load the “more” package as well. Seeing how jQuery UI and script.aculo.us are in the repository, MooTools More should be too.

  18. V1

    Yes, it’s a nice technique, but I still prefer to host the files myself; it gives me control over my files, and if there is something wrong, it’s MY own fault. I had a lot of problems with Google Analytics (it took a long time to LOAD the files — more than one second — and I can’t accept that behavior on my sites, especially if you depend on these files).

    Another reason why I dislike this technique: it will cost you one extra HTTP request for the Google loader that fetches the file for you, and an extra DNS lookup because you link to Google. So there are some side effects to using this technique. If you do not care about this, sure, go ahead and use it, but it’s not that hard to apply cache headers yourself.

  19. @Bryan J Swift: I do agree, it is a cool technique to get the latest version of the framework. However, I prefer to keep control of the framework versions just in case anything big changes.

  20. We have begun using a script called Combine that we rolled into our framework. It combines all the files, gzips them, and stores the result on your server. Then it uses some nice caching that caches the file locally as well and forces a new version if you’ve changed anything. I also added in a whitespace-stripping function, but it didn’t save much since gzip does a lot of that itself.

    Here is a post I did on it a while back:
    http://blueprint.intereactive.net/compressing-javascript-part-1/

    I have still yet to get the posts written that show the benchmarks.

  21. I think this solution is good if you have restrictive data transfer limits on your site (shared hosting, …), but as @Anton says, I prefer having these files under my own control.

    It doesn’t mean that I’m afraid of Google, but there is no guarantee that some day the file won’t contain other content than you expected :) When hosting these files on your own host, there is minimal chance of that file transfer breaking.

    And about the size and caching: there is JSMin, the YUI Compressor, and so on, so in the case of MooTools it’s about 100KB for the full stack (Core and More) minified by the YUI Compressor.

    I applaud Google for this initiative, and I’m sure that there are a lot of people who will use it, but I haven’t seen enough pros in this solution for myself.

  22. Cloud computing is the future, as David mentioned at the start — the same file is served to everyone! And if you’re wondering whether the old versions remain on the servers, the answer is YES! So you don’t need to manually update the direct link to the lib.

  23. Sending them through a “shrink” functionality every time would not only fail in relieving all of the items above, but it would hurt response time as it would put more stress on the server. Avoid that at all costs.

    That’s true, but not if you implement a caching solution…then it’ll be much better. Ed Elliot has a nice post about this topic, and I use a solution based on some of his ideas, which works really great.

  24. @V1:

    If you read the google pagespeed docs (http://code.google.com/speed/page-speed/docs/rtt.html) about parallelizing downloads you might actually be saving a little time. Plus the Google server’s IP will likely be cached by any ISP and even better, Google’s hosted script will likely be cached by the browser more often than yours will, so the DNS lookup and extra request most likely won’t actually happen.

  25. Thomas

    Wouldn’t it be better to just use Google’s Closure Compiler (http://code.google.com/closure/compiler/)?

  26. I recently started using google.load() for jQuery and jQuery UI. I think it is cool that they offer this.
