Friday, July 1, 2011

Writing useful JavaScript applications in less than half the size of jQuery

Not too long ago, I tried to bring attention to how little of the jQuery library many developers actually use, and argued that frontend developers should consider what sort of improvements their users would see if they compiled their code with the Advanced mode of the Closure Compiler. Today I would like to further that argument by taking a look at TargetAlert, the browser extension I re-released this week for Chrome.

TargetAlert is built with Closure and plovr, using the template I created for developing a Chrome extension. The packaged version of the extension includes three JavaScript files:

Name            Size (bytes)  Description
targetalert.js  19475         content script that runs on every page
options.js      19569         logic for the TargetAlert options page
targetalert.js  3590          background page that channels information from the options to the content script
Total           42634
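Each of these files is produced by a plovr build. As a rough sketch of what that looks like (the file names and id here are illustrative, not the actual TargetAlert config), a plovr configuration that compiles the content script in Advanced mode is just a small JSON file:

```json
{
  "id": "targetalert-content",
  "paths": ".",
  "inputs": "targetalert.js",
  "mode": "ADVANCED",
  "level": "VERBOSE"
}
```

Running `java -jar plovr.jar build config.js` with a config like this resolves the goog.require() dependencies and emits a single compiled file.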

By comparison, the minified version of jQuery 1.6 is 91342 bytes, more than twice the size of all of the code for TargetAlert. (Gzipped, the sizes are 14488 vs. 31953 bytes, so jQuery is still more than twice as large.)

And to put things in perspective, here is the set of goog.require() statements that appear in TargetAlert code, which reflects the extent of its dependencies:
goog.require('goog.array');
goog.require('goog.dispose');
goog.require('goog.dom');
goog.require('goog.dom.NodeIterator');
goog.require('goog.dom.NodeType');
goog.require('goog.dom.TagName');
goog.require('goog.events');
goog.require('goog.events.EventType');
goog.require('goog.string');
goog.require('goog.ui.Component');
goog.require('goog.Uri');
goog.require('soy');
I include this to demonstrate that there was no effort to re-implement parts of the Closure Library in order to save bytes. On the contrary, one of the primary reasons to use Closure is that you can write code in a natural, more readable way (which may be slightly more verbose), and make the Compiler responsible for minification. Although competitions like JS1K are fun and I'm amazed to see how small JS can get when it is hand-optimized, even the first winner of JS1K, Marijn Haverbeke, admits, "In terms of productivity, this is an awful way of coding."
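To make the tradeoff concrete, here is a sketch of that "natural, more readable" style; the namespace and function are illustrative, not actual TargetAlert code:

```javascript
// A hypothetical helper written in the verbose, annotated style the
// Closure Compiler is designed to minify. Names are illustrative.
var targetalert = targetalert || {};

/**
 * @param {string} url The href of a link on the page.
 * @return {boolean} Whether the link target uses HTTPS.
 */
targetalert.isSecureLink = function(url) {
  return url.lastIndexOf('https://', 0) === 0;
};

// In Advanced mode, the Compiler flattens the namespace, renames
// symbols, and strips comments, emitting something like:
//   var a=function(b){return 0==b.lastIndexOf("https://",0)};
console.log(targetalert.isSecureLink('https://example.com/')); // true
console.log(targetalert.isSecureLink('http://example.com/'));  // false
```

The annotations cost bytes in the source, but none of them survive compilation, so readability is free in the shipped artifact.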

When deploying a packaged browser extension, there are no caching benefits to consider: if your extension includes a copy of jQuery, then it adds an extra 30K to the user's download, even when gzipped. To avoid this, your extension could reference https://ajax.googleapis.com/ajax/libs/jquery/1.6.1/jquery.min.js (or equivalent) from its code, but then it may not work when offline. Bear in mind that in some parts of the world (including the US! think about data plans for tablets), users have quotas for how much data they download, so you're helping them save a little money if you can package your resources more efficiently.

Further, if your browser extension has a content script that runs on every page, keeping your JS small reduces the amount of code that will be executed on every page load, minimizing the impact of your extension on the user's browsing experience. As users may have many extensions installed, if everyone starts including an extra 30K of JS, then this additional tax can really start to add up! Maybe it's time you gave Closure a good look, if you haven't already.

Want to learn more about Closure? Pick up a copy of my new book, Closure: The Definitive Guide (O'Reilly), and learn how to build sophisticated web applications like Gmail and Google Maps!

7 comments:

  1. Man, I really dig the intent of your post, but Closure is just so painful to code in. The beauty of jQuery is in its simplicity and eloquence, without sacrificing power. Closure feels so much like Java, YUI and other unnecessarily verbose APIs.

    Particularly since this is an extension -- making caching irrelevant -- did you ever consider using jQuery, but compiling the entire app w/ Closure Compiler to reduce the amount of jQuery that gets included?

    ReplyDelete
  2. Devil's advocate: I wonder what percentage of machines have jQuery in their browser cache. JQuery *is* needlessly huge, and lots of it could be compiled away, but then I'd have to redownload all the common internals that every page uses.

    ReplyDelete
  3. @Jason: I'm guessing the cache is irrelevant for most apps, unless they all share the exact same version, from the *same URL*. Even then you have parse time to consider, which on mobile can add up quickly (easily a few hundred ms in this case).

    ReplyDelete
  4. I'm wondering why Chrome or other browsers don't support jQuery so that whenever there is a script loading jQuery or jQuery UI (and its CSS templates) from:

    http://code.google.com/apis/libraries/devguide.html#jquery

    It will just use the browser's local files.

    This way, all these Closure discussions can be avoided.

    ReplyDelete
  5. Akishore: You cannot just "compile jQuery in Advanced mode" as jQuery is implemented today. Two things that I am considering are (1) creating an alternative implementation of jQuery that is Closure Compiler friendly, or (2) creating an option for CoffeeScript so that it generates Closure-style JavaScript that can be compiled in Advanced mode. I'm much more interested in the latter, but would consider taking on the former as contracting work.

    ReplyDelete
  6. Felix: The focus should not be on jQuery, but on how browsers manage resources, in general. For the Google-hosted JS libraries, they should be served with the correct cache headers so that things basically work as you describe (when the browser sees the URL, it retrieves the file from the cache and never goes to the server).

    However, there are two problems with this solution. The first is that some sites do not want to depend on Google as a CDN and would rather use their own. Unfortunately, I don't believe there's a way in HTML/JS to say, "Here is a list of URLs to a specific version of jQuery: just load the first one that you have in your cache. If none are in cache, then just fetch the first one in the list." If such an API existed, you could easily use Google as your primary CDN but still use your own CDN as a backup.
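    In the meantime, a site can approximate that fallback by hand. A sketch (the local path is illustrative):

    ```html
    <!-- Try the Google CDN first; if jQuery failed to load, fall
         back to a locally hosted copy. -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.6.1/jquery.min.js"></script>
    <script>
      window.jQuery || document.write(
          '<script src="/js/jquery-1.6.1.min.js"><\/script>');
    </script>
    ```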

    The second problem is that this ignores parse/execution time, which increases with the size of the library. Even if you can always pull jQuery out of the cache when you need it, it still has to be parsed (though the parse tree could also be cached) and executed on each page that uses it. Therefore, as the library grows, you take more of the user's time.

    ReplyDelete
  7. Jason/joel: Here is some good evidence that many users should have some version of jQuery (that is hosted by Google) cached in their browser:

    http://www.naveen.com.au/javascript/jquery/reason-why-i-let-google-host-jquery-for-me/486

    However, as the author notes, sites do use different versions of jQuery, which means they fetch different URLs, which eliminates any benefit from caching.

    As I noted in my response to Felix, it's also important to consider the additional execution time that results from loading a larger JS library, even if it is cached.

    ReplyDelete