Queue events that occur before JavaScript is loaded

One of the common recommendations for speeding up your website is to put your JavaScript at the bottom of your page instead of including it inside the head tag. The difference this simple change can make is impressive, especially if you are dealing with sizable JavaScript libraries, which often weigh in at 50k or more.

One downside of putting your JavaScript at the bottom is that your fast-clicking visitors may click on links that won't work, because the JavaScript those links trigger hasn't been downloaded yet. Those links will usually work on the second or third try, but it makes for a bad user experience and a poor first impression.

I decided to fix it by queuing up those user-triggered actions and replaying them as soon as the document is ready. I wrote a wrapper that I can use anytime I have code that depends on my JavaScript being downloaded and available.

The concept is simple. Instead of calling functions directly when a user triggers an action, I add the function call to a queue. When the document is ready, I loop through the queue and execute each action in the order it occurred. I put this code inline inside my head tag so it is available as soon as possible. The rest of my JavaScript can then be included right before the closing body tag without worrying about this race condition between the browser and the website visitor.

<script type="text/javascript">

var loaded = false;
var action_queue = [];

function when_ready(callback) {
    // run immediately if the document has already loaded
    if (loaded) {
        eval(callback);
    } else {
        action_queue.push(callback);
    }
}

function dequeue_actions() {
    loaded = true; // flip this first so anything queued mid-replay runs immediately
    for (var i = 0; i < action_queue.length; i++) {
        eval(action_queue[i]);
    }
    action_queue = []; // clean up after ourselves
}

</script>

I then trigger dequeue_actions() as soon as the document is ready:

// using jQuery
$(document).ready(function(){
    dequeue_actions();
});

// or with plain JavaScript
window.onload = dequeue_actions;

You can then safely make function calls using when_ready(). For example:

<a onclick="select('foo')">foo</a>

becomes

<a onclick="when_ready('select(\'foo\')')">foo</a>

In my testing, the results have been smooth, with delays that are almost unnoticeable. Of course, your experience will vary depending on the size of your document and how long it takes for your document to be ready.

This code is pure JavaScript and should work in every modern browser. I’ve tested it in IE6+, FF2+ and Safari 3+.

  • Interesting technique. I've not seen this queuing method before, but I have seen another method of dealing w/ this problem that is very similar. Basically, the idea is that when you load the javascript links initially you give them dummy functions that queue the function call, but then replace the href once the javascript is loaded (and execute the queue). It avoids the wrapper but at a rather expensive replacement of the links. I think your idea of a wrapper function is a bit cleaner, but generally eval is pretty bad with large functions/bodies of code in my experience, so the other technique may work better if it's any more complicated than a straight function call.

    One thing you might consider to get a slight bit of extra performance is adding the dequeue_actions(); call to the end of the javascript you're loading (assuming you control the script and it's not loaded from a third party). This will allow the js to execute as soon as the necessary javascript is loaded instead of waiting for the onload event (which on a heavy page could be noticeably later).
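    A rough sketch of that link-replacement idea, for comparison (the `pending` queue, the `action` property on the links, and the handler names here are all made up for the example):

```javascript
// Rough sketch of the dummy-handler approach: links start out wired
// to a stub that queues the action name; once the real JS is loaded,
// the stubs are swapped for direct calls and the queue is replayed.
var pending = [];

// stub available before the main JS loads
function queue_click(action) {
    pending.push(action);
}

// called once the real functions exist; "handlers" maps action
// names to the real functions, "links" is a list of link objects
function replace_stubs(links, handlers) {
    for (var i = 0; i < links.length; i++) {
        (function (link) {
            link.onclick = function () {
                handlers[link.action]();
            };
        })(links[i]);
    }
    // replay anything clicked before the swap
    for (var j = 0; j < pending.length; j++) {
        handlers[pending[j]]();
    }
    pending = [];
}
```

    As you say, the expensive part is walking and rewriring every link, which is why the wrapper function feels cleaner for simple calls.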

    • I like your suggestion of calling dequeue_actions() sooner. It might get tricky if you are including multiple javascript files in a non-blocking fashion. If you're loading them via standard <script> tags it would be simple to add an inline <script>dequeue_actions()</script> right after your includes even if they're on different domains.

      A more complicated version could easily take an array of scripts to be lazy-loaded and then watch the onload/onreadystatechange for each of them & only call dequeue_actions() once they were all done. I found some code that would be a great start for anyone interested in taking that on: http://www.nczonline.net/blog/2009/06/23/loading-
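      Something like this could watch each script and only fire the callback once all of them are in (a rough sketch; the onreadystatechange branch is there for older IE):

```javascript
// Rough sketch: lazy-load an array of scripts and call "done"
// (e.g. dequeue_actions) only after every one has finished loading.
// Most browsers fire onload; older IE fires onreadystatechange.
function load_scripts(urls, done) {
    var remaining = urls.length;
    var check = function () {
        // older IE reports a readyState; others just fire onload
        if (!this.readyState || this.readyState === 'loaded' ||
                this.readyState === 'complete') {
            this.onload = this.onreadystatechange = null; // fire once
            if (--remaining === 0) {
                done();
            }
        }
    };
    var head = document.getElementsByTagName('head')[0];
    for (var i = 0; i < urls.length; i++) {
        var script = document.createElement('script');
        script.onload = script.onreadystatechange = check;
        script.src = urls[i];
        head.appendChild(script);
    }
}
```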

      • Jon

        Yes, this is true. My main concern with 3rd party scripts is that the end of the first script might not be the end of the necessary JS. Often remote scripts will then load multiple other objects/scripts that may be necessary for the code to function properly. And since it's a remote script, even if this isn't the case today, it could be tomorrow 😉 It's a safe bet if it's your own JS though that you know when the last bit of JS you need is and can insert it there. You can even include it as a conditional parameter in the script request if you don't want it to load all the time. Like:
        http://mydomain.com/js/script.js?callback=true

        Or something similar, which can then be set up to only include the function call when that parameter is set.

  • Chris G

    elegant, simple solution. for very slow connections, the action queuing may throw people off since nothing happens at first and then everything happens at once.

    another solution would be to have javascript links disabled and/or invisible at load time and then write a function that enables them all – but this might be slow for a page with lots of links. better direction would be to disable the css block(s) where the javascript stuff is. it could also display a loading graphic while things are disabled. this would be more seamless but slower for the impatient user.
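    a rough sketch of the css-disable idea (the class names and helper names here are invented):

```javascript
// rough sketch: tag <html> with a class while the JS is loading and
// let CSS do the work, e.g.:
//   .js-loading a.needs-js { visibility: hidden; }
//   .js-loading .loading-graphic { display: block; }
function mark_js_loading() {
    document.documentElement.className += ' js-loading';
}

function mark_js_ready() {
    document.documentElement.className =
        document.documentElement.className.replace(/\s*\bjs-loading\b/, '');
}
```

    mark_js_loading() would go inline in the head and mark_js_ready() would run alongside the ready handler, so it's one class toggle instead of touching every link.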

    [reposted from facebook]

    • Thanks for re-posting this. That's a good point about people on slow connections — although I would argue they're like IE6 users, they're already pretty used to having a crappy experience on the web. 🙂

      When it comes to website performance, it's usually the perceived speed that matters. You can have something downloading for a minute behind the scenes and it's okay as long as the site still *feels* fast.

      What about just throwing up a loading indicator when something is added to the queue? That way you acknowledge the user's action but still keep the perception that the page loaded in 2 seconds… even if we're still a few seconds away from being truly done.
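      A rough sketch of that, layered on the queue from the post (the "loading" element id is just a placeholder):

```javascript
// Rough sketch: show an indicator the first time an action is
// queued, hide it again when the queue is replayed.
var loaded = false;
var action_queue = [];

function set_indicator(visible) {
    var el = document.getElementById('loading'); // placeholder id
    if (el) {
        el.style.display = visible ? 'block' : 'none';
    }
}

function when_ready(callback) {
    if (loaded) {
        eval(callback);
    } else {
        set_indicator(true); // acknowledge the click right away
        action_queue.push(callback);
    }
}

function dequeue_actions() {
    loaded = true;
    set_indicator(false);
    for (var i = 0; i < action_queue.length; i++) {
        eval(action_queue[i]);
    }
    action_queue = [];
}
```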

  • Have you tried using the global window.onerror to queue events instead of changing the function names? I guess it might not be easy to support in all browsers and you'd have to have some good regex-fu going.

    • Is there a way to track it back to know exactly which function caused the error? It feels like it would be really hard to isolate errors from JS not being loaded from anything else that could happen.

      • I think it would be really hard. I've just been coding in php and was trying to think of some way to make it work like __call, __get, __set, etc. I'll have to try it out and see what I can come up with.