
Client Side: Processing vs Responsiveness

The examples below are probably best run in a browser that has a relatively slow script engine (IE for example)

One of the problems with the whole Web2.0/Rich Client concept is that people seem to think it's OK to shove processing that should be done on the server onto the client. This is never the answer - you can control resources on the server but never on some random browser half way across the world/country/street/room. There are, however, situations where client side processing is going to be mandatory (think complex grids, calculations etc). This obviously leads to a loss of responsiveness on the client (to varying degrees). One of the big problems with heavy processing is that it obviously takes time. Imagine this scenario...

function doProcessing(){

  /* get the progress indicator element */
  var el = document.getElementById("progress-meter");

  /* update element */
  el.innerHTML = "Processing";

  /* do long running processing */
  for(var i = 0; i < 999999; i++){
    var j = Math.sqrt(i);
  }

  /* update element */
  el.innerHTML = "Done!";
}

doProcessing();

You'd expect this code to update the "progress-meter" element with "Processing", do some long computation and then update "progress-meter" again to indicate it's done. But it doesn't. Screen repaints/updates do not happen until the current call completes. So what is actually happening? The expected outcome DOES happen, it's just that both updates occur after the call completes, so the first update is replaced so quickly you never see it. See It In Action. You'll also notice that the entire screen is locked while this processing occurs, and that just leads to a bad user experience. So what can be done?
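This blocking behaviour is easy to verify with plain timers, even outside the browser: a task queued before the loop (which is effectively what a repaint is) cannot run until the current call returns. A minimal sketch:

    /* Demonstrates that queued tasks (like repaints) cannot run
       until the current synchronous call completes. */
    var order = [];

    setTimeout(function(){
      /* Queued before the loop starts, but only runs after it finishes */
      order.push("timer fired");
      console.log(order.join(" -> "));
      /* logs: blocking loop finished -> timer fired */
    }, 0);

    var start = Date.now();
    while(Date.now() - start < 50){
      /* busy-wait ~50ms, simulating the long-running processing loop */
    }
    order.push("blocking loop finished");

Even though the timer was scheduled first with a 0ms delay, it cannot jump in while the loop is running - exactly why the "Processing" message never appears in the example above.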

Use of setTimeout()

Javascript is single threaded, which leads to the issues above, but there is a way to create pseudo-threading using setTimeout. The setTimeout function executes code after a set time in milliseconds (what a surprise!). Delaying the execution (even by 0 milliseconds) allows screen updates to occur before the processing begins. This requires a small change to the code.

function doProcessing(){
  var el = document.getElementById("progress-meter");
  el.innerHTML = "Processing";
  setTimeout(function(){
    for(var i = 0; i < 9999999; i++){
      var j = Math.round(Math.sqrt(i));
    }
    el.innerHTML = "Done!";
  },0);
}

doProcessing();

So if you give this a go you will see there is much better feedback compared to the first example. This approach solves our initial problem, but we still have the issue that the screen locks up. At least now the user knows something is happening. Another issue this creates is that some browsers prompt users if scripts take too long, allowing the user to cancel potentially important tasks.


Asynchronous Processing

Now we are getting into the good stuff. To prevent the screen locking up during any complex processing activity, it is possible to apply the setTimeout technique over a smaller subset of the processing activity each time. This achieves a sort of asynchronous effect, allowing the user to do other stuff while the processing happens in the background. Here is a basic example:

function complete(){
  alert("Processing Complete");
}

function doProcessing(callback){

  var el = document.getElementById("progress-meter");
  el.innerHTML = "Processing";

  /* setup iteration */
  var iterations = 9999999;
  var chunkSize = Math.round(iterations/100);  /* each chunk is 1% of the overall processing count */
  var i=0;

  /* self executing anonymous function */
  (function() {

    /* process one chunk */
    for(var count = 0; i<iterations; i++) {
      var j = Math.round(Math.sqrt(i));
      count++;
      if(count == chunkSize)
        break;
    }

    /* more to process */
    if(i<iterations){
      /* update screen */
      el.innerHTML = Math.round((i/iterations)*100) + '% Complete';

      /* recurse for next segment */
      setTimeout(arguments.callee,0);
    }else{
      /* update screen */
      el.innerHTML = "Done!";

      /* call optional callback */
      if(callback){ callback() }
    }
  })();
}

doProcessing(complete);

You can see this in action for yourself. If you run the example you will see that there is a constant update (after each chunk is processed) and that screen control is still available. So this has solved both our issues but has introduced new considerations.

  • Execution is asynchronous, so we need to support a callback mechanism as the code is no longer procedural (see code)
  • Processing time increases (see below)

You will see that the length of time it takes to process the complete task is a lot longer than in the other two examples. This is due to the increased complexity of the code (recursion, another function call etc) and this is the trade off. Processing time is inversely proportional to the size of each processing chunk.

[Figure: Chunk Size (y) vs Response Time (x)]


As the chunk sizes get smaller you get a more responsive UI, but the processing time can get scarily long. The example chunks at about 1% of the total task size, and that seems to be a good balance (for this task at least) between responsiveness and processing time. I threw together a little suite of yielding techniques with various update times etc and you can play around with that here.
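The trade-off above can be made explicit by pulling the chunk size out as a parameter. This is a sketch of a reusable helper, not code from the post - the names (processInChunks, onProgress, onDone) are my own:

    /* A reusable sketch of the chunked-processing technique,
       with the chunk size as a tunable parameter. */
    function processInChunks(iterations, chunkSize, workFn, onProgress, onDone){
      var i = 0;
      (function nextChunk(){
        var end = Math.min(i + chunkSize, iterations);
        for(; i < end; i++){
          workFn(i);
        }
        if(i < iterations){
          onProgress(Math.round((i/iterations)*100));
          setTimeout(nextChunk, 0);  /* yield so the UI can repaint */
        }else{
          onDone();
        }
      })();
    }

    /* Usage: same square-root busy work, 1% chunks */
    var updates = 0;
    var done = false;
    processInChunks(100000, 1000, function(i){
      Math.round(Math.sqrt(i));  /* the busy work */
    }, function(pct){
      updates++;  /* in a browser this would update the progress element */
    }, function(){
      done = true;
      console.log("Done after " + updates + " progress updates");
    });

Using a named function expression (nextChunk) instead of arguments.callee keeps the recursion readable. Note that browsers clamp deeply nested setTimeout delays to a minimum (around 4ms), so very small chunks pay that fixed cost many times over - another reason processing time balloons as chunk size shrinks.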

Published in JavaScript on October 18, 2010