The other day I was in the local coffee shop feeding my addiction. I couldn’t help but notice that the guys behind the counter were speaking in a foreign language. We’ve all heard this language: “Double cap out, no foam”, “Chai mocha skinny half with space”, “Moon unit alpha, tennis elbow”. It sounds like some kind of indecipherable language – the secret code of the Illumilatté, if you will.
I realise, of course, that these contractions are inevitable when you’re performing a task over and over again. To bring it a bit closer to home, they’re like keyboard shortcuts. But really, I’ve never seen any barista actually save any time by leaving off the “-puccino”. So what’s happening here?
The answer, of course, is perceived speed. It doesn’t actually have to save any time, as long as it gives the impression of saved time.
A good example is this: setting your microwave for one minute and eleven seconds actually takes less time than setting it for a minute flat – because you don’t have to move your finger from the 1 button. 1-1-1 is much easier to type than 1-0-0. Of course, the microwave will be running for a good eleven seconds more, but you aren’t sitting there watching it for eleven seconds – you’re off dicing and slicing, or whatever people do when they cook. I cook by phone.
A lot of this stuff has to do with the efficiency of data structures. If you keep all your receipts, you put them in chronological order when you store them, right? Each one has to go into its allotted position. Would you design a data structure like that, with all those expensive writes? You only ever pull all your receipts out of the box once a year, so that read action can afford to be slow compared with the hundreds of writes you might do. You should throw them into specific boxes (heh, sharding) and sort them in one go later on, or once a week, or whatever. It’s like a cron job (00 18 * * sun ./tidy_room.sh).
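The receipt-box idea can be put in code – a minimal sketch, assuming an invented ReceiptBox class (not any real library): writes are cheap appends, and the one expensive sort happens at read time.

```javascript
// Sketch of the receipt box: optimise for the hundreds of cheap writes,
// and pay the sorting cost on the single yearly read.
// (ReceiptBox is a made-up name, purely for illustration.)
class ReceiptBox {
  constructor() {
    this.receipts = []; // the unsorted shoebox
  }

  // O(1) write: just throw the receipt in the box.
  add(receipt) {
    this.receipts.push(receipt);
  }

  // O(n log n) read: sort once, at tax time.
  allByDate() {
    return [...this.receipts].sort((a, b) => a.date - b.date);
  }
}

const box = new ReceiptBox();
box.add({ date: 20230405, total: 12.5 });
box.add({ date: 20230101, total: 3.0 });
console.log(box.allByDate()[0].date); // → 20230101, the earliest receipt
```

Appending is instant every single time; the sort only runs when you actually need the ordering, which is rarely.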
I was reading a great article the other day at Assert True arguing that twice-as-fast isn’t good enough – a speed-up factor of 10 is what’s required to really make any difference. But that really only applies to foreground actions, which are high in cognitive interaction. If something locks up my computer, then a 20-second lockup and a 40-second lockup are pretty similar. They both interrupt my computing experience, and generally tempt me to wander over to the coffee machine.
What I’m effectively doing here is backgrounding the process. It’s still happening, it still takes time, but it requires none of my cognitive space. I only notice something’s happening when I’m actively waiting for it – and in this instance, I’m not. So if the application backgrounds the process for me, it’s doing exactly the same thing. If I can use the rest of the computer while it does its thing, then my workflow is pretty much uninterrupted.
This is why background processing is such an awesome tool. If you have to do 10 non-urgent things when an action takes place, put them all in a queue and report back straight away. This applies to more than just server-client requests, too. For example, when I was building Twiggy last week, I found that the jQuery .remove() function was really slow at removing all the tweets from the results page. That meant that when a new search took place, the whole application would lock up while jQuery cleared the list. .empty() and .html('') weren’t fast either. Nothing would make this bit of my application run more quickly.
What was fast, though, was adding a ‘hidden’ class to the tweets and hiding them. That was instant. So the trick was to hide the old tweets, then conduct the search. Now, the search itself has a huge delay – there’s at least a 3-second gap between clicking search and having the results available. The difference is, that dead time is expected. So I told jQuery to remove the old tweets during the searching process. It locked up the phone for the same amount of time, and probably added to the total search time, but when you’re waiting for the results to come back, you know you’re going to be waiting anyway. The really important thing is that hitting the “search” button has an immediate effect: the old results disappear, and the “searching” text comes up. Things are happening. Happy user!
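The hide-now, delete-later trick can be sketched without jQuery at all. This is a hedged illustration – ResultsList and runSearch are invented names, not the actual Twiggy code – where hiding is a cheap flag flip and the expensive purge runs while the search request is already in flight.

```javascript
// Hiding is instant (one flag per tweet); real deletion is deferred into
// the dead time the user already expects while the search is running.
class ResultsList {
  constructor() {
    this.tweets = [];
  }
  show(texts) {
    this.tweets = texts.map(text => ({ text, hidden: false }));
  }
  hideAll() {
    for (const t of this.tweets) t.hidden = true; // the cheap bit
  }
  purgeHidden() {
    this.tweets = this.tweets.filter(t => !t.hidden); // the slow bit
  }
}

async function runSearch(list, fetchResults) {
  list.hideAll();                  // immediate: old results vanish
  const pending = fetchResults();  // kick off the slow network request
  list.purgeHidden();              // expensive cleanup, hidden in the wait
  list.show(await pending);        // fresh results arrive
}
```

The user sees the old results disappear the instant they hit search; the teardown cost gets paid inside a delay they’d already written off.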
In interaction science, we’re told that the human brain needs less than 50ms response time between an action and its perceived reaction. Any longer, and it feels as though the system is lagging behind. Symbian, buddy – that’s you. That’s why the ubiquitous AJAX loading gif was invented – it doesn’t actually tell you anything. There’s no indication of how long the action might take, but the main communication is there – something is happening. And that’s important. You’ve acknowledged my request straight away, and told me you’re working on it.
In truth, the difference between the tweet-clearing and the asynchronous searching can be huge. On a cellphone, where latency is really high, the search might take 10 times as long as clearing out all the tweets. But that’s fine – the user expects it. What I can do, therefore, is hide all the other slow interactions inside this search query, where they pale into insignificance. Better yet, I can spread the invisible work out over the course of the application’s usage. If I remove one hidden result every half a second, it costs almost no perceived time at all. It’s effectively a process queue.
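Spreading the cleanup out might look like this – a sketch with an invented DrainQueue name, where each tick removes a single hidden item so no individual step is noticeable:

```javascript
// Instead of one big lock-up, remove one hidden item per tick.
// (DrainQueue is a made-up name; the 500ms default matches the half-second
// cadence mentioned above.)
class DrainQueue {
  constructor(items) {
    this.items = items;
  }
  // One small, unnoticeable step; returns how many items remain.
  tick() {
    this.items.pop();
    return this.items.length;
  }
  // Drain one item every `ms` milliseconds until nothing remains.
  start(ms = 500) {
    const timer = setInterval(() => {
      if (this.tick() === 0) clearInterval(timer);
    }, ms);
    return timer;
  }
}

// e.g. new DrainQueue(hiddenTweets).start(500) – one removal per half-second.
```

Each individual removal is well under the threshold anyone would notice, so the total cost effectively disappears.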
So if your app does loads of stuff on a request – emailing 10 people, processing credit card payments, or generally doing ANYTHING with the filesystem – run that process in the background. I use BackgroundJob in Rails for its simplicity, but there are others, like BackgroundFu and Starling, with different feature sets. Some use multiple processes to run the jobs, and some keep the queue in memory. Pick the one that suits your jobs best.
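The queue-and-report-back pattern itself is tiny. Here’s a minimal in-memory sketch – JobQueue is an invented name, a toy in the spirit of those libraries rather than any of their actual APIs:

```javascript
// Enqueue the slow stuff during the request; run it later in a worker.
class JobQueue {
  constructor() {
    this.jobs = [];
  }
  // Called during the request: O(1), so we can respond immediately.
  enqueue(job) {
    this.jobs.push(job);
  }
  // Called later, by a worker loop or a cron-style poller.
  work() {
    while (this.jobs.length > 0) {
      this.jobs.shift()(); // send the email, resize the image, hit S3...
    }
  }
}

const queue = new JobQueue();
queue.enqueue(() => console.log("emailing 10 people"));
queue.enqueue(() => console.log("uploading to S3"));
// The request returns here; a worker calls queue.work() in the background.
```

The request handler only ever pays the cost of a few array pushes, which is exactly why it can report back straight away.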
Either way, remember to keep stuff as snappy as you can. This is an optimization, and premature optimization is the root of all evil – so do it reactively, once people are actually using your application. It’s really just intelligent optimization: instead of spending hours speeding up your code, leave the stuff that doesn’t need to happen straight away until later, when it can be done in the background. Uploading to S3, rotating and cropping images – there’s loads of stuff. If you have a process that generates big ugly graphs, run it once every six hours and memcache the result, because there’s nothing worse than a 15-second page load.
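The big-ugly-graphs idea reduces to a time-to-live cache. A sketch, with invented makeCache and generate names (memcached would play this role in a real Rails setup); the injectable clock is only there so the behaviour is easy to verify:

```javascript
// Regenerate the expensive result at most once per TTL; serve the cached
// copy for every other request.
const SIX_HOURS_MS = 6 * 60 * 60 * 1000;

function makeCache(generate, ttlMs = SIX_HOURS_MS, now = Date.now) {
  let value = null;
  let expiresAt = -Infinity;
  return function cached() {
    if (now() >= expiresAt) {
      value = generate();       // the 15-second part, run rarely
      expiresAt = now() + ttlMs;
    }
    return value;               // every other hit is instant
  };
}

// const graph = makeCache(renderBigUglyGraph); calling graph() is now cheap.
```

At most one request every six hours eats the 15-second generation cost; everyone else gets the page instantly.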
All you need to remember is this: perceived speed is the only test of speed.
Photo: Burt Munro (Anthony Hopkins) on an Indian Scout