Friday, September 19, 2008

Burn your CPU Cycles

Doug (pictured) and I are working on a few new technical approaches at the moment, aiming to meet some stringent functional targets we've set for ourselves [1]. This has led to some curious insights, especially around how our approach is affected by some ingrained prejudices.

Whilst I'm hardly near being tagged with the elder-statesman euphemism, I must admit that I've got an antiquated aversion to certain types of waste - wasted CPU cycles and wasted memory [2]. This is perhaps a side-effect of coming into technology via embedded programming. Either way, even after years of working in and around resource-hungry Java environments, I've never shaken the feeling that this kind of waste is somehow lazy or wrong.

Paul Graham makes reference to good and bad waste in his essay, The Hundred-Year Language. He argues for the beneficial tradeoff of giving up some performance for elegance. We certainly subscribe to this. However, it's sometimes hard to separate the good waste from the bad [3]. For me, a good example is batch versus online (realtime) processing. If you've ever worked in technology at a large organisation, you've probably seen your share of batch processing.

Now, there are good reasons for batch processes. However, in my experience, it's more often a form of institutional laziness. Online integration is complex, harder to test and usually much more expensive to develop (upfront cost). A simple batch system is really simple. On the other hand, batch systems can become operational nightmares (ongoing cost). Compounding this, once you get beyond a certain threshold with batch processes the complexity goes through the roof [4, 5]. You can end up processing a lot of data that hasn't changed and simply didn't need processing. Even so, organisations still plough on with batch systems.
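
To put a rough number on the "data that hasn't changed" point, here's a toy sketch in Python. The record counts, the customer records and the no-op process function are all invented for illustration - the point is just that the nightly batch does a million units of work to pick up five thousand changes, while the online approach only ever does the five thousand.

    # A toy sketch of the unchanged-data problem. The record counts and the
    # no-op process() are made up purely for illustration.

    def nightly_batch(all_records, process):
        # Classic batch window: every record is reprocessed each night,
        # whether or not it changed since the last run.
        for record in all_records:
            process(record)
        return len(all_records)  # units of work done

    def online_processing(changed_records, process):
        # Online / event-driven: work happens as each change arrives, so
        # only the records that actually changed are ever touched.
        for record in changed_records:
            process(record)
        return len(changed_records)  # units of work done

    if __name__ == "__main__":
        customers = [{"id": i} for i in range(1000000)]
        changed_today = customers[:5000]  # suppose 0.5% of customers changed

        def process(record):
            pass  # stand-in for the real work

        print("batch work units: ", nightly_batch(customers, process))
        print("online work units:", online_processing(changed_today, process))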

However, there is another key to all of this that is sometimes overlooked - your batch process is probably leading to a poor end user experience.

If you've ever requested something and been told "Sorry, it won't be processed until the next business day", then there is probably a batch system somewhere to blame. At one organisation I was at, you could change your address online (great!), but it took three days for this to actually trickle down to all the necessary downstream systems. You can probably guess that this regularly led to some unhappy customers.
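
For what it's worth, the online alternative to that address-change feed doesn't have to be exotic. Here's a rough sketch, again in Python - the AddressChangeBus and the downstream handlers are invented names standing in for whatever messaging and systems a real organisation would have:

    # A tiny in-process pub/sub, standing in for real messaging middleware.
    # All the names here are invented for illustration.

    class AddressChangeBus:
        def __init__(self):
            self.subscribers = []

        def subscribe(self, handler):
            self.subscribers.append(handler)

        def publish(self, customer_id, new_address):
            # Every downstream system hears about the change straight away,
            # instead of waiting for the next overnight extract.
            for handler in self.subscribers:
                handler(customer_id, new_address)

    def update_billing(customer_id, new_address):
        print("billing system updated for customer", customer_id)

    def update_statements(customer_id, new_address):
        print("statement system updated for customer", customer_id)

    bus = AddressChangeBus()
    bus.subscribe(update_billing)
    bus.subscribe(update_statements)

    # The customer changes their address online and it propagates
    # immediately, not three days later.
    bus.publish(42, "1 New Street, London")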

While Doug and I have been working towards these targets, we have come up with a few ideas that will be processing intensive. The instant instinct was always to think "no way, that'll use a whole lot of resources", usually followed by "... what a waste". However, is it really a waste? I've come to the conclusion that this is not the right way to look at the problem.

This type of mentality locks you into giving users what they have today. You're defining parameters around the user's experience that align with some (subconscious) mental model of a level of resource consumption. The fact is that the next generation of successful applications will be the ones giving users more... And to give them more, you're going to have to crank up the output.

In my view, a lot of developers are overly fixated on performance when they should be focused on scale. Performance is important, but scale is just fundamental. If you can give your users a better experience, and your application can scale out - then it's easy to see that you're probably on a winner [6].

If you want to add that Web 2.0 tagging system to your widget, then you're going to have to use more resources. Working on a better search? Well, I think it's a safe bet that it's going to use a lot more resources. I recall when I first saw YouTube my immediate reaction was "they're crazy - this is going to crash and burn - they're just going to waste a whole lot of money on bandwidth". My fixation with the bandwidth consumption blinded me to the fact that it was giving people something that they really wanted [7].

So we have a new rule here - the computer works for the user. If we hit an example where we're using more resources, but improving the end user experience as a result, we take that as a sign that we're on the right track. We even push it where we can - we look for opportunities to use more resources, asking ourselves if it's improving the outcome for the user.

... Then the challenge for us is to make it scale.

jon@jodoro.com


[1] Partially to give ourselves a stretch target to broaden our thinking, but primarily to attain a target user experience.
[2] This reminds me of my Grandmother and her Post-WWII frugality. She couldn't tolerate waste of any kind. Much to her annoyance, I could never like Bubble and Squeak.
[3] If you've ever worked at a project-driven organisation you'll be aware of the bias to run a project cheaply, usually ignoring the higher long term cost.
[4] When I was at a large UK bank, the batch used to run overnight. It was a pretty busy batch window - once everything had run, there were only 14 minutes to spare, so god help them if something went wrong. These same batch windows were also an impediment to doing simple things, like full 24/7 online banking or opening branches on a Saturday.
[5] I also recall some developers who used to maintain a large batch operation. They started out using Microsoft Project to control all the windows and the dependencies (using the levelling feature). Eventually it got so complex they had to write a bespoke tool just to maintain the schedules - let alone handle the complexity of actually monitoring and running the batch itself.
[6] Naturally, there is a cost-benefit ratio coming in there somewhere. If you've ever needed to fix an application that doesn't scale, as opposed to one that is just plain resource hungry, then you'll know there is a world of difference.
[7] I had a similarly incorrect first reaction to Flickr.

2 comments:

Abhishek Mishra said...

Saw your processing monsters link at HN, got interested in processing - look what I did: http://ideamonk.blogspot.com/2008/11/some-work-in-process-progress.html

Well, I've just started - eager to program interactions in processing :)

Paul Szymkowiak said...

>Now, there are good reasons for batch processes. However, in my experience, it's more often a form of institutional laziness.


I agree that can be true; and I think there are other possible considerations at play.

I think another is related to change (due to fear or laziness). Batch processes have been around for longer: it's really where we started in software. Older institutional tools, processes and people typically hark back to those beginnings. So it's what is known, tried and trusted. Yes, online processing is often seen as riskier because it's more complex, but also because it's not as well known.