Friday, 15 June 2007

Why is progress in computer software so glacial?

It seems that we're still living out the tail-end of Romanticism. Nowhere is this more evident in the UK than in the popular obsession with cookery, and everything which appertains to it. Cooks, cooking, and cookery books dominate the bestseller lists and the TV schedules. It's the country's prime artistic medium and means of self-expression, and the public is infatuated with celebrity chefs and their dynamic, anti-conventional, artist-hero personas. Gordon Ramsay is Byron, and Heston Blumenthal is Shelley (in the sense of Shelley's experiments with electricity and magnetism, not his interest in proto-socialism, casual sex with jailbait, and advocacy of vegetarianism).

Behind the scenes, of course, things are rather different from the romantic-hero image. Deluxe restaurants are slickly-run operations, far more regimented and rigorous than the average business. The working day is very long, and precisely scheduled. There are complicated and well-protected supply chains. Marketing is vital. There's a clear line of command, and everyone's job is a small cog in the big machine - clear, repeatable, endlessly repeated. The temperatures are sweltering but, in spirit, the chef de cuisine, sous chef, chef de partie, rôtisseur, aboyeur etc. all need to be "cucumber-cool machine minders" (Auden, Horae Canonicae).

Apart from a complete lack of public interest, the position in IT is almost identical. The public perception - inasmuch as there is one - is of the lone hacker, the guru. The reality is a series of highly segmented jobs, of increasingly minute specialisation. No one "knows about computers", or is competent to do more than a handful of jobs in the field. The system administrators can't program, and the programmers would cause utter chaos if they were allowed to fiddle with your company's IT infrastructure. Each of the programmers has an area of specialisation about which the others know little or nothing. The boss can't do any of what his team does. (The more someone earns in IT, the less likely they are to be able to fix your computer.)

In this segmented world, most of the larger entities which produce software - either for internal use, or for sale - separate out the jobs of designing the software and writing the actual code. (Most of them segment things still further, and have one person to identify a "business need" and another to design the software which meets that need. Someone else then codes it. Salaries go in descending order.)

These designers (mostly) can't write code. They (mostly) know no more about programming than you do - barring a sort of cargo-cult knowledge acquired from sitting next to the bearded blokes in T-shirts every day.

Therefore, the designers don't know what's technologically possible and impossible. They only know what they've seen done before. Their designs are necessarily mimetic. They'll produce solid, sensible designs in an accepted mould all day long, but they're never going to innovate. Their employers like this. They're risk-averse. That's why they segment the jobs. Which is fine for them, but it means that progress in computer software en masse is glacial.

Therefore, innovation and improvement in computer software almost always comes when there's no - or very, very little - separation between the people designing the software and the people writing the code. That tends to imply small, young companies - early Apple, or early Google. They keep the fire burning as they grow larger by buying other small, young companies (though Google have tried to institutionalise the phenomenon through their famous "20% time").

Or Matchpeg. We'll never have a job title including the words "Analyst" or "Architect". "Product Manager" is also verboten. We just deliver results.

(Part of an occasional series on matters technological. It beats doing real work.)