Backward Compatibility                 Apr 2007

 

 

Many people complain endlessly about the evils of Microsoft, or of ‘Wintel’ – the duopoly formed by Windows and Intel. The software is clumsy, full of bugs, overweight, prone to security hacks and so on.

 

Now quite a lot of this is true, but you don’t often hear people discussing the reasons why it is so. Trust me, Microsoft does not set out to produce clumsy, inefficient software; and one thing almost everyone agrees on is that they have a lot of very smart people working for them. So why don’t they just make it a lot better?

 

There are two or three reasons why Windows is the way it is, and I would like to explore them a little.

 

Firstly, you have to realise that Microsoft has to support an absolutely massive marketplace. I don’t know the number of PCs in the world right now; it changes daily and must be several hundred million, maybe even approaching a billion. Not all of them are running Windows, but an awful lot of them are.

 

Anyone who has developed software for end users knows that people are dashed awkward customers; they will persist in using the software in ways it was never intended to be used. Sometimes this is accidental, sometimes deliberate, and sometimes just plain perverse. Interestingly, users can sometimes find ways of doing clever things with software that the developers never predicted or planned for, and so extend the power of the product beyond its design envelope. But many times they will find holes in the software that nobody else has found.

 

So why doesn’t Microsoft test its software? Well, of course it does. I understand that the software testing facility at Microsoft extends over tens of thousands of square feet with thousands of different hardware/software test combinations, and employs more people than the combined development teams. Having worked for a software company myself, I can well believe it. Modern software systems are among the most complex structures ever created on this planet. Testing them is incredibly difficult, and is made much more so by the huge number of hardware combinations produced by PC peripheral companies. You can create an almost unlimited number of different configurations of hardware, together with third-party software packages, so it becomes virtually impossible to test all possible operations of the software on all possible platform combinations. This is why companies like Microsoft rely heavily on releasing beta versions to the development community, effectively increasing the testing population many times over.
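
To get a feel for the scale of the problem, here is a back-of-envelope sketch in Python. The component counts are entirely invented (they are not real figures from Microsoft or anyone else); the point is simply how quickly the number of distinct test platforms explodes.

    from math import comb

    # Purely illustrative counts of hardware variants a tester might face.
    hardware_variants = {
        "graphics card": 40,
        "sound card": 20,
        "network adapter": 30,
        "printer": 100,
        "motherboard chipset": 25,
    }

    platforms = 1
    for component, count in hardware_variants.items():
        platforms *= count
    print(f"Distinct hardware combinations: {platforms:,}")   # 60,000,000

    # Now allow any 10 packages from a modest catalogue of 200 third-party
    # applications to be installed together on a single machine.
    print(f"Ways to choose the installed software: {comb(200, 10):,}")

Multiply the two figures together and it is obvious why exhaustive testing is out of the question.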

 

As if that wasn’t bad enough, there is also the problem nowadays of deliberate attacks on the software by hackers who will spend endless amounts of time and intellectual effort to find ways to subvert new versions of Windows and its main applications. And though Microsoft can try to predict and prevent these attacks, it is a well-known adage that attack always manages to keep one step ahead of defence. People often point to the superiority of Apple and Linux systems because they do not suffer the same security loopholes as Windows. But this conveniently ignores the fact that the massive number of Windows systems in the world makes them a much more tempting and juicy target than either Apple or Linux. That is not to say that these other systems are not inherently more secure – they probably are – but you cannot really be sure of that unless they are exposed to the same level and intensity of attack.

 

One of the reasons that Apple and Linux may be more secure is that they have grown up through a different development path. For one thing, they do not have (in Apple’s case, anyway) to support the same huge number of possible configurations that Microsoft does. One of the reasons (and there are several) that Linux does not rule the world like Windows is that it simply does not support all the things that Windows does. I know from my own experience that, though I might sometimes be tempted to switch operating systems, doing so would almost certainly leave me with some bits of hardware or software that no longer worked. And that is a much bigger problem than the occasional glitches or virus attacks that I might get with Windows. One can, after all, take care of those with suitable software packages.

 

Of course, there are times when I sigh and wish I could get hold of a nice, clean, elegant, efficient, effective, reliable operating system. I used to dream about developing one myself, but that task has long since grown beyond anything a single person could attempt.

 

Now I would like to mention another major reason why things like Windows are the way they are – backward compatibility. True, you can no longer run all the DOS programs you used to have (especially the games); but you can still run many programs that are years old. Big though Microsoft is, it simply has to keep this in mind all the time. If it produced a new version of Windows that did not support the massive number of extant Windows applications, it would rapidly go out of business.

 

Now this doesn’t stop them from reconstructing the base of their software, because they can do this and still keep the same interface to applications software (the API, as it’s known in the trade). Or at least, that is the theory. In practice, it is not as simple as that, and any major internal change is likely to cause considerable grief. And making major changes to the large, complex pile of software that Windows has become is always going to be problematic.
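
As a rough sketch of the idea (in Python rather than anything resembling real Windows code, and with an invented function name), the trick is that applications only ever call a published interface, while the machinery behind it can be replaced wholesale:

    class _LegacyEngine:
        """The original, battle-tested internals."""
        def move(self, window_id: int, x: int, y: int) -> bool:
            return True   # old, convoluted but well-understood logic

    class _RewrittenEngine:
        """New internals: cleaner and faster, but they must reproduce every
        observable behaviour (and quirk) of the old engine."""
        def move(self, window_id: int, x: int, y: int) -> bool:
            return True   # new logic

    # Swapping the engine is invisible to callers of the published API below.
    _engine = _RewrittenEngine()

    def move_window(window_id: int, x: int, y: int) -> bool:
        """Published API: this signature and behaviour must never change,
        because countless existing applications depend on it."""
        return _engine.move(window_id, x, y)

The grief comes when applications depend, knowingly or not, on undocumented quirks of the old internals; then even a faithful-looking rewrite will break them.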

 

Sometimes it is possible to completely rewrite the internals of a software package, but this is a pretty brave step to take, and it is normally only attempted early in the life of a product, before the user base has grown too large to take risks with.

 

Some of the same constraints apply to the hardware. When Intel set out on its long path of microprocessor development (not all of which it saw coming), it started out with very simple processors: the 4-bit 4004 and then the 8-bit 8080. For many years it increased the complexity of its processors (8086, 80286, 80386, 80486 – and the radical iAPX 432, which flopped in the market) while still keeping the same basic architecture. So each new processor had to support the instruction set and addressing scheme of its predecessor. This nearly lost Intel the lead for a while when Motorola produced the best chip of its time, the 68000, which had a much cleaner address architecture than Intel’s. To be honest, I am not sure how much of those early chips is still embedded in the latest Core 2 Duo processors, but I suspect some traces remain.
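
One concrete trace is the 8086’s segmented addressing scheme: every x86 descendant still wakes up in 8086-compatible ‘real mode’ at power-on, before the firmware and operating system switch it into a modern mode. A tiny sketch of the arithmetic (the function name is mine, purely for illustration):

    def real_mode_address(segment: int, offset: int) -> int:
        """Physical address on the 8086: segment * 16 + offset,
        wrapped to the original 20-bit (1 MB) address space."""
        return ((segment << 4) + offset) & 0xFFFFF

    # The original 8086 began execution at the very top of its 1 MB space,
    # with CS = 0xFFFF and IP = 0x0000.
    print(hex(real_mode_address(0xFFFF, 0x0000)))   # 0xffff0

That 1 MB limit, and the famous 640 KB barrier that came with it, haunted PC software for a couple of decades after the hardware had long outgrown it.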

 

There are many other examples of how backward compatibility can stifle change and development. The problem is always acute when there is a massive installed base of any product; the cost of replacing all of it is often too great to make the change worthwhile. Take electrical power sockets, for example. When I travel the world, I sigh each time I have to pack the correct power adaptors for wherever I am going. The UK has clunky plugs with three rectangular pins, most of Europe uses two round pins, the US two flat parallel blades, Australia angled flat blades, and South Africa, Japan, India and Israel are different again. For details see www.interpower.com/ic/guide.htm. Wouldn’t it be great if they were all the same! Some chance. Of course national pride and sheer obstinacy can get in the way too, but no country is keen to force its entire population to throw away all its power plugs – though this did happen in the UK in my lifetime.

 

Likewise, the telephone system that we still use (Skype / VoIP excepted) dates back to the earliest days of the telephone. The phones are updated, but the signals are not. A completely new scheme (ISDN) was created, specified and had semiconductor chips designed for it, but it never really took off – at least not in the UK. We are so used to the crappy voice quality of phones that we don’t expect anything better – but it could be better, were it not for backward compatibility.

 

It is tempting to think that this is a problem of human engineering; if only we were smarter we could avoid these issues. Actually it is simply a fundamental problem of any complex system that changes and develops. The most complex system that we know of is the human brain. Even now it is hard to say exactly how complex it is. We can count the neurons and synapses, but the complexity of the connection architecture is tricky to measure. And there are some people (Roger Penrose for one) who suspect that the computational ability of the brain may rely on some quirky quantum behaviour; though my guess is that the connections are complex enough on their own. So does the brain exhibit backward compatibility? Sure it does. The brain consists of a number of layers, from the cortex down to the hippocampus and thalamus, each deeper one dating back to an earlier time in evolutionary history. It simply was not possible to throw away the previous ‘design’ and start again. For one thing, the creatures those brains were living in needed to stay alive and reproduce while the changes were taking place. This was a pretty good reason for keeping the old, tried-and-tested low-level brain bits before adding any fancy new ones.

 

Sometimes, though, there is revolution rather than evolution: CDs replaced vinyl and cassettes, and DVDs replaced VHS. The technical improvements were so great that the cost of replacement seemed justified – to most of us at least. Even here we have backward compatibility, since DVD players can play CDs (and MP3s, JPEGs and so on). Time will tell whether HD DVD or Blu-ray succeeds. My personal bet is on HD DVD, purely because it sounds more like DVD and so may confuse the customer less.

 

Where does this leave us with the world of the PC and Wintel? I have often wondered where a revolution in personal computing could come from. It is possible that a much smarter, friendlier computer could be created, using massively parallel hardware and perhaps ideas from neural networks. But I cannot see that coming from Microsoft and Intel (well, maybe Intel). Backward compatibility will exert too strong a hold to allow that degree of freedom from their evolutionary path. If we ever see such an animal (and I hope we do), it seems more likely that it will come from a new source – names like Ray Kurzweil and Steve Grand come to mind. Here’s hoping.

 

 

 

 

 
