One of the most interesting problems to be found in Computer Aided Design (CAD) is the Routing problem; its most common form arises in Printed Circuit Boards (PCBs). The problem consists of finding routes between the pins of electrical components that need to be connected to form the circuit. Routes cannot cross each other, since they would then short circuit the connection; the problem is made easier by utilising multi-layer boards, with plated through holes connecting between layers. In this way, routes can cross each other provided they are on different layers. The problem is still difficult because of the large number of connections (possibly well over 1000) and the high packing density of the components. Many techniques have evolved, with modern routers using 'Rip up and Retry' methods, but the one I want to describe briefly is an old method known as 'Lee's Algorithm'.
Lee was a mathematician who was the first person to successfully attack the routing problem using a formal approach. Lee used a regular cellular grid to map the PCB, each cell containing information as to whether it was occupied or empty. Using his algorithm, a 'flood' is built from rings of adjacent cells until the other end, or 'target', is reached. The flood is somewhat similar to a wavefront that moves across the grid, skirting round obstacles. When the target is reached, a 'Backtrack' process is performed, finding the best route available between the start and end, using the information in the flood. A variation is to flood from both ends simultaneously, stopping when the two wavefronts touch each other, and Lee's original two-layer work was extended to cover multiple layers.
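The flood and backtrack phases can be sketched as a breadth-first search over the cell grid. Here is a minimal single-layer sketch in Python; the grid representation, 4-way connectivity and the function name `lee_route` are my own illustrative choices, not Lee's original formulation:

```python
from collections import deque

def lee_route(grid, start, target):
    """Lee's algorithm: flood the grid with a BFS 'wavefront' from start,
    labelling each free cell with its ring number, then backtrack from the
    target to recover a shortest route.  grid[r][c] == 1 marks an occupied cell."""
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    dist[start[0]][start[1]] = 0
    wave = deque([start])
    # Flood phase: expand rings of adjacent cells until the target is reached.
    while wave:
        r, c = wave.popleft()
        if (r, c) == target:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and dist[nr][nc] is None):
                dist[nr][nc] = dist[r][c] + 1
                wave.append((nr, nc))
    if dist[target[0]][target[1]] is None:
        return None  # the flood never reached the target: no route exists
    # Backtrack phase: walk from target to start, always stepping to a
    # neighbour whose flood label is one less than the current cell's.
    path = [target]
    r, c = target
    while (r, c) != start:
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and dist[nr][nc] == dist[r][c] - 1):
                r, c = nr, nc
                path.append((r, c))
                break
    path.reverse()
    return path
```

In a real router the flood array would then be cleared ready for the next connection, as described below.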
The point of interest I want to explore is that the flood is removed after the backtracking, and the ground is cleared ready for the flood for the next route. This process of flooding, backtracking and clearing is one that occurs in other areas of human endeavour.
Similar techniques have been used in Speech Recognition, where a matrix is built up to represent a non-linear mapping of a speech fragment, broken into different frequency bands. To find the best match between the fragment and a set of stored patterns, a 'contour' is built over the matrix, and again a backtrack is used to find the optimum 'least cost' path.
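This contour-and-backtrack matching is essentially what is now called dynamic time warping. A minimal sketch, assuming one-dimensional feature sequences and an absolute-difference local cost (the name `dtw` and both arguments are illustrative):

```python
def dtw(fragment, pattern):
    """Dynamic time warping: build a cumulative-cost matrix over the two
    sequences (a non-linear time alignment), then backtrack from the far
    corner to recover the least-cost warping path."""
    n, m = len(fragment), len(pattern)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(fragment[i - 1] - pattern[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch the fragment
                                 cost[i][j - 1],      # stretch the pattern
                                 cost[i - 1][j - 1])  # advance both together
    # Backtrack: from (n, m) step to whichever predecessor was cheapest.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        moves = [(cost[i - 1][j - 1], i - 1, j - 1),
                 (cost[i - 1][j], i - 1, j),
                 (cost[i][j - 1], i, j - 1)]
        _, i, j = min(moves)  # tuples compare on cost first
    path.reverse()
    return cost[n][m], path
```

The best-matching stored pattern is simply the one with the smallest returned total cost.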
This backtracking will come as no surprise to computer scientists, since it is a generic method used in a number of problem areas; but we can take the analogy further.
Explorers and scouts use a similar method when trying to find the best route through surrounding terrain, first investigating the peaks and valleys by viewing from the high points, and then mapping out a path from a selected point back to a start point.
The similarity between human geographical exploration and mathematical or numerical exploration of a problem space has often been used to suggest possible approaches in path finding, or the related problem of optimisation (seeking a high point in the terrain).
In almost any form of human endeavour, where some known or unknown goal is being searched for, there is evidence of the same kind of behaviour.
For example, when mathematicians or physicists are researching a new problem area, there is often much exploratory work that goes on before a path becomes clear. Sometimes this work is recorded in the form of tentative theories or experiments, but often it remains private to the original worker, either in rough notes, or even in his head.
It has been noted by research workers themselves that the final published work often gives a misleading picture of how a new result was arrived at. The often confusing, meandering path that originally took the worker to the desired goal is cleaned up and re-presented for the sake of clarity and brevity. To be sure, there are good reasons for taking this approach, since the final published method may be stronger in terms of its relevance and importance to other ongoing research, but the other side of the coin is that a historical understanding of the thinking that went into the problem is lost in the process.
When James Clerk Maxwell produced his famous work on electromagnetism, the theory was couched in terms of the now well-known electromagnetic field. The field pervades space, has strength and direction at all points in space, and provides the mechanism for electric and magnetic forces. Not being equipped with modern vector and tensor methods, Maxwell had to cover considerable paper real estate with equations. Nowadays, the derivation of the electromagnetic wave equation from the field equations takes an undergraduate student all of half a sheet of A4, by using the powerful and concise vector operators grad, div and curl. A lesser known fact is that, on his way to the field theory, Maxwell tried to solve the problem using pseudo-mechanical 'vortices' that carried the forces across empty space. This approach was quite understandable, bearing in mind the Newtonian mechanical world view that prevailed at the time - the idea of empty space fields being able to communicate force was difficult to swallow. It is interesting to observe that the later ideas on communicating forces (in the form of gauge theories) have adopted the exchange of mediating particles such as photons, gluons and gravitons.
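For the curious, that half-sheet derivation runs roughly as follows in modern vector notation, for free space (no charges or currents):

```latex
% Free-space Maxwell equations:
%   div E = 0,  div B = 0,
%   curl E = -dB/dt,  curl B = mu_0 eps_0 dE/dt
\begin{align*}
\nabla\times(\nabla\times\mathbf{E})
   &= -\frac{\partial}{\partial t}\,(\nabla\times\mathbf{B})
    = -\mu_0\varepsilon_0\,\frac{\partial^2\mathbf{E}}{\partial t^2} \\
\nabla\times(\nabla\times\mathbf{E})
   &= \nabla(\nabla\cdot\mathbf{E}) - \nabla^2\mathbf{E}
    = -\nabla^2\mathbf{E}
      && \text{(since } \nabla\cdot\mathbf{E}=0\text{)} \\
\Rightarrow\quad
\nabla^2\mathbf{E}
   &= \mu_0\varepsilon_0\,\frac{\partial^2\mathbf{E}}{\partial t^2}
      && \text{a wave travelling at } c = 1/\sqrt{\mu_0\varepsilon_0}
\end{align*}
```

Three lines with curl and div, against Maxwell's pages of component equations.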
I am not trying to say that Maxwell's early attempts were more correct, or that he foresaw gauge theories, but that many workers in the field would have been unaware of Maxwell's early thinking, and hence missed ideas which may have given them food for thought.
The 'explore and backtrack' method doesn't just show up in the difference between preliminary and published work; it often forms the whole basis of the research thinking. It is impossible to generalise about methods of thinking; the famous French mathematician Poincaré noted several different ways in which he arrived at his results, varying from fairly predictable methodical attacks to flashes of inspiration 'out of the blue'. What is clear, however, is that there is often a large component of what the English mathematician Littlewood referred to as 'vague thinking' - an attempt to describe the general, largely unconstrained phase of thinking about a problem while trying to keep all aspects in mind.
The explore and backtrack phases are often repeated in a cycle, each time first relaxing the constraints, and then tightening them. It has been observed that the process of annealing metals by heating and cooling them alternately has a more than passing resemblance to these cycles, and in fact there have been methods of optimisation which introduce random perturbations (similar to the 'noise' of heated molecules) into the search methods, so that they should not become stuck in local minima.
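That annealing analogy is now widely known as simulated annealing. A minimal sketch, assuming the caller supplies a cost function and a neighbour-generating function (all names here are illustrative):

```python
import math
import random

def anneal(cost, neighbour, state, t_start=10.0, t_end=0.01, cooling=0.95):
    """Simulated annealing sketch: random perturbations (the 'noise' of
    heated molecules) let the search escape local minima; as the
    temperature falls, the constraints are gradually tightened."""
    t = t_start
    current, current_cost = state, cost(state)
    best, best_cost = current, current_cost
    while t > t_end:
        candidate = neighbour(current)
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        # Accept improvements always; accept worse moves with a
        # probability that shrinks as the temperature drops.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling  # cool the system between steps
    return best, best_cost
```

At high temperature this behaves like the loose, exploratory phase; as the temperature falls it settles into the tighter, refining phase.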
Physics and Mathematics are not the only activities in which this dual kind of development is utilised - the creative artist also sometimes works in this way in art, music and literature. The initial creative flow is more of a holistic, existential activity, the artist experiences his work in some way as a complete entity. Detailed construction and refinement are then employed, and have more of an analytical feel to them.
In the description of these two phases of activity there is something very reminiscent of the general characteristics of the two halves of the brain - the left half being the (normally dominant) analytical, coherent side, and the right half being the 'creative', visual side. There is also a sense in which the left half is more 'serial', since for example it handles serial speech communication, whereas the right half is more 'parallel', since it handles two- and three-dimensional visualisation. It does not seem too far fetched to suggest that the two halves of the brain are responsible for the two types of thinking I have outlined - or at least, that they make unequal contributions to the two types.
There could be grounds here for an argument as to why evolution has produced us with such a 'lopsided' brain; perhaps successful planning requires these two halves for fundamental reasons. My guess is that there has to be some such dual approach in general, not just that it happens to be a reasonably useful method. Perhaps I am influenced by the Eastern ideas on Yin-Yang duality, where it is the nature of things that opposite poles could not exist without their other half, depending on each other for their mutual existence.
If we take a step back and look at developments in a historical perspective, we can see a similar mechanism at work in the long term. Historical interpretation must always introduce some rethinking by the interpreter, and there is a type of historical view referred to as Whiggish which is said to colour the original events so strongly by later paradigms that they cannot be regarded as giving a true picture. Modern historians have risen above such dated approaches, and now take an objective rather than judgemental view - except, that is, in their view of the Whig historians.
Re-interpretation is not confined to historians, Richard Feynman refers to the 'physicists history of physics', which tells how various theories have developed. These mythologies usually involve a large amount of rearranging of historical facts which get in the way of painting a nice clear picture. Plus ca change...