ON THE HISTORY OF COMPUTABILITY

The theory of computability has many roots, but it began to take significant shape at the very beginning of the 20th century. Logicians and philosophers prior to that time had been busy trying to sort out a secure foundation for mathematics. This development underwent a series of embarrassing revisions to patch up "paradoxes" (or antinomies) which kept being discovered in the systems being proposed. A famous example is Russell's paradox, which indicated that one had to be very careful about how one defined sets. Nonetheless, mathematicians and logicians built several sophisticated logical systems which codified reasoning. By the turn of the 20th century these foundational developments had matured into set theory, and the use of logical systems had become the norm.

At the beginning of the 20th century German mathematics dominated the intellectual world. At the International Congress of Mathematicians in 1900 the German mathematician David Hilbert presented an address which not only posed a series of open mathematical problems that have stood the test of time but also expounded his philosophy of mathematics. He believed that mathematics could -- and indeed should -- be supported by a detailed and precise notion of proof. Furthermore, he proposed that every problem in mathematics is, by its very nature, solvable.

These remarks spurred a number of researchers to further develop the precise methodology of mathematical proof: the subject is now called proof theory, and it is a rich and fascinating area with much yet to be understood. What no one at that time even imagined was that the philosophical position put forward by Hilbert would prove to be completely wrong -- and, furthermore, provably wrong using the very methods Hilbert was advocating! However, it took some thirty years before this was realized, by the Austrian logician Kurt Gödel, born in Brno (then part of Austria-Hungary). Gödel's incompleteness theorem showed that things were not simply either true or false: there must necessarily also be statements which are undecidable, being neither provable nor refutable within the system.

Although it was understood that the flaw in Hilbert's philosophy was directly linked to the nature of computation, it took some time for this aspect to be made explicit. The reason was simply that, while the notion of proof had by now received considerable attention, the notion of computation still remained rather murky and mysterious. However, technology was fast catching up with mathematics: practical computing devices were already being produced and were exciting the minds of a new generation of thinkers. The First World War had undermined German mathematical, scientific, and engineering supremacy: the rest of the western world was busy digesting German advances and moving forward on its own. This was particularly so in Europe (outside Germany), in Scandinavia, in England, and in the rapidly industrializing America.

Alan Turing was one of the most brilliant products of this intellectual soup. Even as a schoolboy, Turing was reading German mathematics. By the time he was an undergraduate at Cambridge his mind was buzzing with the most current ideas of his time. At Cambridge he invented an incredibly simple mathematical device, which we now know as the Turing machine, to exemplify the notion of computation. He demonstrated the sense in which it could perform all computations and showed that its halting problem -- deciding whether a given machine ever stops on a given input -- is undecidable.
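The device Turing invented can be conveyed in a few lines of modern code. The sketch below is a minimal, illustrative simulator (not anything from the original text); the particular machine and its transition table are hypothetical, chosen only to show the moving parts: a state, a tape, a head, and a table of rules.

```python
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.

    transitions: dict mapping (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left) or +1 (right). The machine halts when no rule
    applies to the current (state, symbol) pair.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break  # halt: no applicable rule
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    else:
        # We gave up after max_steps -- in general there is no procedure
        # that can decide in advance whether a machine will ever halt.
        raise RuntimeError("no halt within max_steps")
    return state, "".join(cells[i] for i in sorted(cells))

# Hypothetical example machine: overwrite every 0 with 1, moving right,
# and halt on reaching a blank cell.
rules = {("start", "0"): ("start", "1", +1)}
state, result = run_turing_machine(rules, "0000")
# result == "1111"
```

The `max_steps` cutoff is, of course, only a practical guard: Turing's result is precisely that no such bound can be computed in general.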

Meanwhile, Alonzo Church, very much influenced by the German model of formalizing mathematics, had invented the lambda calculus as a foundational language. Unfortunately, it turned out to suffer from a number of paradoxes. Church had, in a very pragmatic move, simplified the system down to something which, even if it was a good deal less expressive than he had originally intended, was provably consistent. He and his students (Kleene and Rosser) subsequently realized that this system allowed the expression of all computable functions. In fact, prior to Turing's remarkable feat, they had already shown that this embodiment of computable functions gave rise to undecidable problems.
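A flavor of how the lambda calculus expresses computation can be given using Python's own lambdas. The encodings below are the standard Church numerals (an illustration of the general technique, not a construction claimed by the text): a number n is represented by the function that applies its argument n times.

```python
# Church numerals: n is "apply f, n times, to x".
zero = lambda f: lambda x: x                       # f applied zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))    # one more application of f
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # m applications after n

def to_int(church):
    """Decode a Church numeral by counting applications of f."""
    return church(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
# to_int(add(two)(two)) == 4
```

Everything here is built from function abstraction and application alone, which is the sense in which Church's stripped-down calculus could still express all computable functions.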

Not surprisingly, Alan Turing was invited to Princeton to work with Alonzo Church. Together they showed that all the notions of computation being considered at that time were equivalent. After Princeton, Turing returned to Cambridge, where his academic career was overtaken by the Second World War. His contribution to the allied war effort at Bletchley Park, breaking German codes, was a closely held secret but pivotal in the war which consumed every corner of Europe -- and ultimately most of the developed world.

It may surprise you how mathematical those roots are ... in a sense Computer Science, as it is so often taught in North America, is a travesty of where the subject started and of the deep philosophical and technical problems with which founding computer scientists, like Church, Post, Turing, Curry, and even that somewhat ambitious and controversial figure von Neumann, were preoccupied. Much of modern computer science, as it is presented, is more reminiscent of the comfortable commercial image promoted by IBM: computers are number-crunching, vote-counting, mass-storage, word-processing, clever communication machines. Graphics, web pages, GUIs, and networks are where it is all at!

We tend to forget the great philosophical debates which, at the turn of the last century, rocked the foundations of mathematics, leading to a new view of the subject and to the search for better foundations. Furthermore, amidst all the things that computing technology brings, we tend to forget that this search is by no means over! These foundational issues, which now seem so remote and abstract, continue to have a direct impact today. Our view of computation, and of how it should be organized, is directly shaped by our understanding of and approach to these foundational issues.