Who Invented the Computer?
The machine which can draw graphics, set up your modem, decipher your PGP, do typography, refresh your screen, monitor your keyboard, and keep all of these going in synchrony... and which does all of this through a single principle: reading programs placed in its storage.
But the meaning of the word has changed over time. In the 1930s and 1940s "a computer" meant a person doing calculations; to indicate a machine doing calculations you would say "automatic computer". In the 1960s people still spoke of the digital computer as opposed to the analog computer.
As in my book, I'm going to use the word "computer" to indicate only the type of machine which has swept everything else away in its path: the computer in front of you, the digital computer with "internally stored modifiable program."
That means I don't count the abacus or Pascal's adding machine as a computer, important as they may be in the history of thought and technology. I'd call them "calculators".
I wouldn't even call Charles Babbage's 1840s design for the Analytical Engine the design for a computer. It didn't incorporate the vital idea which is now exploited by the computer in the modern sense, the idea of storing programs in the same form as data and intermediate working. His machine was designed to store programs on cards, while the working was to be done by mechanical cogs and wheels. There were other differences -- he did not have electronics or even electricity, and he still thought in base-10 arithmetic.
But more fundamental is the rigid separation of instructions and data in Babbage's thought.
Charles Babbage, 1791-1871
A hundred years later, the analysis of logical operations, started by George Boole, was much more advanced. Electromagnetic relays could be used instead of gearwheels. But no one had advanced beyond Babbage's principle. Builders of large calculators might put the program on a roll of punched paper rather than cards, but the idea was the same: machinery to do arithmetic, and instructions coded in some other form, somewhere else, designed to make the machinery work.
To see how different this is from a computer, think of what happens when you want a new piece of software. You can ftp it from a remote source, and it is transmitted by the same means as email or any other form of data. You may apply an UnStuffIt or GZip program to it when it arrives, and this means operating on the program you have ordered. For filing, encoding, transmitting, copying, a program is no different from any other kind of data: it is just a sequence of electronic on-or-off states which lives on hard disk or in RAM along with everything else.
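To make this concrete, here is a minimal sketch in Python (the file names are hypothetical, chosen purely for illustration): a program sitting on disk can be read, measured, compressed and copied with exactly the same operations as any other file of data.

    import gzip
    import shutil

    # A program on disk is just a sequence of bytes, like any other data.
    # "some_program" is a hypothetical file name, for illustration only.
    with open("some_program", "rb") as f:
        program_bytes = f.read()          # read the executable as raw data

    print(len(program_bytes), "bytes")    # measure it, like any file

    # Compress it, exactly as you would compress a document:
    with gzip.open("some_program.gz", "wb") as out:
        out.write(program_bytes)

    # Copy it, exactly as you would copy any data file:
    shutil.copy("some_program", "some_program.copy")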
The people who built big electromechanical calculators in the 1930s and 1940s didn't think of anything like this.
Even when they turned to electronics, they still thought of programs as something quite different from numbers, and stored them in quite a different, inflexible way. So the Colossus, the electronic codebreaking machine started at Bletchley Park in 1943, was a massive electronic machine, but I would not call it a computer in the modern sense. Perhaps we could call it a near-computer.
But it was crucial in showing Alan Turing the speed and reliability of electronics. It was also ahead of American technology, which only had the comparable ENIAC calculator fully working in 1946, by which time its design was completely obsolete. (And the Colossus played a fantastic part in defeating Nazi Germany by reading Hitler's messages, whilst the ENIAC did nothing in the war effort.)
1996 saw the fiftieth anniversary of the ENIAC. The University of Pennsylvania and the Smithsonian made a great deal of it as the "birth of the Information Age". Vice-President Gore and other dignitaries were involved. Good for them.
At Bletchley Park Museum, the reconstruction of the Colossus came from Tony Sale's own devoted time and money, and the local council still won't put up road signs pointing to the hard-to-find Museum. (There is however a map on the Web.) The National Lottery has now rejected an application by the Bletchley Park Museum for money to secure its future.
Americans and Brits do things differently. Some things haven't changed in fifty years.
John von Neumann, 1903-1957

John von Neumann (originally the Hungarian Janos) was a major twentieth-century mathematician, with work in many fields unrelated to computers. Here we are concerned only with the contribution he made to computing after 1944.
The EDVAC Report (von Neumann's "First Draft of a Report on the EDVAC") became well known and well publicised, and is usually counted as the origin of the computer in the modern sense. It was dated June 1945, before Turing's report was written, so in the game of scientific priority, von Neumann came first.
However, what Alan Turing wrote in the autumn of 1945 was independent of the EDVAC proposal, and it was much further ahead.
That's because he based his ideas on what he had seen in 1936 --- the concept of the universal machine. In the abstract universal machine of 1936 the programs were written on the store in just the same way as the data and the working. This was no coincidence. Turing's discoveries in mathematical logic, using the Turing machine concept, depended on seeing the principle that programs operating on numbers could themselves be represented as numbers.
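A small sketch may make this vivid. The toy machine below (written in Python, with an instruction code invented purely for this illustration, not Turing's or von Neumann's actual design) keeps instructions and data as plain numbers in one and the same memory; only convention separates them, so a program can itself be read, copied or altered like any other stored numbers.

    # A toy stored-program machine: one flat memory holds both the
    # instructions and the data, all as plain integers.
    # The opcodes are invented for this sketch only.
    LOAD, ADD, STORE, HALT = 0, 1, 2, 3

    def run(memory):
        """Fetch-decode-execute loop over a single shared memory."""
        acc, pc = 0, 0                  # accumulator and program counter
        while memory[pc] != HALT:
            op, arg = memory[pc], memory[pc + 1]
            if op == LOAD:
                acc = memory[arg]
            elif op == ADD:
                acc += memory[arg]
            elif op == STORE:
                memory[arg] = acc
            pc += 2                     # each instruction occupies two cells
        return memory

    # Cells 0-7 happen to hold the program; cells 8-10 hold the data.
    # To the machine they are all just numbers in the same store.
    memory = [
        LOAD, 8,      # acc = memory[8]
        ADD, 9,       # acc += memory[9]
        STORE, 10,    # memory[10] = acc
        HALT, 0,
        2, 3, 0,      # the data: 2, 3, and a cell for the result
    ]
    print(run(memory)[10])   # prints 5

In this sketch, changing the program means nothing more than changing numbers in the store, which is just the freedom described above.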
But Turing's 1945 conception of the computer was not tied to numbers at all. It was for the logical manipulation of symbols of any kind. From the start he stressed that a universal machine could switch at a moment's notice from arithmetic to the algebra of group theory, to chess playing, or to data processing. He saw immediately the first ideas of program structure and languages. Von Neumann's design was in contrast concerned with doing massive amounts of arithmetic, and he was never interested in programming.
Von Neumann was in the business of calculating for the atomic bomb and for artillery tables. Alan Turing came fresh from codebreaking, work on symbols which wasn't necessarily to do with arithmetic. He had seen a vast establishment built up with special machines organised to do different tasks. Now, he saw, they could all be replaced by programs for a single type of machine.
There are many different views on which aspects of the modern computer are the most central or critical.
But I would say that in 1945 Alan Turing alone grasped everything that was to change computing completely after that date: the universality of his design, the emphasis on programming, the importance of non-numerical applications, the apparently open-ended scope for mechanising intelligence. He did not do so as an isolated dreamer, but as someone who knew about the practicability of large-scale electronics, with hands-on experience.
The idea of one machine for every kind of task was very foreign to the world of 1945. Even ten years later, in 1956, the big chief of the electromagnetic relay calculator at Harvard, Howard Aiken, could write:
If it should turn out that the basic logics of a machine designed for the numerical solution of differential equations coincide with the logics of a machine intended to make bills for a department store, I would regard this as the most amazing coincidence that I have ever encountered.
But that is exactly how it has turned out. It is amazing, although we now have come to take it for granted. But it's not a mere coincidence. It follows from the deep principle that Alan Turing saw in 1936.
IN THE FUTURE . . .
The stored program revolution of 1945 is not the end of the story in computing. Nanotechnology will blur the distinction between hardware and software. So will self-organising programs such as neural nets.
And the internally stored program does not in principle add to the logical power of a calculator: even Babbage's design, in principle, embodies the Universal Turing Machine concept.
But in 1945 this freedom to think of programs as data was in practice the liberating change on which fifty years of subsequent development has rested.
New developments will still rest upon the concept of computability defined by Turing in 1936.
(Even when we have quantum computers... but that's another story.)
History is on the side of the winner. In 1945 Alan Turing could have felt like a winner. He was taken on by the National Physical Laboratory at Teddington, in the London suburbs. His detailed plan for an electronic computer, with a visionary prospectus for its capacities, was accepted in March 1946. Everything seemed to be going for it.
Well, not quite everything. Turing's plan called for at least 6k bytes of storage, in modern terms, and this was considered far too ambitious.
And the Colossus electronic engineers, now returned to the Post Office, were not able to make a quick start on the plan as he had expected. Still, in late 1946 the NPL put out press releases...
Some of the feats that will be able to be performed by Britain's new electronic brain, which is being developed at the N.P.L., Teddington, were described to the SURREY COMET yesterday by Dr. A. M. Turing, 34-year-old mathematics expert, who is the pioneer of the scheme in this country.
From the local suburban newspaper, the Surrey Comet, 9 November 1946.
Furthermore, he did not promote his ideas as effectively as others did. If he had written papers on The Theory and Practice of Computing in 1947, instead of going on to new ideas, he would have done more for his reputation.
He didn't. He went running and he thought about what he saw as the next step: Artificial Intelligence.
After 1948 almost everyone forgot that he had drawn up the design for a computer in 1945-6, the most detailed design then in existence. The mathematician M. H. A. (Max) Newman, when writing his Biographical Memoir of Turing in 1955, passed over this period with the words,
...many circumstances combined to turn his interest to the new automatic computing machines. They were in principle realizations of [Turing's] 'universal machine'... though their designers did not yet know of Turing's work.

How could Newman have forgotten that Turing himself was such a designer, in fact the author of the very first detailed design, and that obviously he knew of his own work? Alan Turing's reputation has been subject to a strange kind of selective memory. I am just proud that now, the computer itself can help set matters right before your eyes.
After Turing resigned, a change in management meant that the ACE project could go ahead after all, and a working computer, based on his original design, was operating in 1950. The "Pilot ACE" is now in the Science Museum, London.
From The Times, 25 August 1947
Running in 1946
While such lists of Web links respect the democratic spirit of the Internet by including everyone's contribution, they don't distinguish between different types of material or assess the quality of the information offered, and some of it, in my opinion, is unworthy of the technology that makes it available.
This is not a rant against the Internet: books can be just as bad, and don't offer the same scope for putting things right.
So for instance I find it hard to recommend:
Last updated 23 February 1997.