History of Computers
The abacus, a simple counting aid, may have been invented in Babylonia (now Iraq) in the
fourth century B.C.
This device allows users to make computations using a system of sliding beads arranged on a
rack.
In 1642, the French mathematician and philosopher Blaise Pascal invented a calculating device
that would come to be called the "Adding Machine".
One of the earliest mechanical devices used for calculating was the Pascaline.
Originally called a "numerical wheel calculator" or the "Pascaline", Pascal's invention used
a train of eight movable dials, or cogs, to add sums of up to eight figures. As one dial turned
through 10 notches - a complete revolution - it mechanically advanced the next dial by one notch.
Pascal's mechanical Adding Machine automated the process of calculation. Although slow by
modern standards, this machine did provide a fair degree of accuracy and speed.
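The carry behavior described above can be mimicked in a few lines of Python. The sketch below is a simplified, odometer-style model rather than a reconstruction of Pascal's actual gearing: each of eight dials holds a digit from 0 to 9, and a completed revolution advances the next dial.

    # Simplified model of a Pascaline-style dial train: eight dials, each
    # holding a digit 0-9; a completed revolution carries into the next dial.
    def add_to_dials(dials, amount):
        """Add a non-negative integer to the dial train (least significant dial first)."""
        carry = amount
        for i in range(len(dials)):
            total = dials[i] + carry
            dials[i] = total % 10      # notch the dial stops on
            carry = total // 10        # full revolutions passed to the next dial
        return dials                   # any carry left after the last dial is lost (overflow)

    dials = [0] * 8                    # eight dials, all set to zero
    add_to_dials(dials, 27)
    add_to_dials(dials, 9986)
    print("".join(str(d) for d in reversed(dials)))   # prints 00010013, i.e. 27 + 9986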
In 1822, Charles Babbage persuaded the British government to finance his design for a machine
that would calculate logarithm tables, called the "Difference Engine."
The device was to calculate numbers to the 20th decimal place and print them at 4 digits per minute.
With Babbage's design of the "Analytical Engine" (1833), the computer took the form
of a general-purpose machine.
Analytical Engine
• 1833
U.S. Census
Tabulating Machine
• Used electricity rather than mechanical gears
• The location of each hole represented a specific piece of information (male vs. female)
• Cards were inserted into the machine, and metal pins were used to open and close electrical circuits (see the sketch below)
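A short Python sketch can illustrate the encoding idea. The card layout and category names below are invented for illustration; the real census cards carried many more columns:

    # Hypothetical punched-card encoding: each column position stands for one
    # category, and a hole (True) in that position records the answer.
    CATEGORIES = ["male", "female"]              # illustrative categories only

    def punch_card(category):
        """Return a card as a list of hole/no-hole values, one per category."""
        return [category == c for c in CATEGORIES]

    def tabulate(cards):
        """Count holes per position, much as pins closing circuits advanced counters."""
        counts = {c: 0 for c in CATEGORIES}
        for card in cards:
            for category, hole in zip(CATEGORIES, card):
                if hole:                         # a pin passes through the hole and closes the circuit
                    counts[category] += 1
        return counts

    cards = [punch_card("male"), punch_card("female"), punch_card("female")]
    print(tabulate(cards))                       # {'male': 1, 'female': 2}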
Howard Aiken thought he could create a modern, functioning model of Babbage's Analytical
Engine.
He succeeded in securing a grant of 1 million dollars from IBM for his proposed Automatic
Sequence Controlled Calculator, the Mark I for short.
In 1944, the Mark I was "switched" on. Aiken's colossal machine spanned 51 feet in length and
8 feet in height. 500 miles of wiring were required to connect its components.
The Mark I transformed Babbage's dream into reality and put IBM's name at the forefront of
the computer industry. From 1944 on, modern computers would forever be associated with
digital computation.
Mark I Calculator
• One early success was the Harvard Mark I computer which was built as a partnership between
Harvard and IBM in 1944. This was the first programmable digital computer made in the U.S.
But it was not a purely electronic computer. Instead, the Mark I was constructed out of
switches, relays, rotating shafts, and clutches. The machine weighed 5 tons, incorporated 500
miles of wire, was 8 feet tall and 51 feet long, and had a 50 ft rotating shaft running its length,
turned by a 5-horsepower electric motor. The Mark I ran non-stop for 15 years, sounding like
a roomful of ladies knitting. To appreciate the scale of this machine, note the four typewriters
dwarfed in the foreground of period photographs of the Mark I.
1939-1942
• First electronic computer, the Atanasoff-Berry Computer (ABC), built by John Atanasoff and Clifford Berry
ENIAC - 1946
A machine that computed at speeds 1,000 times faster than the Mark I, which had been completed
only 2 years earlier.
Using 18,000-19,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, this massive
instrument required the output of a small power station to operate.
It could do nuclear physics calculations in two hours that would have taken 100 engineers a
year to do by hand.
It weighed 30 tons and occupied 1,500 square feet (the average area of a 3-bedroom house).
Computer
• An electronic machine that accepts data, processes it according to instructions, and provides the
results as new data
Program
• A set of instructions that tells the computer what to do
1930s – 1940s
• He envisioned a computer that could perform many different tasks simply by changing a
program rather than by changing electronic components
TRANSISTOR (1947)
In the laboratories of Bell Telephone, John Bardeen, Walter Brattain and William Shockley
invented the "transfer resistor", later shortened to the transistor.
Advantages:
• Increased reliability
This tiny device had a huge impact on, and extensive implications for, modern computers. In
1956, the transistor won its creators the Nobel Prize in Physics.
ALTAIR (1975)
The invention of the transistor made computers smaller, cheaper and more reliable.
Therefore, the stage was set for the entrance of the computer into the domestic realm. In
1975, the age of personal computers commenced.
Under the leadership of Ed Roberts, Micro Instrumentation and Telemetry Systems (MITS)
set out to design a computer 'kit' for the home hobbyist.
• In 1970, Intel's Ted Hoff invented the microprocessor, an entire CPU on a single chip.
This allowed for the building of a microcomputer or personal computer.
ALTAIR (1975)
Based on the Intel 8080 processor and capable of addressing 64 kilobytes of memory, the MITS
Altair - as the machine came to be called - debuted on the cover of the January 1975 issue of
Popular Electronics magazine.
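The 64-kilobyte figure follows from the 8080's 16-bit address bus, as a quick check in Python shows:

    # A 16-bit address can select 2**16 distinct one-byte memory locations.
    addressable_bytes = 2 ** 16
    print(addressable_bytes, addressable_bytes // 1024)   # 65536 bytes -> 64 KB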
Presenting the Altair as an unassembled kit kept costs to a minimum. Therefore, the company
was able to offer this model for only $395. Supply could not keep up with demand.
ALTAIR FACTS:
No Keyboard
No Video Display
No Storage Device
IBM PC (1981)
Using the 16-bit Intel 8088 microprocessor, the IBM PC allowed for increased speed and much
larger amounts of memory.
Unlike the Altair, which was sold as an unassembled computer kit, IBM sold its "ready-made"
machine through retailers and qualified salespeople.
To satisfy consumer appetites and to increase usability, IBM gave prototype IBM PCs to a
number of major software companies.
For the first time, small companies and individuals who would never have imagined owning a
"personal" computer were introduced to the computer world.
MACINTOSH (1984)
IBM's major competitor was a company led by Steve Wozniak and Steve Jobs: Apple
Computer Inc.
Apple's Lisa differed from its predecessors in its use of a "mouse" - then a quite foreign
computer instrument - in lieu of manually typing commands.
However, the outrageous price of the Lisa kept it out of reach for many computer buyers.
Apple's next brainchild was the Macintosh. Like the Lisa, the Macintosh made use of a
graphical user interface.
The GUI (Graphical User Interface) made the system easy to use.
The Apple Macintosh debuts in 1984. It features a simple, graphical interface, uses the 8-MHz,
32-bit Motorola 68000 CPU, and has a built-in 9-inch B/W screen.
Cost: $2,495