Lecture Notes CPE 112 WEEK 2
Learning Outcomes
1. The evolution of computing devices, from early counting aids to modern machines.
2. Major milestones in the history of computers.
3. Contributions of pioneers like Charles Babbage, Alan Turing, and Grace Hopper.
Introduction
The evolution of computers spans several millennia—from primitive counting aids to the
sophisticated, miniaturized devices that power the Information Age. These notes examine
how humanity's need for efficient calculation and information processing led to
inventions that have revolutionized science, commerce, and everyday life. We explore the
evolution of computing devices, review major milestones in computer history, and discuss
the seminal contributions of early pioneers such as Charles Babbage, Alan Turing, and Grace
Hopper.
Evolution of Computing Devices
Abacus Origins: The abacus is one of the earliest known computational aids, dating back to
ancient Babylonia (circa 2700–2300 BC) and later refined in China (suanpan). These devices
used beads on rods to represent numbers and perform arithmetic operations, establishing the
basis for systematic calculation.
Napier’s Bones (Early 17th Century): Invented by John Napier to simplify multiplication and
division by using rods inscribed with numbers, Napier’s Bones were an important stepping
stone that reduced the complexity of arithmetic operations.
Pascal’s Calculator (1642–1644): Blaise Pascal, a child prodigy, built the Pascaline—a gear-
driven calculator capable of addition and subtraction—to assist his father with tax collection.
Approximately 50 machines were produced, demonstrating the practical benefits of
mechanical calculation.
Leibniz’s Stepped Reckoner (1672–1673): Gottfried Wilhelm Leibniz advanced the concept
by inventing the stepped reckoner, which could perform multiplication and division through
repeated addition and shifting. Although his design was not perfected, it introduced ideas
(such as carrying) that are fundamental to modern computing.
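The core idea, multiplication reduced to repeated addition plus a decimal-place shift, can be
sketched in a few lines of Python. This is only a modern illustration of the principle, not a
model of the reckoner's actual mechanism:

def stepped_multiply(a, b):
    # Multiply two non-negative integers using only addition and a
    # decimal-place "shift", the principle Leibniz's machine mechanized.
    product = 0
    while b > 0:
        digit = b % 10              # current digit of the multiplier
        for _ in range(digit):      # repeated addition for that digit
            product += a
        a *= 10                     # shift the multiplicand one decimal place
        b //= 10                    # move on to the next digit
    return product

print(stepped_multiply(345, 27))    # 9315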
Jacquard Loom (1801): Joseph-Marie Jacquard revolutionized the textile industry with a
loom controlled by punched cards. These cards stored patterns in a binary (hole/no-hole)
format that directed the machine to weave intricate designs automatically. This method of
storing instructions was a precursor to computer programming.
The Difference Engine (1822): Charles Babbage, often called the "father of the computer,"
conceptualized the Difference Engine to compute mathematical tables (such as logarithms
for navigation). Although never completed in full, its design showcased how automated
mechanical calculation could reduce errors in table-making.
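The table-making technique the engine mechanized is the method of finite differences: once a
polynomial's leading differences are seeded, every further table entry follows by addition
alone, which is exactly the kind of operation gears can repeat reliably. A small Python sketch
of the idea (a modern illustration, not a description of Babbage's hardware):

def difference_table(f, degree, n):
    # Tabulate a degree-`degree` polynomial f at 0, 1, ..., n-1 using only
    # additions once the initial differences have been computed.
    col = [f(x) for x in range(degree + 1)]
    diffs = []
    while col:                                   # f(0) and its finite differences
        diffs.append(col[0])
        col = [b - a for a, b in zip(col, col[1:])]
    values = []
    for _ in range(n):
        values.append(diffs[0])
        for i in range(degree):                  # pure addition, no multiplication
            diffs[i] += diffs[i + 1]
    return values

print(difference_table(lambda x: x * x + x + 41, degree=2, n=6))
# [41, 43, 47, 53, 61, 71]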
The Analytical Engine (1833): Babbage’s Analytical Engine was a more ambitious project
that featured many concepts found in modern computers: a “mill” (akin to a central
processing unit), a “store” (memory), and a system of punched cards for program instructions
and data input. Despite technological limitations that prevented its full construction, the
engine’s design was revolutionary, anticipating conditional branching and looping.
Punched-Card Systems and Data Processing: In the late 19th century, Herman Hollerith’s
development of punched-card data processing for the 1890 U.S. Census marked the transition
from manual to mechanized data handling. His work laid the foundation for the later
formation of IBM.
The Atanasoff–Berry Computer (ABC, 1942): Developed by John Vincent Atanasoff and
Clifford Berry, the ABC was the first electronic digital computing device. It introduced binary
arithmetic and demonstrated that electronic components (vacuum tubes) could perform
calculations far more rapidly than mechanical parts.
Colossus (1943–1944): Designed by Tommy Flowers and his team at the Post Office Research
Station for the codebreakers at Bletchley Park, Colossus was the first electronic digital
programmable computer. Used primarily for
codebreaking during World War II, it employed thousands of vacuum tubes to rapidly process
intercepted enemy messages. Although not Turing-complete, its design was a critical step in
the evolution of computing.
ENIAC (1943–1945): The Electronic Numerical Integrator and Computer, built by John
Mauchly and J. Presper Eckert at the University of Pennsylvania, was the first general-
purpose electronic digital computer. Weighing 30 tons and using around 18,000 vacuum
tubes, ENIAC was capable of performing 5,000 additions per second and was
reprogrammable via patch cables and switches. It also highlighted the crucial role of
programmers—predominantly women—in configuring the machine for different tasks.
Transition to Transistors (Late 1940s–1950s): The invention of the transistor in 1947 by John
Bardeen, Walter Brattain, and William Shockley at Bell Labs marked the beginning of the
second generation of computers. Transistors were smaller, more reliable, and consumed less
power than vacuum tubes. This transition enabled computers to become smaller and more
energy-efficient.
Integrated Circuits and the Microprocessor (1960s–1970s): Further miniaturization came with
the development of integrated circuits (ICs) in the late 1950s by Jack Kilby and Robert
Noyce. The invention of the MOSFET (metal-oxide-semiconductor field-effect transistor) in
1959 and the subsequent advances in chip fabrication paved the way for the microprocessor.
In 1971, Intel released the 4004—the first commercially available microprocessor—which
integrated the entire CPU onto a single chip. This breakthrough was critical to the personal
computer revolution.
Early Personal Computers (Mid-1970s): The mid-1970s saw the introduction of several
personal computers such as the Altair 8800, Commodore PET, and Apple I. These machines
brought computing out of large institutions and into homes and small businesses. The
simplicity of programming and relatively low cost made these devices accessible to hobbyists
and entrepreneurs alike.
Apple II and IBM PC (Late 1970s – Early 1980s): The success of the Apple II, with its color
graphics and open architecture, and the IBM PC, which set standards for business computing,
cemented the role of the personal computer in everyday life. By the 1980s, computers became
integral to education, commerce, and entertainment.
Smartphones and the Cloud (2000s–Present): Today, mobile devices—powered by
sophisticated System on a Chip (SoC) technology—serve as pocket-sized computers. These
devices support high-speed internet, multimedia applications, and even complex
computations, while cloud services offload storage and heavy processing to remote data
centers, demonstrating how far computing has come since the era of room-sized machines.
Major Milestones in Computer History
Charles Babbage's Difference and Analytical Engines: Babbage's designs in the 1820s and
1830s introduced concepts such as automated calculation, memory storage, and
programmability using punched cards. Although never fully built, these machines set the
conceptual stage for all later developments in computing.
Jacquard Loom (1801): The punched card system of the Jacquard loom demonstrated how
instructions could be stored and automatically executed—an idea that would later become
fundamental in computer programming.
ENIAC (1945): As the world’s first general-purpose electronic digital computer, ENIAC was
a monumental achievement. Its ability to perform thousands of operations per second
transformed scientific computation and demonstrated the power of electronic processing. Its
reprogrammability (albeit through complex rewiring) was a precursor to modern software-
driven machines.
The Stored-Program Concept and Manchester Baby (1948): The shift to storing programs in
memory revolutionized computing. The Manchester Baby proved that a computer could use
its memory to store instructions and data, leading directly to the development of
commercially viable machines such as the Ferranti Mark 1.
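The essence of the stored-program idea is that instructions and data live in the same memory,
and the machine simply fetches, decodes, and executes whatever the program counter points at.
The toy Python machine below illustrates only that idea; its instruction set is invented for
the example and is not the Manchester Baby's:

# Instructions and data share one memory: positions 0-5 hold instructions,
# positions 6-8 hold data.
memory = [
    ("LOAD", 6),     # 0: acc = memory[6]
    ("ADD", 7),      # 1: acc = acc + memory[7]
    ("STORE", 8),    # 2: memory[8] = acc
    ("PRINT", 8),    # 3: print memory[8]
    ("HALT", 0),     # 4: stop
    ("HALT", 0),     # 5: unused
    20,              # 6: data
    22,              # 7: data
    0,               # 8: result
]

acc = 0              # accumulator register
pc = 0               # program counter
while True:
    op, addr = memory[pc]        # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "PRINT":
        print(memory[addr])      # prints 42
    elif op == "HALT":
        break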
Transistors and Integrated Circuits (Late 1940s–1960s): The move from vacuum tubes to
transistors marked a drastic improvement in speed, size, reliability, and energy consumption.
The subsequent development of integrated circuits enabled the miniaturization of electronic
components, laying the foundation for the microprocessor and the personal computer
revolution.
Introduction of the Microprocessor (1971): The Intel 4004 microprocessor encapsulated the
entire CPU on a single chip, enabling the production of affordable, compact personal
computers.
Rise of Home Computers (Late 1970s–1980s): Personal computers such as the Apple II,
Commodore PET, and IBM PC brought computing to the masses, transforming education,
business, and everyday life.
Mobile Computing and the Internet (1990s–Present): The evolution from desktops to laptops,
and eventually to smartphones, combined with the explosion of the Internet and cloud
computing, has made powerful computation accessible virtually anywhere.
Contributions of Early Pioneers
Charles Babbage
Innovative Designs: Babbage's concepts of the Difference Engine and Analytical Engine
introduced critical ideas like automated calculation, data storage, and programmability. These
designs laid the groundwork for both hardware and software in modern computers.
Legacy: Despite technological limitations that prevented full realization during his lifetime,
Babbage’s work inspired generations of computer scientists and engineers. His analytical
engine, in particular, anticipated the structure of modern computers.
Ada Lovelace
Groundbreaking Insight: Working with Babbage on the Analytical Engine, Ada Lovelace
wrote what is considered the first computer program—an algorithm to compute Bernoulli
numbers. Her notes predicted that computers could manipulate symbols and not just numbers.
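For concreteness, Bernoulli numbers can be generated from the standard recurrence in which
the binomial-weighted sum of B_0 through B_m is zero; the Python sketch below uses that
recurrence with exact fractions. It is a modern illustration, not a transcription of the
program in Lovelace's Note G:

from fractions import Fraction
from math import comb

def bernoulli(n):
    # B_0 .. B_n via the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0,
    # i.e. B_m = -(1/(m+1)) * sum_{j<m} C(m+1, j) * B_j  (so B_1 = -1/2).
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']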
Vision Beyond Calculation: Lovelace foresaw that computing machines could create music
and art, establishing the conceptual foundation of computer programming. Her work is
celebrated annually on Ada Lovelace Day as a symbol of women’s contributions to STEM.
Alan Turing
The Turing Machine: In 1936, Turing's conceptualization of the universal Turing machine
provided the first formal definition of algorithmic computation. His work established that any
computable problem could be solved by a machine following a set of instructions.
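A Turing machine is just a finite table of rules read against a tape. The minimal Python
simulator below runs such a table; the example machine, which flips every bit of a binary
string, is invented purely for illustration and is far simpler than anything Turing analyzed:

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    # rules maps (state, symbol) -> (new_state, symbol_to_write, head_move)
    # with head_move = -1 for left, +1 for right.
    cells = dict(enumerate(tape))         # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:  # no applicable rule: halt
            break
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

# A one-state machine that flips 0 <-> 1 while moving right, halting at the blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
print(run_turing_machine(flip, "10110"))   # 01001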
Codebreaking and Beyond: During World War II, Turing’s work at Bletchley Park on the
Enigma code was pivotal in the Allied victory. His subsequent research in artificial
intelligence and theoretical biology has had a lasting influence on modern computer science.
Grace Hopper
Compiler and High-Level Languages: Grace Hopper's development of the first compiler,
which translated English-like instructions into machine code, revolutionized programming.
Her work led to the creation of COBOL, a language that made programming more accessible
to a broader audience.
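To see what "translating English-like instructions" means in miniature, here is a toy
translator that turns two statement forms into lower-level instructions. It is purely
illustrative and does not reflect the actual A-0 compiler, FLOW-MATIC, or COBOL:

def translate(line):
    # Translate a toy English-like statement into toy low-level instructions.
    words = line.upper().split()
    if len(words) == 4 and words[0] == "ADD" and words[2] == "TO":
        a, b = words[1], words[3]              # e.g. "ADD PRICE TO TOTAL"
        return [f"LOAD {b}", f"ADD {a}", f"STORE {b}"]
    if len(words) == 4 and words[0] == "MOVE" and words[2] == "TO":
        a, b = words[1], words[3]              # e.g. "MOVE PRICE TO TOTAL"
        return [f"LOAD {a}", f"STORE {b}"]
    raise ValueError("unrecognized statement: " + line)

for instruction in translate("ADD PRICE TO TOTAL"):
    print(instruction)
# LOAD TOTAL
# ADD PRICE
# STORE TOTAL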
Advocacy for Simplicity: Hopper championed the idea that programming should be
approachable, laying the groundwork for modern software engineering and automated
programming systems.
Other Notable Pioneers
John Mauchly and J. Presper Eckert: Their work on ENIAC and subsequent machines
demonstrated the practical application of electronic computation on a large scale.
Konrad Zuse: Zuse’s Z3 was one of the first programmable digital computers, and his work
on the Z4 provided one of the earliest commercial computer systems.
Thomas Kurtz and John Kemeny: Pioneers of the BASIC programming language and time-
sharing systems at Dartmouth, which helped democratize computing by making it accessible
to non-experts.
The Impact of Computing on Modern Society
Personal Computing and the Internet: The rise of the personal computer and mobile devices,
along with the advent of the Internet, has changed daily life, communication, and global
commerce. The integration of computers into nearly every facet of society marks the
culmination of centuries of technological evolution.
Moore's Law and Beyond: Advances in semiconductor technology, captured by Gordon Moore's
observation that the number of transistors on a chip doubles roughly every two years, have
driven exponential increases in processing power while reducing costs. Today, the challenge
lies not only in
miniaturization but also in energy efficiency and parallel processing (e.g., through chiplets
and 3D stacking).
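As a back-of-the-envelope illustration, the commonly quoted form of Moore's Law (a doubling
roughly every two years) can be projected forward from the Intel 4004's widely cited figure
of about 2,300 transistors. The numbers this prints are the model's projection only, not
measured chip data:

def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    # Exponential growth under an assumed fixed doubling period.
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(year):,.0f}")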
Artificial Intelligence and Machine Learning: Building on decades of software and hardware
evolution, modern computers now power AI applications that transform industries—from
healthcare to finance. Turing’s early work in AI continues to influence contemporary research
in neural networks and deep learning.
Conclusion
The history of computers is a testament to human ingenuity and the relentless pursuit of
efficiency in problem solving. From the abacus of ancient civilizations and the mechanical
calculators of the 17th century to the programmable marvels of the 19th century and the
groundbreaking
electronic systems of the 20th century, each stage of development has built on its
predecessors. Pioneers like Charles Babbage, Ada Lovelace, Alan Turing, and Grace Hopper
not only solved immediate challenges but also laid down conceptual frameworks that
continue to guide modern computer science and engineering.
Today’s digital age, characterized by ubiquitous mobile devices and global connectivity, is
the direct result of centuries of innovation. As we continue to push the boundaries of what
machines can do—from developing AI systems to exploring quantum computing—the
legacy of early computing pioneers remains a guiding light, inspiring future generations to
innovate and transform society even further.