1st Assign


History of the Computer

The history of the computer owes its existence to the fact that people, who are lazy by nature, have always sought to improve their ability to calculate, in order to reduce errors and save time.

Origins: the abacus

The abacus is an ancient invention; it was in use for a very long time, and still is in some countries.

Then came the logarithm

The invention of the logarithm is generally credited to the Scotsman John Napier (1550-1617). In 1614, he showed that multiplication and division could be performed using a series of additions. This discovery led, in 1620, to the invention of the slide rule. However, the true father of logarithm theory is Muhammad ibn Musa al-Khwarizmi, a scholar from the Persian region of Khwarezm. This scholar also developed algebra, a term which comes from the Arabic "Al-Jabr", meaning compensation, with the implication being "looking for the unknown variable X in order to compensate by balancing the results of the calculations."
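In modern notation, the identities Napier exploited are the standard logarithm laws (a worked restatement of the claim above):

    \log(xy) = \log x + \log y, \qquad \log(x/y) = \log x - \log y

To multiply, one looks up the two logarithms in a table, adds them, and looks up the antilogarithm of the sum; the slide rule of 1620 mechanizes the same trick by adding lengths marked on logarithmic scales.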
The first calculating machines

In 1623, the German scientist Wilhelm Schickard invented a machine that used 11 complete and 6 incomplete sprocketed wheels; it could add and, with the aid of logarithm tables, multiply and divide. The French philosopher, mathematician, and physicist Blaise Pascal invented a machine in 1642 that added and subtracted, automatically carrying and borrowing digits from column to column. Pascal built 50 copies of his machine, but most served as curiosities in the parlours of the wealthy. The seventeenth-century German mathematician Gottfried Leibniz designed a special gearing system to enable multiplication on Pascal's machine.
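What Pascal's machine mechanized is ordinary column-by-column addition with carries. A minimal sketch in Python of that arithmetic (an illustration of the operation, not of the Pascaline's actual gearing):

    # Add two numbers given as equal-length lists of decimal digits,
    # least significant digit first, carrying from column to column.
    def add_by_columns(a, b):
        result, carry = [], 0
        for x, y in zip(a, b):
            carry, digit = divmod(x + y + carry, 10)  # split into carry and digit
            result.append(digit)
        if carry:
            result.append(carry)  # final carry spills into a new column
        return result

    # 278 + 495 = 773, digits written least significant first
    print(add_by_columns([8, 7, 2], [5, 9, 4]))  # [3, 7, 7]

Each column only ever adds three small numbers (two digits and an incoming carry), which is why the operation lends itself to simple wheels and gears.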
Computer card key punch

The IBM 010 punch was one of the first devices designed to perforate cards. A hole, or the lack of a hole, in a card represented information that could be read by early computers. Modern optical storage devices, such as CD-ROMs, use microscopic pits instead of punched paper holes to store information.

In the early 19th century, the French inventor Joseph-Marie Jacquard devised a specialized type of computer: a silk loom. Jacquard's loom used punched cards to program patterns that helped the loom create woven fabrics. Although Jacquard was rewarded and admired by the French emperor Napoleon I for his work, he fled for his life from the city of Lyon, pursued by weavers who feared their jobs were in jeopardy because of his invention. The loom prevailed, however: when Jacquard died, more than 30,000 of his looms existed in Lyon. The looms are still used today, especially in the manufacture of fine furniture fabrics.

Another early mechanical computer was the Difference Engine, designed in the early 1820s by the British mathematician and scientist Charles Babbage. Although never completed by Babbage, the Difference Engine was intended to be a machine with a 20-decimal capacity that could solve mathematical problems. Babbage also made plans for another machine, the Analytical Engine, considered the mechanical precursor of the modern computer. The Analytical Engine was designed to perform all arithmetic operations efficiently; however, Babbage's lack of political skills kept him from obtaining the approval and funds to build it.

Augusta Ada Byron, Countess of Lovelace, was a personal friend and student of Babbage. She was the daughter of the famous poet Lord Byron and one of only a few women mathematicians of her time. She prepared extensive notes on Babbage's ideas and the Analytical Engine, and her conceptual programs for the machine led to the naming of a programming language (Ada) in her honour. Although the Analytical Engine was never built, its key concepts, such as the capacity to store instructions, the use of punched cards as a primitive memory, and the ability to print, can be found in many modern computers.

Programmable computers

In 1941, Konrad Zuse completed a computer based on electromechanical relays: the Z3. It was the first computer to use binary instead of decimal numbers. Beginning in 1937, Howard Aiken developed a programmable computer 17 metres long and 2.5 metres high that could calculate five times faster than a human: IBM's Mark I. It was built using 3,300 gears and 1,400 switches linked by 800 km of electrical wiring. In 1947, the Mark II appeared, with its predecessor's gears replaced by electronic components.

Vacuum tube computers

In 1942, the ABC (Atanasoff-Berry Computer), named after its designers J. V. Atanasoff and C. Berry, was introduced. In 1943, J. Mauchly and J. Presper Eckert began building the first computer with no mechanical parts: the ENIAC (Electronic Numerical Integrator And Computer). It was made using 18,000 vacuum tubes and took up about 167 m² of space. It was used for the calculations required to design the H-bomb. The ENIAC's main drawback was its programming: it could only be programmed manually, by flipping switches or plugging in cables. According to legend, the first computer error was caused by an insect that was attracted to the heat of the vacuum tubes and became lodged in them, creating a short circuit; thus the name "bug" came to mean a computer error. Indeed, the tubes were poor conductors: they required a great deal of electrical energy, which they released as heat. The programming drawback was addressed in 1946 with the EDVAC (Electronic Discrete Variable Automatic Computer), which could store programs in memory (1,024 words in central memory and 20,000 words in magnetic memory).

The transistor

In 1947, the transistor was created at Bell Labs, thanks to the work of the engineers John Bardeen, Walter Brattain and William Shockley. With transistors, the computers of the 1950s could be made less bulky, less energy-hungry, and therefore less expensive: this marked a turning point in computing history.

The integrated circuit

The integrated circuit was perfected in 1958 by Texas Instruments. It made even smaller and cheaper computers possible by integrating multiple transistors on the same circuit without using electrical wiring.

The first transistor computers

In 1960, the IBM 7000 became the first transistor computer. In 1964, the IBM 360 appeared, along with the DEC PDP-8.

Microcomputers

In 1971, the first microcomputer came out: the Kenbak-1, with 256 bytes of memory.

Microprocessors

In 1971, the first microprocessor, the Intel 4004, appeared; it processed data 4 bits at a time. Around the same time, Hewlett-Packard put out the HP-35 calculator. The Intel 8008 processor, which could process 8 bits at a time, was released in 1972. In 1973, the Intel 8080 processor was used in the first microcomputers: the Micral and the Altair 8800, with 256 bytes of memory. In late 1973, Intel came out with processors that were already ten times faster than their predecessor, the Intel 8080, and could address 64 KB of memory.

In 1976, Steve Wozniak and Steve Jobs created the Apple I in a garage. This computer had a keyboard, a 1 MHz microprocessor, 4 KB of RAM and 1 KB of video memory. The story goes that the two friends didn't know what to name the computer; Steve Jobs, seeing an apple tree in the garden, decided he would call the computer "apple" if he couldn't think up another name in the next five minutes. In 1981, IBM sold the first "PC", made from an 8088 processor with a clock speed of 4.77 MHz.

Computers today

It is very difficult today to tell where computers are going. Their development has followed Moore's Law: "Every three years, four times as many transistors can be put on a chip." This would imply that there will be 1 billion transistors on a chip around the year 2010.
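As a quick check of that projection, here is a minimal Python sketch of the growth rule exactly as quoted; the 1995 starting figure is a hypothetical illustration chosen to make the arithmetic visible, not a number from the text:

    # Moore's Law as quoted above: transistor counts quadruple every
    # three years. Base year and count are illustrative assumptions.
    def projected_transistors(base_count, base_year, target_year):
        periods = (target_year - base_year) / 3   # three-year periods elapsed
        return base_count * 4 ** periods

    # A hypothetical chip with one million transistors in 1995 would be
    # projected to reach roughly a billion by 2010 (4**5 = 1024x growth).
    print(f"{projected_transistors(1_000_000, 1995, 2010):,.0f}")

Run as written, this prints 1,024,000,000, which is consistent with the billion-transistor figure the essay cites.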

Charles Babbage, the Father of the Computer


Charles Babbage, a British mathematics professor, is regarded as the father of computers. He was born in England in 1791, the son of a rich banker from Devon. Inspired by Napier's logarithm tables and Napier's bones, he began in 1821 to design a "difference engine", a very large and complicated machine intended for doing logarithmic calculations automatically.

While Thomas de Colmar was developing the desktop calculator, a series of very interesting developments in computers was started in Cambridge, England, by Babbage. In 1812, he realized that many long calculations, especially those needed to make mathematical tables, were really a series of predictable actions that were constantly repeated. From this he suspected that it should be possible to perform them automatically, and he began to design an automatic mechanical calculating machine, which he called a difference engine. By 1822, he had a working model to demonstrate. With financial help from the British government, Babbage started fabrication of a difference engine in 1823. It was intended to be steam-powered and fully automatic, including the printing of the resulting tables, and commanded by a fixed instruction program.

The difference engine, although of limited adaptability and applicability, was really a great advance. Babbage continued to work on it for the next ten years, but in 1833 he lost interest because he thought he had a better idea: the construction of what would now be called a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called this idea an Analytical Engine. The ideas of this design showed a lot of foresight, although this could not be appreciated until a full century later.

The plans for this engine called for a decimal computer operating on numbers of 50 decimal digits and having a storage capacity of 1,000 such numbers. The built-in operations were to include everything that a modern general-purpose computer would need, even the all-important conditional control transfer capability that would allow commands to be executed in any order, not just the order in which they were programmed. The Analytical Engine was to use punched cards (similar to those used in a Jacquard loom), which would be read into the machine from several different reading stations. The machine was supposed to operate automatically, by steam power, and require only one attendant.

Babbage's computers were never finished. Various reasons are given for his failure, the most common being the lack of precision machining techniques at the time.

Another speculation is that Babbage was working on the solution of a problem that few people in 1840 really needed to solve. After Babbage there was a temporary loss of interest in automatic digital computers.

Between 1850 and 1900, great advances were made in mathematical physics, and it came to be understood that most observable dynamic phenomena can be characterized by differential equations (meaning that most events occurring in nature can be described by one equation or another), so that easy means for their calculation would be helpful. Moreover, from a practical standpoint, the availability of steam power caused manufacturing, transportation, and commerce to prosper and led to a period of great engineering achievement. The design of railroads and the building of steamships, textile mills, and bridges required differential calculus to determine such things as centres of gravity, centres of buoyancy, moments of inertia, and stress distributions. Even the assessment of the power output of a steam engine required mathematical integration. A strong need thus developed for a machine that could rapidly perform many repetitive calculations.
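The "difference" in Babbage's Difference Engine refers to the method of finite differences, which turns the tabulation of a polynomial into exactly this kind of repetitive calculation. A minimal Python sketch (the quadratic below is an illustrative choice, not taken from the text):

    # Tabulate p(x) = x^2 + x + 1 using only additions, as a difference
    # engine would: for a degree-2 polynomial the second difference is
    # constant, so each new table entry needs just two additions.
    value = 3   # p(1)
    d1 = 4      # p(2) - p(1), the initial first difference
    d2 = 2      # constant second difference for a quadratic
    table = [value]
    for _ in range(5):
        value += d1        # next table entry, by addition alone
        d1 += d2           # update the running first difference
        table.append(value)
    print(table)           # [3, 7, 13, 21, 31, 43] = p(1)..p(6)

No multiplication ever occurs, which is why the engine could be built from adding wheels alone.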

Different Types of Computers


A computer is one of the most brilliant inventions of mankind. Thanks to computer technology, we have been able to achieve efficient storage and processing of data; we can rest our brains by employing computer memory for storage of information. Owing to computers, we have been able to speed up daily work, carry out critical transactions and achieve accuracy and precision in our output. The computers of the earlier years were the size of a large room and consumed huge amounts of electric power. With advancing technology, however, computers have shrunk to the size of a small watch. Depending on their processing power and size, computers have been classified into various types. Let us look at this classification.

Types based on the operational principle of computers

Analog computers: These are almost extinct today. They differ from digital computers in that an analog computer can perform several mathematical operations simultaneously. It uses continuous variables for mathematical operations and utilizes mechanical or electrical energy.

Digital computers: Digital refers to processes in computers that manipulate binary numbers (0s and 1s), which represent switches that are turned on or off by electrical current. A bit can have the value 0 or the value 1, but nothing in between. Analog, by contrast, refers to circuits or numerical values that have a continuous range: both 0 and 1 can be represented by analog computers, but so can 0.5, 1.5, or a number like π (approximately 3.14).
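A minimal Python sketch of the bit idea just described (the number chosen is arbitrary):

    # A digital value is built only from discrete 0s and 1s.
    n = 13
    print(bin(n))   # 0b1101: 13 = 1*8 + 1*4 + 0*2 + 1*1

    # Extract the four bits individually, most significant first.
    bits = [(n >> i) & 1 for i in range(3, -1, -1)]
    print(bits)     # [1, 1, 0, 1] -- each bit is 0 or 1, never in between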

Hybrid computers: These are a combination of digital and analog computers. In this type of computer, the digital segments perform process control by converting analog signals to digital ones.

The following are some of the other important types of computers.

Mainframe computers: Large organizations use mainframes for highly critical applications such as bulk data processing and ERP. Most mainframe computers can host multiple operating systems and operate as a number of virtual machines, and can thus substitute for several small servers.

Microcomputers: A computer that uses a microprocessor as its central processing unit is known as a microcomputer. Microcomputers occupy far less space than mainframes. When supplemented with a keyboard and a mouse, a microcomputer can be called a personal computer. A monitor, a keyboard and other similar input and output devices, computer memory in the form of RAM, and a power supply unit come packaged in a microcomputer. These computers can fit on desks or tables and are the best choice for single-user tasks.

Personal computers come in a variety of forms, such as desktops, laptops and personal digital assistants.

Desktops: A desktop is intended to be used at a single location. The spare parts of a desktop computer are readily available at relatively low cost, and power consumption is not as critical as it is in laptops. Desktops are widely popular for daily use in workplaces and households.

Laptops: Similar in operation to desktops, laptop computers are miniaturized and optimized for mobile use. A laptop runs on a single battery or an external adapter that charges the computer's batteries. Laptops come with a built-in keyboard, a touch pad that acts as a mouse, and a liquid crystal display. Their portability and ability to operate on battery power have been a boon for mobile users.

Personal digital assistants (PDAs): A PDA is a handheld computer, popularly known as a palmtop. It has a touch screen and a memory card for storage of data. PDAs can also be used effectively as portable audio players, web browsers and smartphones, and most of them can access the Internet by means of Bluetooth or Wi-Fi.

Minicomputers: In terms of size and processing capacity, minicomputers lie between mainframes and microcomputers. Minicomputers are also called mid-range systems or workstations. The term came into popular use in the 1960s to refer to relatively small third-generation computers, which took up the space of a refrigerator or two and used transistor and core-memory technologies. The 12-bit PDP-8 from Digital Equipment Corporation was the first successful minicomputer.

Supercomputers: Highly calculation-intensive tasks can be performed effectively by supercomputers; quantum physics, mechanics, weather forecasting and molecular theory are best studied by means of them. Their capacity for parallel processing and their well-designed memory hierarchies give supercomputers enormous processing power.

Wearable computers: A record-setting step in the evolution of computers was the creation of wearable computers. These computers can be worn on the body and are often used in the study of behaviour modelling and human health; military and health professionals have incorporated them into their daily routines as part of such studies. When the user's hands and sensory organs are engaged in other activities, wearable computers are of great help in tracking human actions. They are consistently in operation, as they do not have to be turned on and off, and they interact with the user constantly.

Abacus
An abacus is a calculation tool, often constructed as a wooden frame with beads sliding on wires. It was in use centuries before the adoption of the written Hindu-Arabic numeral system and is still widely used by merchants and clerks in China and elsewhere. The origins of the abacus are disputed, with suggestions that it was invented in Babylonia or in China at some point between 2400 BC and 300 BC. The first abacus was almost certainly based on a flat stone covered with sand or dust: lines were drawn in the sand and pebbles used to aid calculations. From this, a variety of abaci were developed. The most popular were based on the bi-quinary system, which uses a combination of two bases (base 2 and base 5) to represent decimal numbers (see the sketch below).

The use of the word abacus dates back to before 1387, when a Middle English work borrowed the word from Latin to describe a sandboard abacus. The Latin word came from abakos, the Greek genitive form of abax ("calculating-table"). Because abax also had the sense of "table sprinkled with sand or dust, used for drawing geometric figures", some linguists speculate that the Greek word may be derived from a Semitic root, abaq, the Hebrew word for "dust". Though the details of the transmission are obscure, it may also be derived from the Phoenician word abak, meaning "sand". The plural of abacus is abaci.
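A minimal Python sketch of the bi-quinary idea: each decimal digit is split into a count of fives and a count of ones, as on the rods of a soroban-style abacus (the one-five-bead, four-one-bead layout assumed here is a common arrangement, not a detail from the text):

    # Split each decimal digit into (five-beads, one-beads).
    def biquinary_digit(d):
        """Return how many 5-valued and 1-valued beads encode digit d."""
        fives, ones = divmod(d, 5)
        return fives, ones

    for d in range(10):
        fives, ones = biquinary_digit(d)
        print(f"{d} = {fives}x5 + {ones}x1")   # e.g. 7 = 1x5 + 2x1

Only two bead values are needed per rod, which keeps the bead count per digit small while still covering 0 through 9.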

Why did I choose my course? The importance of IT in my course


I chose Accountancy as my course because I know that when I successfully graduate and pass the board examination as an accountant, it will give me great opportunities in this world. Accountancy is also one of the professions that is always in demand, anytime and anywhere, and being an accountant someday will surely give me a better and brighter future.

Information technology is one of the necessities of today; in everything we do, IT is involved, and it helps us a great deal in our everyday life. IT also plays a major role in the course I chose and in my future career, accounting. IT helps an accountant to have information accessible at any time. It also improves the capacity of accounting researchers and extension specialists to organise, store, retrieve and exchange accounting information, and it makes it possible to develop accounting databases for easy access and database-driven decision making. Information is vital in accounting, and IT provides the mechanisms for information sharing that accountants depend on; especially, accounting data becomes universally available with the help of IT.

Accountancy is one of the most in-demand courses these days. Actually, I wanted to be a doctor: I studied B.S. Biology at the University of the Philippines Baguio, but due to some circumstances I transferred here and took up Accountancy. Even though I can't be a doctor, Accountancy can be my first step towards becoming a lawyer. Because of information technology, an accountant's work is made easier: computer applications like Microsoft Word and Excel make recording and computing data easier and more convenient.
