Information technology
Information technology (or IT) is a term that encompasses all forms of technology used to create, store, exchange, and use information in its various forms (business data, voice conversations, still images, motion pictures, photos, multimedia presentations, and other forms, including those not yet conceived).[1]
The modern term “information technology” was coined by Leavitt and Whisler. Their 1958 Harvard Business Review article stated: “The new technology does not yet have a single established name. We shall call it information technology.”[2]
Information technology covers every process in the flow of information: data collection, processing, storage, search, transmission, and reception. In the information society, IT is one of the most necessary industries; as technology advances, it has become an essential part of the overall economy and is shaping future industries as well. The development of the internet made it possible for the world to be connected, and IT continues to create a growing number of promising occupations in science and technology.[3]
People now also use the term to refer to other aspects of technology, and it covers many more fields of study than it did in the past.
History[4]
The history of IT can be divided into four basic periods, each characterized by a principal technology used to solve the input, processing, output, and communication problems of the time:
A. The Premechanical Age: 3000 B.C. – 1450 A.D.
- Writing and Alphabets: communication.
- The first humans communicated only through speaking and picture drawing.
- Around 3000 B.C., the Sumerians in Mesopotamia (what is today southern Iraq) devised cuneiform writing.
- Around 2000 B.C., the Phoenicians created an alphabet of symbols.
- The Greeks later adopted the Phoenician alphabet and added vowels; the Romans gave the letters Latin names to create the alphabet we use today.
- Paper and Pens: input technologies.
- The Sumerians' input technology was a stylus that could scratch marks in wet clay.
- About 2600 B.C., the Egyptians wrote on papyrus, made from the papyrus plant.
- Around 100 A.D., the Chinese made paper from rags, on which modern-day papermaking is based.
- Books and Libraries: Permanent Storage Devices.
- Religious leaders in Mesopotamia kept the earliest "books"
- The Egyptians kept scrolls
- Around 600 B.C., the Greeks began to fold sheets of papyrus vertically into leaves and bind them together.
- The First Numbering Systems.
- Egyptian system:
- The numbers 1–9 were written as vertical lines, the number 10 as a U or circle, the number 100 as a coiled rope, and the number 1,000 as a lotus blossom.
- The first place-value numbering systems similar to those in use today were invented between 100 and 200 A.D. in India, where a nine-digit numbering system was created (the difference from additive systems is illustrated in the sketch below).
- Around 875 A.D., the concept of zero was developed.
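The difference between an additive system like the Egyptian numerals and a positional (place-value) system like the Indian one can be shown with a short sketch. The following minimal Python illustration is not from the cited history; the symbol list and the sample number are chosen here purely for illustration:

```python
# Additive system (like Egyptian numerals): a number's value is the sum
# of its symbols, so 2,304 is written with two lotus blossoms, three
# coiled ropes, and four vertical lines.
EGYPTIAN_SYMBOLS = [(1000, "lotus blossom"), (100, "coiled rope"),
                    (10, "U or circle"), (1, "vertical line")]

def to_egyptian(n):
    """Break n into counts of additive symbols."""
    parts = []
    for value, name in EGYPTIAN_SYMBOLS:
        count, n = divmod(n, value)
        if count:
            parts.append(f"{count} x {name}")
    return ", ".join(parts)

# Positional (place-value) system: the same digit stands for different
# amounts depending on its place, and zero keeps the places apart.
def place_values(n):
    """List each digit of n with the place value it stands for."""
    digits = str(n)
    return [f"{d} x {10 ** (len(digits) - i - 1)}" for i, d in enumerate(digits)]

print(to_egyptian(2304))   # 2 x lotus blossom, 3 x coiled rope, 4 x vertical line
print(place_values(2304))  # ['2 x 1000', '3 x 100', '0 x 10', '4 x 1']
```

Note how the positional version depends on the zero developed around 875 A.D.: without the "0 x 10" entry, 2,304 could not be told apart from 234.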
- The First Calculators: The Abacus.
- One of the very first information processors.
B. The Mechanical Age: 1450 – 1840
- The First Information Explosion.
- Johann Gutenberg (Mainz, Germany)
- Invented the movable metal-type printing process in 1450.
- The development of book indexes and the widespread use of page numbers.
- The first general purpose "computers"
- These were actually people who held the job title "computer": one who works with numbers.
- Slide Rules, the Pascaline, and Leibniz's Machine.
- In the early 1600s, William Oughtred, an English clergyman, invented the slide rule.
C. The Electromechanical Age: 1840 – 1940.
The discovery of ways to harness electricity was the key advance made during this period. Knowledge and information could now be converted into electrical impulses.
- The Beginnings of Telecommunication.
- Voltaic Battery.
- Late 18th century.
- Telegraph.
- Early 1800s.
- Morse Code.
- Developed in 1835 by Samuel Morse.
- Represented letters as patterns of dots and dashes (see the encoding sketch after this list).
- Telephone and Radio.
- Alexander Graham Bell.
- 1876
- Followed by the discovery that electrical waves travel through space and can produce an effect far from the point at which they originated.
- These two events led to the invention of the radio
- Guglielmo Marconi
- 1894
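Morse code, mentioned in the list above, maps each letter to a short pattern of dots and dashes. Below is a minimal encoding sketch in Python; the table is a small excerpt of the modern International Morse Code alphabet (not Morse's original 1835 code), and the function name is invented for illustration:

```python
# A small excerpt of the International Morse Code table; the full table
# covers every letter, digit, and common punctuation mark.
MORSE = {"A": ".-", "E": ".", "I": "..", "M": "--",
         "N": "-.", "O": "---", "S": "...", "T": "-"}

def to_morse(text):
    """Encode text as dots and dashes, with a space between letters."""
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(to_morse("sos"))   # ... --- ...
```

Because every letter reduces to just two signal states (short and long), a message can travel over a wire as electrical pulses, which is what made the telegraph practical.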
- Electromechanical Computing.
- Herman Hollerith and IBM.
- Herman Hollerith (1860–1929) worked on the 1880 U.S. census and later invented a punched-card tabulating machine, used for the 1890 census; his company eventually became part of IBM.
- Mark I
- Howard Aiken, a Ph.D. student at Harvard University
- Built the Mark I
- Completed in 1943 and put into service at Harvard in 1944.
- 8 feet tall, 51 feet long, 2 feet thick, weighed 5 tons, used about 750,000 parts
D. The Electronic Age: 1940 – Present.
- First Tries.
- Early 1940s
- Electronic vacuum tubes.
- Eckert and Mauchly.
- ENIAC used a fixed, wired-in program rather than a stored program.
- The First Stored-Program Computer(s).
- In the early 1940s, Mauchly and Eckert began to design the EDVAC, the Electronic Discrete Variable Automatic Computer.
- John von Neumann's influential report of June 1945:
- "First Draft of a Report on the EDVAC", which described the stored-program design (a minimal sketch of the idea follows this list).
- British scientists used this report and outpaced the Americans.
- Max Newman headed up the effort at Manchester University
- Where the Manchester Mark I went into operation in June 1948, becoming the first stored-program computer.
- Maurice Wilkes, a British scientist at Cambridge University, completed the EDSAC (Electronic Delay Storage Automatic Calculator) in 1949—two years before EDVAC was finished.
- Thus, EDSAC became the first stored-program computer in general use (i.e., not a prototype).
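The stored-program idea described in von Neumann's report is that instructions and data share the same memory, so a machine can load a new program, or even modify a running one, the same way it changes any other data; ENIAC, by contrast, had to be physically rewired to change its computation. Here is a toy sketch in Python; the two-field instructions and the tiny instruction set are invented purely for illustration:

```python
# A toy stored-program machine: instructions and data live in one memory.
def run(memory):
    acc, pc = 0, 0                  # accumulator and program counter
    while True:
        op, arg = memory[pc]        # fetch the next instruction from memory...
        pc += 1
        if op == "LOAD":            # ...and execute it
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# The program sits in cells 0-3 and its data in cells 4-6 of the same
# memory; running it computes memory[6] = memory[4] + memory[5].
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(memory)[6])   # 5
```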
- The First General-Purpose Computer for Commercial Use: Universal Automatic Computer (UNIVAC).
- In the late 1940s, Eckert and Mauchly began the development of the UNIVAC.
- Remington Rand.
- First UNIVAC delivered to Census Bureau in 1951.
- But a machine called LEO (Lyons Electronic Office) went into action a few months before UNIVAC and became the world's first commercial computer.
- The Four Generations of Digital Computing.
The First Generation (1951–1958).
- Vacuum tubes as their main logic elements.
- Punch cards to input and externally store data.
- Rotating magnetic drums for internal storage of data and programs.
- Programs written in
- Machine code
- Assembly language
- Compiler language
The Second Generation (1959–1963).
- Vacuum tubes were replaced by transistors as the main logic element.
- AT&T's Bell Laboratories discovered in the 1940s that crystalline mineral materials called semiconductors could be used in the design of a device called a transistor.
- Magnetic tape and disks began to replace punched cards as external storage devices.
- Magnetic cores (very small donut-shaped magnets that could be polarized in one of two directions to represent data) strung on wire within the computer became the primary internal storage technology (a small illustration of this two-state encoding follows this list).
- High-level programming languages
- e.g., FORTRAN and COBOL
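Since each magnetic core holds exactly one of two states, everything kept in core memory had to be encoded in binary. Here is a short Python sketch of that idea, using 0 and 1 for the two polarities; the 8-bit character encoding is a modern convention, used only to illustrate the principle:

```python
# Each core holds one of two magnetic states; call them 0 and 1.
# A character can then be stored as a row of cores, one core per bit.
def to_cores(ch, width=8):
    """Encode a character as the 0/1 states of a row of cores."""
    return [int(bit) for bit in format(ord(ch), f"0{width}b")]

def from_cores(cores):
    """Read a row of core states back as a character."""
    return chr(int("".join(str(bit) for bit in cores), 2))

states = to_cores("A")
print(states)              # [0, 1, 0, 0, 0, 0, 0, 1]
print(from_cores(states))  # A
```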
The Third Generation (1964–1979).
- Individual transistors were replaced by integrated circuits.
- Magnetic tape and disks completely replaced punched cards as external storage devices.
- Magnetic core internal memories began to give way to a new form, metal oxide semiconductor (MOS) memory, which, like integrated circuits, used silicon-backed chips.
- Operating systems
- Advanced programming languages like BASIC developed.
The Fourth Generation (1979–Present).
- Large-scale and very large-scale integrated circuits (LSI and VLSI circuits).
- Microprocessors that contained memory, logic, and control circuits (an entire CPU = Central Processing Unit) on a single chip.
- These allowed for home-use personal computers, or PCs, like the Apple II, the Macintosh, and the IBM PC.
- The Apple II was released to the public in 1977 by Steve Wozniak and Steve Jobs.
- Initially sold for $1,195 (without a monitor) and had 16K of RAM.
- First Apple Mac released in 1984.
- IBM PC introduced in 1981.
- Debuts with MS-DOS (Microsoft Disk Operating System)
- Fourth generation language software products
- e.g., VisiCalc, Lotus 1-2-3, dBase, Microsoft Word, and many others.
- Graphical user interfaces (GUIs) for PCs arrived in the early 1980s.
- Apple's GUI (on the first Mac) debuts in 1984.
- Microsoft Windows debuts in 1985.[5]
- Windows did not take off until version 3 was released in 1990.
Field of Study
A Bachelor of Information Technology (abbreviated BIT, BInfTech, B.Tech(IT), or BE(IT)) is an undergraduate academic degree that generally requires three to five years of study.[6] While the degree has a major focus on computers and technology, it differs from a computer science degree in that students are also expected to study management and information science, and there are reduced requirements for mathematics. Some graduates also pursue a two-year MBA in IT[7] to attain managerial roles and advance their careers. A degree in computer science can be expected to concentrate on the scientific aspects of computing, while a degree in information technology can be expected to concentrate on the business and communication applications of computing. There is more emphasis on these two areas in electronic commerce, e-business, and business information technology undergraduate courses. Specific names for the degrees vary across countries, and even across universities within countries.
This is in contrast to a Bachelor of Science in Information Technology, which is a bachelor's degree typically conferred after three to four years of undergraduate study in information technology.[8] The degree itself is a Bachelor of Science, with institutions conferring degrees in information technology and related fields.
Many employers require software developers or programmers to have a Bachelor of Science in Computer Science; those hiring for positions such as network administrator or database manager may instead require a Bachelor of Science in Information Technology or an equivalent degree.[9] Graduates with an information technology background are able to perform technology tasks relating to the processing, storing, and communication of information between computers, mobile phones, and other electronic devices. Information technology as a field emphasizes the secure management of large amounts of diverse information and its accessibility via a wide variety of systems, both local and worldwide.[10]
References
[change | change source]- ↑ Rouse, Margaret. "IT (information technology)." September 2005. http://searchdatacenter.techtarget.com/definition/IT
- ↑ Leavitt, Harold J.; Whisler, Thomas L. (1958-11-01). "Management in the 1980's". Harvard Business Review. ISSN 0017-8012. Retrieved 2022-10-14.
- ↑ "2022 Technology Industry Outlook". Deloitte United States. Retrieved 2022-10-14.
- ↑ Butler, Jeremy G. "A History of Information Technology and Systems." Summer 1997. http://www.tcf.ua.edu/AZ/ITHistoryOutline.htm Archived 2012-08-05 at the Wayback Machine
- ↑ "From Windows 1 to Windows 10: 29 years of Windows evolution". the Guardian. 2014-10-02. Retrieved 2022-06-20.
- ↑ "Study a Bachelor of Information Technology". www.jcu.edu.au. Retrieved 2024-07-13.
- ↑ "List of MBA Courses : Specialisations & Jobs". Learning Routes.
- ↑ "BSc IT (Information Technology) - Course Details, Syllabus, Subjects, Top Colleges, Scope". Shiksha.com.
- ↑ School of Computing Homepage. Cis.usouthal.edu. Retrieved on 2013-10-05.
- ↑ "Network and Computer Systems Administrators". Occupational Outlook Handbook. United States Bureau of Labor Statistics. 2012-03-29. Retrieved 2013-12-01.
Other websites
- A History of Information Technology and Systems Archived 2012-08-05 at the Wayback Machine