AMS 103 Lecture Notes
Learning Outcomes:
1. Explain the basic concepts of computing and different programmes in computing science.
2. Describe the historical development of computing.
A computer is an electronic device that processes data and performs tasks according to a set of
instructions called a program. It is a versatile machine capable of executing a wide range of
operations, from simple calculations to complex data analysis, communication, and automation.
Computers are designed to accept input, process it, and produce output in a meaningful and useful
way.
1. Input: Computers accept data or instructions from the user through input devices such as
keyboards, mice, scanners, or microphones.
2. Processing: They process the input data using a Central Processing Unit (CPU), which
performs arithmetic, logical, and control operations.
3. Output: After processing, computers provide results or information to the user through
output devices like monitors, printers, or speakers.
4. Storage: Computers can store data and programs for future use using storage devices such
as hard drives, solid-state drives, or cloud storage.
5. Versatility: Computers can perform a wide variety of tasks, from simple calculations to
running complex simulations, depending on the software installed.
6. Speed: They can process millions or even billions of instructions per second, making them
incredibly fast and efficient.
7. Accuracy: Computers perform tasks with high precision, provided the input data and
instructions are correct.
8. Automation: They can execute repetitive tasks without human intervention, making them
ideal for automation.
Computing refers to the use of computers and computational systems to process, manage, and
analyze data to solve problems and perform tasks. It encompasses a wide range of activities,
including programming, data analysis, software development, and hardware design. Computing is
an interdisciplinary field that combines principles from mathematics, engineering, and logic to
create systems that can automate tasks, store and retrieve information, and facilitate
communication.
Computing is the foundation of modern technology and plays a critical role in almost every aspect
of life, from business and education to healthcare and entertainment.
The history of computing is a fascinating journey that spans thousands of years. It reflects
humanity's desire to automate calculations, solve complex problems, and process information
efficiently. Below is a detailed exploration of the key eras in the development of computing.
Mechanical Era
Electromechanical Era
Electronic Era
1. ENIAC (1945):
o The Electronic Numerical Integrator and Computer (ENIAC) was the first general-
purpose electronic computer.
o It used vacuum tubes to perform calculations and was capable of solving complex
mathematical problems.
o ENIAC was developed for military applications, such as calculating artillery firing tables, although it was completed only as World War II was ending.
2. UNIVAC (1951):
o The Universal Automatic Computer (UNIVAC) was the first commercial
computer.
o It was used for business and administrative tasks, such as payroll processing and
inventory management.
o UNIVAC marked the transition of computing from scientific and military
applications to commercial use.
3. Modern Computers:
o The development of transistors, integrated circuits, and microprocessors
revolutionized computing in the latter half of the 20th century.
o Modern computers are smaller, faster, and more powerful than their predecessors,
enabling a wide range of applications, from personal computing to artificial
intelligence.
The evolution of computers is often categorized into five generations, each defined by a significant
technological advancement.
First Generation (1940s-1950s): Vacuum Tubes
Technology: Vacuum tubes were used for circuitry and magnetic drums for memory.
Characteristics:
o Large in size and expensive to operate.
o Consumed a lot of power and generated significant heat.
o Limited processing speed and storage capacity.
Examples: ENIAC, UNIVAC.
Second Generation (1950s-1960s): Transistors
Technology: Transistors replaced vacuum tubes, making computers smaller, faster, and more reliable.
Characteristics:
o Reduced size and power consumption.
o Increased processing speed and storage capacity.
o Introduction of high-level programming languages like FORTRAN and COBOL.
Examples: IBM 1401, CDC 1604.
Fifth Generation (Present and Beyond): Artificial Intelligence and Quantum Computing
Computing programmes are specialized areas of study within computing science that focus on
specific applications and technologies. Below are some key computing programmes:
1. Scientific Computing:
o Focuses on solving complex scientific and engineering problems using
computational methods.
o Applications include climate modeling, molecular dynamics, and astrophysics.
2. Business Computing:
o Emphasizes the use of technology to improve business processes and decision-
making.
o Applications include enterprise resource planning (ERP), customer relationship
management (CRM), and e-commerce.
3. Artificial Intelligence (AI):
o Involves creating systems that can perform tasks requiring human intelligence, such
as learning, reasoning, and problem-solving.
o Applications include natural language processing, computer vision, and
autonomous vehicles.
4. Data Science:
o Focuses on extracting insights from large datasets using statistical and
computational techniques.
o Applications include predictive analytics, machine learning, and big data
processing.
Computing has become an integral part of modern society, impacting nearly every aspect of life.
Below are some key areas where computing plays a critical role:
2. Education and Research:
o Computing facilitates online learning, collaborative research, and access to vast amounts of information.
3. Healthcare:
o Computing supports medical diagnostics, patient management, and the
development of new treatments through data analysis and modeling.
4. Government and Public Services:
o Computing improves the efficiency of public services, such as tax collection, law
enforcement, and disaster management.
5. Entertainment and Media:
o Computing powers video games, streaming services, and digital content creation,
transforming the entertainment industry.
Practice Questions
Question 1:
Question 2:
o Trace the historical development of computing from early devices to modern computers.
o Mention at least three early computing devices and their significance.
o Explain the transition from mechanical to electronic computing, highlighting key
inventions like the Analytical Engine and ENIAC.
Question 3:
Question 4:
Question 5:
Module 2: Hardware, Software, and Humanware
Learning Outcomes:
1. Explain hardware and software and the functional units of a computer.
2. Differentiate between hardware, software, and humanware.
Contents:
2.1 Definition of Hardware, Software, and Humanware
Computing systems are composed of three main components: hardware, software, and humanware.
Each of these plays a critical role in the functionality of a computer system.
Hardware refers to the physical components of a computer system that can be seen and
touched. This includes input devices, output devices, processing units, memory, and
storage components.
Software is a collection of programs and instructions that enable hardware to perform
specific tasks. Software can be categorized into system software, application software, and
utility software.
Humanware represents the human aspect of computing, including users, programmers,
system analysts, and IT support personnel who interact with and maintain the system.
A computer system consists of various functional units that work together to process and manage
information.
Input Devices
Input devices allow users to enter data into a computer system. They are essential for
communication between the user and the computer.
1. Keyboard: The primary text input device, including standard, mechanical, and ergonomic
keyboards.
2. Mouse: A pointing device used to interact with the graphical user interface (GUI). Types
include optical, mechanical, and wireless mice.
3. Scanner: Used to digitize images, documents, and barcodes. Types include flatbed, sheet-
fed, handheld, and drum scanners.
4. Optical Character Recognition (OCR): A technology that converts different types of
documents, such as scanned paper documents, PDFs, or images captured by a digital
camera, into editable and searchable data.
5. Bar Code Reader: A device used to scan and read barcodes, commonly used in retail and
inventory management.
6. Ink Reader: Recognizes characters written in ink, often used in digital pens and
handwriting recognition systems.
7. Light Pen: A pointing device that detects light and is used in graphical applications, often
in conjunction with CRT monitors.
Output Devices
1. Monitors:
o CRT (Cathode Ray Tube) Monitors: Older technology with bulky design but
good color accuracy.
o LCD (Liquid Crystal Display) Monitors: Commonly used, offering thin and
lightweight designs.
o LED (Light Emitting Diode) Monitors: Energy-efficient with high contrast and
better display quality.
o OLED (Organic LED) Monitors: Superior contrast and color accuracy, often used
in high-end displays.
2. Printers:
o Inkjet Printers: Spray tiny droplets of ink onto paper, ideal for color printing.
o Laser Printers: Use toner and heat to produce high-speed, high-quality prints.
o Dot Matrix Printers: Impact printers used for multi-copy forms and receipts.
o 3D Printers: Print three-dimensional objects by layering materials.
3. Speakers: Convert digital audio signals into sound, commonly used for entertainment and
communication purposes.
Central Processing Unit (CPU)
The CPU is the brain of the computer and is responsible for executing instructions. It consists of the Arithmetic Logic Unit (ALU), which performs arithmetic and logical operations; the Control Unit (CU), which directs the fetching, decoding, and execution of instructions; and registers, which provide small, very fast storage for the data currently being processed.
Memory
RAM (Random Access Memory): Volatile memory used for temporary data storage.
ROM (Read-Only Memory): Non-volatile memory that stores firmware and boot
instructions.
Cache Memory: High-speed memory that stores frequently accessed data for quick
retrieval.
Storage Devices
Hard Disk Drives (HDDs): Magnetic storage devices with large capacities.
Solid State Drives (SSDs): Faster storage devices using flash memory.
USB Flash Drives: Portable storage devices for easy data transfer.
System Software
System software provides a platform for running application software and managing hardware
components.
Operating Systems (OS): Manage hardware and software resources (e.g., Windows,
macOS, Linux).
Device Drivers: Software that allows the OS to communicate with hardware devices.
Application Software
Word Processors: Software like Microsoft Word used for document creation.
Spreadsheets: Programs like Microsoft Excel for data analysis and calculations.
Web Browsers: Software like Google Chrome and Mozilla Firefox for accessing the
internet.
Utility Software
Antivirus Programs: Protect the system from malware and cyber threats.
Disk Cleanup Tools: Free up storage by removing unnecessary files.
Humanware refers to the people who interact with and manage computer systems. Their roles
include:
User Interaction
End-Users: Individuals who use the computer for various purposes, including business,
education, and entertainment.
Power Users: Individuals with advanced computer skills who utilize specialized software
for complex tasks.
System Design and Maintenance
Programmers, system analysts, and IT support personnel design, develop, and maintain computer systems, ensuring that hardware and software continue to meet users' needs.
A computer system operates effectively when hardware, software, and humanware work in
harmony.
Hardware provides the physical structure for processing and storing data.
Software enables the execution of tasks and the interaction between users and hardware.
Humanware ensures the efficient use and management of both hardware and software.
Together, these three components form the foundation of modern computing, enabling
advancements in various industries, from healthcare to finance, education, and beyond.
Practice Questions
Module 3: Information Processing and Its Role in Society
Learning Outcomes:
Information processing refers to the methods and techniques used to collect, manipulate, store,
retrieve, and disseminate information. It involves a series of actions or steps that transform raw
data into meaningful and useful information. This process is fundamental to the functioning of
modern society, as it underpins decision-making, communication, and the operation of various
systems and technologies.
Information processing can be understood as a cycle that begins with the collection of raw data
and ends with the dissemination of processed information. The cycle is iterative, meaning that the
output of one stage can feed back into the system as input for further processing. This cyclical
nature allows for continuous improvement and refinement of information, making it more
accurate, relevant, and useful over time.
The concept of information processing is not new; it has been a part of human civilization for
centuries. However, with the advent of digital technologies, the scale, speed, and complexity of
information processing have increased exponentially. Today, information processing is a critical
component of various fields, including business, education, healthcare, government, and
entertainment.
Information processing can be broken down into several distinct stages, each of which plays a crucial role in transforming raw data into actionable information. These stages are data collection, data input, data processing, data storage, and data output.
Data collection is the first stage in the information processing cycle. It involves gathering raw data
from various sources, which can include sensors, surveys, transactions, social media, and more.
The quality and accuracy of the data collected at this stage are critical, as they directly impact the
reliability of the information produced in subsequent stages.
Surveys and Questionnaires: These are commonly used in social science research, market
research, and customer feedback collection. They can be conducted online, via phone, or
in person.
Sensors and IoT Devices: In the context of the Internet of Things (IoT), sensors collect
data from the physical environment, such as temperature, humidity, motion, and more. This
data is then transmitted to a central system for processing.
Transaction Records: In business and commerce, transaction records are a rich source of
data. These records can include sales data, inventory levels, customer purchase history, and
more.
Social Media and Web Scraping: Social media platforms and websites are treasure troves
of data. Web scraping tools can be used to extract data from websites, while social media
APIs allow for the collection of data from platforms like Twitter, Facebook, and Instagram.
Public Records and Government Data: Governments and public institutions collect vast
amounts of data, including census data, tax records, health statistics, and more. This data
is often made available to the public and can be used for research and analysis.
The data collected at this stage is often unstructured or semi-structured, meaning it may not be
immediately usable. For example, social media posts, images, and videos are unstructured data
that require further processing to extract meaningful information.
Once data has been collected, it needs to be input into a system for processing. Data input involves
converting raw data into a format that can be understood and processed by a computer or
information system. This stage is crucial because errors in data input can lead to incorrect or
misleading information.
Data input can be done manually or automatically. Manual data input involves human operators
entering data into a system, often using a keyboard or other input device. This method is prone to
errors, especially when large volumes of data are involved. Automatic data input, on the other
hand, involves the use of technology to input data without human intervention. For example,
barcode scanners, RFID readers, and optical character recognition (OCR) systems can
automatically input data into a system.
Data input also involves data validation, which is the process of ensuring that the data entered into
the system is accurate, complete, and consistent. Data validation can include checks for data type,
range, format, and more. For example, a system might check that a date field contains a valid date
or that a numeric field contains only numbers.
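As an illustration (not part of the original notes), a minimal data-validation sketch in Haskell might check that a numeric field contains only digits and that an age value falls within a plausible range; the names isNumericField and validAge are made up for this example.

import Data.Char (isDigit)
import Text.Read (readMaybe)

-- A numeric field is valid if it is non-empty and contains only digits.
isNumericField :: String -> Bool
isNumericField s = not (null s) && all isDigit s

-- An age field is valid if it parses as an integer in a plausible range.
validAge :: String -> Bool
validAge s = case readMaybe s :: Maybe Int of
  Just n  -> n >= 0 && n <= 120
  Nothing -> False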
Data processing is the stage where raw data is transformed into meaningful information. This
involves a series of operations that manipulate the data to produce the desired output. Data
processing can be done using various techniques, including:
Data Transformation: This involves converting data from one format or structure to
another. For example, data might be transformed from a raw text format into a structured
database format.
Data Analysis: This involves applying statistical and mathematical techniques to the data
to identify patterns, trends, and relationships. Data analysis can be descriptive
(summarizing the data), diagnostic (identifying the causes of trends), predictive
(forecasting future trends), or prescriptive (suggesting actions based on the data).
Data Integration: This involves combining data from multiple sources to create a unified
view. Data integration is often used in business intelligence and data warehousing to
provide a comprehensive view of an organization's data.
Data Mining: This involves using machine learning and statistical techniques to discover
patterns and relationships in large datasets. Data mining is often used in marketing, fraud
detection, and scientific research.
Data processing can be done using various tools and technologies, including databases, data
warehouses, data lakes, and big data platforms. These tools provide the infrastructure and
capabilities needed to process large volumes of data efficiently and effectively.
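As a toy illustration (a sketch, not part of the original notes; salesRecords, toNumbers, and average are invented names), the following Haskell snippet performs a small data transformation followed by a simple descriptive analysis.

-- Raw, text-based records (illustrative input data).
salesRecords :: [String]
salesRecords = ["120", "95", "143"]

-- Data transformation: convert the raw strings into numeric values.
toNumbers :: [String] -> [Double]
toNumbers = map read

-- Descriptive analysis: summarise the data with an average.
average :: [Double] -> Double
average xs = sum xs / fromIntegral (length xs)

-- average (toNumbers salesRecords) evaluates to roughly 119.33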
Once data has been processed, it needs to be stored for future use. Data storage involves saving
the processed data in a secure and accessible location. The choice of storage medium and format
depends on the type of data, the volume of data, and the intended use of the data.
Databases: Databases are structured collections of data that are organized in a way that
allows for efficient retrieval and manipulation. Relational databases, such as MySQL and
PostgreSQL, are commonly used for structured data, while NoSQL databases, such as
MongoDB and Cassandra, are used for unstructured or semi-structured data.
Data Warehouses: Data warehouses are specialized databases designed for storing and
analyzing large volumes of historical data. They are often used in business intelligence and
decision support systems.
Data Lakes: Data lakes are storage repositories that store vast amounts of raw data in its
native format. Data lakes are often used in big data applications, where the data is too large
or complex to be stored in a traditional database.
Cloud Storage: Cloud storage involves storing data on remote servers that are accessed
via the internet. Cloud storage providers, such as Amazon Web Services (AWS), Google
Cloud, and Microsoft Azure, offer scalable and cost-effective storage solutions.
Physical Storage: Physical storage involves storing data on physical media, such as hard
drives, solid-state drives (SSDs), and tape drives. Physical storage is often used for backup
and archival purposes.
Data storage also involves data management, which includes tasks such as data backup, data
recovery, data archiving, and data security. Data management is essential to ensure that data is
available when needed and protected from loss or unauthorized access.
2.5 Data Output
The final stage in the information processing cycle is data output. This involves presenting the
processed data in a format that is useful and understandable to the end-user. Data output can take
many forms, including:
Reports: Reports are structured documents that present data in a clear and concise manner.
Reports can be generated automatically by information systems and can include text,
tables, charts, and graphs.
Dashboards: Dashboards are visual displays that provide an overview of key metrics and
performance indicators. Dashboards are often used in business intelligence and decision
support systems to provide real-time insights into an organization's performance.
Visualizations: Data visualizations, such as charts, graphs, and maps, are used to present
data in a visual format. Visualizations make it easier to identify patterns, trends, and
relationships in the data.
Alerts and Notifications: Alerts and notifications are used to inform users of important
events or changes in the data. For example, a system might send an alert when a certain
threshold is reached or when an anomaly is detected.
APIs and Data Feeds: APIs (Application Programming Interfaces) and data feeds allow
other systems and applications to access and use the processed data. APIs are often used in
web and mobile applications to provide real-time data to users.
Data output is critical because it determines how effectively the information is communicated to
the end-user. The goal of data output is to present the information in a way that is easy to
understand and actionable.
Information processing plays a vital role in modern society, impacting various sectors and aspects
of daily life. The ability to collect, process, store, and disseminate information has transformed the
way we live, work, and interact with each other. Below, we explore the role of information
processing in several key areas:
In the business world, information processing is essential for decision-making, operations, and
strategy. Businesses rely on information systems to manage their operations, analyze market
trends, and make informed decisions. Some of the key applications of information processing in
business and commerce include:
Financial Management: Information processing is used in financial management to track
and analyze financial data, such as revenue, expenses, and profits. This information is used
to make financial decisions, such as budgeting, forecasting, and investment.
Business Intelligence (BI): BI systems use information processing to analyze large
volumes of data and provide insights into business performance. This includes identifying
trends, patterns, and opportunities for growth.
E-commerce: E-commerce platforms rely on information processing to manage online
transactions, track customer behavior, and personalize the shopping experience. This
includes recommending products based on past purchases, managing inventory, and
processing payments.
Information processing has revolutionized education and research by providing new tools and
methods for learning, teaching, and discovery. In education, information processing is used to
manage student records, deliver online courses, and assess student performance. In research,
information processing is used to collect and analyze data, simulate experiments, and share
findings with the global community. Some of the key applications of information processing in
education and research include:
3.3 Healthcare
In healthcare, information processing is critical for patient care, medical research, and public
health. Healthcare providers use information systems to manage patient records, diagnose diseases,
and develop treatment plans. Researchers use information processing to analyze medical data,
develop new treatments, and track the spread of diseases. Some of the key applications of
information processing in healthcare include:
Electronic Health Records (EHR): EHR systems use information processing to manage
patient records, including medical history, diagnoses, treatments, and test results. This
information is used to provide personalized care and improve patient outcomes.
Medical Imaging: Information processing is used in medical imaging to analyze images
from X-rays, MRIs, CT scans, and other imaging technologies. This includes using
algorithms to detect abnormalities, measure tissue density, and create 3D models of organs.
Telemedicine: Telemedicine uses information processing to provide remote healthcare
services, including virtual consultations, remote monitoring, and tele-surgery. This
includes using video conferencing, wearable devices, and mobile apps to deliver care to
patients in remote locations.
Clinical Decision Support Systems (CDSS): CDSS use information processing to provide
healthcare providers with evidence-based recommendations for diagnosis and treatment.
This includes analyzing patient data, comparing it to clinical guidelines, and suggesting the
best course of action.
Public Health Surveillance: Information processing is used in public health to track the
spread of diseases, monitor health trends, and respond to outbreaks. This includes
collecting data from hospitals, clinics, and laboratories, and using statistical methods to
analyze the data.
Information processing is essential for the functioning of government and public services.
Governments use information systems to manage public records, deliver services, and make policy
decisions. Public services, such as transportation, utilities, and emergency response, rely on
information processing to operate efficiently and effectively. Some of the key applications of
information processing in government and public services include:
Governments analyze data from surveys, censuses, and administrative records to inform policy decisions and measure outcomes.
Information processing has transformed the entertainment and media industry by enabling new
forms of content creation, distribution, and consumption. The industry relies on information
systems to produce, manage, and deliver content to audiences around the world. Some of the key
applications of information processing in entertainment and media include:
While information processing offers numerous benefits, it also presents several challenges that
need to be addressed. These challenges include data security and privacy, information overload,
and ethical concerns. Below, we explore these challenges in detail:
Data security and privacy are critical concerns in information processing. As the volume of data
collected and processed continues to grow, so does the risk of data breaches, cyberattacks, and
unauthorized access. Data security involves protecting data from unauthorized access, use,
disclosure, disruption, modification, or destruction. Data privacy, on the other hand, involves
ensuring that individuals have control over their personal information and how it is used.
Some of the key challenges in data security and privacy include:
Information overload occurs when individuals are exposed to more information than they can
effectively process. This can lead to difficulty in making decisions, reduced productivity, and
increased stress. Information overload is a growing concern in the digital age, as the volume of
information available online continues to increase exponentially.
Consequences of Information Overload:
Reduced Productivity: Spending too much time sifting through information can lead to
inefficiency.
Decision Fatigue: Too many choices or too much data can make it harder to make
decisions.
Anxiety and Stress: Constant exposure to information, especially negative news, can
lead to mental health issues.
Misinformation: Overload can make it harder to distinguish between accurate and false
information.
Mitigation Strategies:
Filtering: Use tools like algorithms, filters, or curated news feeds to prioritize relevant
information.
Time Management: Allocate specific times for consuming information to avoid constant
distraction.
Critical Thinking: Develop skills to evaluate the credibility and relevance of
information.
Digital Detox: Take breaks from digital devices to reduce mental clutter.
Ethical concerns refer to issues that arise when actions, decisions, or technologies conflict with
moral principles or societal values. In the context of information and technology, ethical
concerns have become increasingly significant as advancements like artificial intelligence, data
collection, and surveillance raise questions about privacy, fairness, and accountability.
Privacy: The collection, storage, and use of personal data by companies and
governments can infringe on individuals' privacy rights. For example, data breaches and
unauthorized surveillance are major concerns.
Bias and Discrimination: Algorithms and AI systems can perpetuate or amplify biases if
they are trained on biased data. This can lead to unfair treatment of certain groups in
areas like hiring, lending, or law enforcement.
Misinformation and Disinformation: The spread of false or misleading information can
have serious consequences, such as influencing elections or public health decisions.
Surveillance: The use of surveillance technologies by governments or corporations can
lead to a loss of freedom and autonomy.
Digital Divide: Unequal access to technology and information can exacerbate social and
economic inequalities.
Autonomy and Consent: Ethical concerns arise when users are not fully informed about
how their data is being used or when they are manipulated by design choices (e.g.,
addictive features in apps).
Consequences of Ignoring Ethical Concerns:
Loss of Trust: Companies or institutions that fail to address ethical issues may lose
public trust.
Legal and Financial Risks: Ethical violations can lead to lawsuits, fines, or reputational
damage.
Social Harm: Unethical practices can contribute to societal problems like inequality,
polarization, or discrimination.
Ways to Address Ethical Concerns:
Transparency: Organizations should be open about how they collect, use, and share data.
Accountability: Establish mechanisms to hold individuals and organizations accountable
for unethical behavior.
Regulation: Governments and regulatory bodies can create laws and guidelines to
protect individuals' rights and ensure ethical practices.
Ethical Design: Technology should be designed with ethical considerations in mind,
such as fairness, inclusivity, and user consent.
Education: Raising awareness about ethical issues can empower individuals to make
informed decisions and advocate for change.
Practice Questions:
1. Define Information Processing and explain its significance in modern society. Provide
examples of how it is applied in at least two different sectors.
2. Describe the five stages of information processing (Data Collection, Data Input, Data
Processing, Data Storage, and Data Output).
3. Discuss the role of information processing in the healthcare sector. How does it improve
patient care and operational efficiency? Provide specific examples.
4. What are the major challenges associated with information processing? Explain how data
security and privacy concerns can impact individuals and organizations.
5. How do ethical concerns arise in the context of information processing? Discuss the
potential consequences of unethical practices in data handling and suggest ways to address
these issues.
Module 4: Operating Systems and Programme Execution
Learning Outcomes:
An Operating System (OS) is software that acts as an intermediary between computer hardware
and the user. It manages hardware resources, provides common services for computer programs,
and ensures that different programs and users running at the same time do not interfere with each
other. The OS is the backbone of a computer system, enabling the execution of user applications
and the management of system resources.
Process Management:
The OS is responsible for managing processes, which are instances of programs that are being
executed. It handles process scheduling, ensuring that each process gets a fair share of the CPU
time. The OS also manages process synchronization, communication, and deadlock situations
where processes are waiting indefinitely for resources.
Memory Management:
Memory management involves keeping track of each byte in a computer’s memory and
managing the allocation and deallocation of memory space as needed by processes. The OS
ensures that each process has enough memory to execute and that one process does not interfere
with the memory space of another. Techniques like paging and segmentation are used to
optimize memory usage.
File Management:
The OS provides a way to store, retrieve, and manage files on storage devices. It manages file
systems, which organize files in a hierarchical structure, and handles operations like creating,
deleting, reading, and writing files. The OS also ensures data integrity and security through
permissions and access controls.
Device Management:
Device management involves managing hardware devices through their respective drivers. The
OS handles the communication between the system and devices like printers, disk drives, and
network interfaces. It ensures that devices are used efficiently and that there are no conflicts
between devices.
4.3 Types of Operating Systems
Windows: Developed by Microsoft, Windows is one of the most widely used operating
systems for personal computers. It supports a wide range of software and hardware and is
known for its user-friendly interface.
macOS: Developed by Apple, macOS is the operating system for Macintosh computers. It
is known for its stability, security, and seamless integration with other Apple products.
Linux: Linux is an open-source operating system based on Unix. It is highly customizable
and is widely used in servers, supercomputers, and embedded systems. Linux distributions
like Ubuntu, Fedora, and Debian are popular among users.
Android: Developed by Google, Android is the most widely used mobile operating
system. It is based on the Linux kernel and is known for its flexibility and wide range of
applications.
iOS: Developed by Apple, iOS is the operating system for iPhones and iPads. It is known
for its smooth performance, security, and integration with other Apple products.
Monolithic Kernel:
A monolithic kernel is a single large program that contains all the essential parts of an operating
system, including the file system, device drivers, and memory management. It runs in a single
address space and provides high performance due to direct communication between components.
However, it can be less stable and secure because a bug in one part of the kernel can affect the
entire system.
Microkernel:
A microkernel is a minimal version of the kernel that contains only the essential functions needed
to implement an operating system. Other services, like device drivers and file systems, run as user-
space processes. This design improves stability and security because a failure in one service does
not affect the entire system. However, it can be slower due to the overhead of communication
between the kernel and user-space processes.
Hybrid Kernel:
A hybrid kernel combines the features of monolithic and microkernels. It runs some services in
kernel space for performance and others in user space for stability and security. This approach
aims to balance the benefits of both designs, providing good performance while maintaining
system stability.
4.5 Programme Execution Process
Compilation: In this process, the source code of a program is translated into machine code
by a compiler. The resulting executable file can be run directly by the computer. Compiled
programs are generally faster because the translation is done before execution.
Interpretation: In this process, the source code is executed line by line by an interpreter. The interpreter translates each line into machine code at runtime. Interpreted programs are generally slower but are easier to debug and modify (see the sketch after this list).
Linking: Linking is the process of combining multiple object files and libraries into a
single executable file. The linker resolves references between different modules and
ensures that all external references are correctly resolved.
Loading: Loading is the process of loading the executable file into memory so that it can
be executed. The loader reads the executable file and allocates memory for the program,
setting up the initial state of the program.
Execution: Once the program is loaded into memory, the CPU begins executing the
instructions. The OS manages the execution, ensuring that the program has the necessary
resources and that it does not interfere with other processes.
Termination: When the program finishes execution, it is terminated by the OS. The OS
deallocates the memory and resources used by the program and updates the system state.
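To make the difference between compilation and interpretation concrete, the sketch below shows a single Haskell source file (the file name Hello.hs is illustrative) that can either be compiled ahead of time with GHC or run directly under an interpreter such as runghc.

-- Hello.hs
-- Compiled:    ghc Hello.hs     translates the source to machine code once and
--                               produces an executable that runs on its own.
-- Interpreted: runghc Hello.hs  translates and executes the source at run time.
main :: IO ()
main = putStrLn "Hello, AMS 103"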
Practice Questions
1. Explain the role of the operating system in process management. How does the OS ensure
that each process gets a fair share of the CPU time?
2. Compare and contrast monolithic, microkernel, and hybrid kernel architectures. What are
the advantages and disadvantages of each?
3. Describe the steps involved in the compilation and interpretation of a program. How do
these processes affect the performance and flexibility of the program?
4. Discuss the importance of memory management in an operating system. What techniques
does the OS use to optimize memory usage?
5. What are the key functions of an operating system in terms of file management and device
management? How does the OS ensure data integrity and security?
Module 5: Programming and Programme Correctness
Learning Outcomes:
Definition of Programming
Programming is the process of designing, writing, testing, debugging, and maintaining the source
code of computer programs. This source code is written in a programming language, which is a
formal language comprising a set of instructions that produce various kinds of output.
Programming is a key skill in computer science and is used to create software, applications, and
systems that perform specific tasks.
Programming languages can be broadly categorized into three main types based on their
paradigms:
1. Procedural Programming:
o Procedural programming is a programming paradigm that uses a linear or top-down
approach where the program is divided into functions or procedures. Each function
performs a specific task, and the program executes these functions in a sequence.
o Examples: C, Pascal, Fortran.
o Characteristics:
Emphasis on procedure calls.
Use of loops, conditionals, and subroutines.
Data and functions are separate.
2. Object-Oriented Programming (OOP):
o Object-oriented programming is a paradigm based on the concept of "objects,"
which can contain data and code to manipulate that data. OOP focuses on
encapsulating data and behavior into objects, making it easier to model real-world
entities.
o Examples: Java, C++, Python.
o Characteristics:
Encapsulation: Bundling data with methods that operate on that data.
Inheritance: Deriving new classes from existing ones.
Polymorphism: Allowing objects to be treated as instances of their parent
class.
Abstraction: Hiding complex implementation details.
3. Functional Programming:
o Functional programming is a paradigm that treats computation as the evaluation of
mathematical functions and avoids changing-state and mutable data. It emphasizes
the application of functions, in contrast to the procedural programming style, which
emphasizes changes in state.
o Examples: Haskell, Lisp, Erlang.
o Characteristics:
Pure functions: Functions that always produce the same output given the
same input and have no side effects.
Immutability: Data is immutable, meaning it cannot be changed after it is
created.
Higher-order functions: Functions that can take other functions as
arguments or return them as results.
Pure Functions
A pure function is a function where the return value is determined only by its input values, without
observable side effects. This means that pure functions do not modify any external state or data.
They are predictable and easier to test and debug.
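Example (a minimal sketch of the add function referred to below):

-- add depends only on its two arguments and touches no external state.
add :: Int -> Int -> Int
add x y = x + y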
In this example, the add function is pure because it always returns the same result for the
same inputs and does not modify any external state.
Immutability
Immutability means that data cannot be changed after it is created; operations on data produce new values rather than modifying existing ones.
Benefits of Immutability:
o Predictability: Since data cannot be changed, it is easier to reason about the
program's behavior.
o Concurrency: Immutable data structures are inherently thread-safe, making it easier to write concurrent programs.
o Debugging: Immutability reduces the risk of bugs caused by unintended side
effects.
Example:
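A minimal sketch consistent with the description below (the contents of originalList are illustrative):

originalList :: [Int]
originalList = [1, 2, 3]

-- Prepending 0 builds a new list; originalList itself is never modified.
newList :: [Int]
newList = 0 : originalList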
In this example, originalList remains unchanged, and newList is a new list with 0 added
to the beginning.
Higher-Order Functions
Higher-order functions are functions that can take other functions as arguments or return them as
results. This is a powerful feature that allows for abstraction and code reuse.
Example Usage:
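The following sketch is illustrative (applyTwice and the use of map are not taken from the notes):

-- map is a higher-order function: it takes a function as its first argument.
-- map (*2) [1, 2, 3] evaluates to [2, 4, 6]

-- applyTwice takes a function f and applies it to x two times.
applyTwice :: (a -> a) -> a -> a
applyTwice f x = f (f x)

-- applyTwice (+3) 10 evaluates to 16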
5.3 Writing Simple Programmes
Syntax: The set of rules that define the structure of a program in a programming language.
It includes the arrangement of symbols, keywords, and operators.
Semantics: The meaning of the syntactically correct statements in a programming
language. It defines what the program does when executed.
The following bindings illustrate some basic values in Haskell:

let x = 5 -- x is an integer
let y = 3.14 -- y is a float
let z = True -- z is a boolean
let list = [1, 2, 3] -- list is a list of integers
let tuple = (1, "hello") -- tuple is a tuple containing an integer and a string

A conditional expression chooses between two values:

let x = 10
let result = if x > 5 then "Greater" else "Smaller"
5.4 Programme Correctness
Debugging Techniques
Print Debugging: Inserting print statements in the code to trace the flow of execution and the values of variables (see the sketch after this list).
Interactive Debugging: Using an interactive debugger to step through the code, inspect
variables, and evaluate expressions.
Unit Testing: Writing small tests for individual functions or modules to ensure they work
as expected.
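A minimal print-debugging sketch in Haskell (the function name step is illustrative) uses trace from Debug.Trace to log a message each time the function is evaluated:

import Debug.Trace (trace)

-- trace prints its message to the console and then returns the second argument.
step :: Int -> Int
step n = trace ("step called with n = " ++ show n) (n * 2 + 1)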
Testing Strategies
Unit Testing: Testing individual units or components of a program to ensure they work correctly in isolation.
Integration Testing: Testing the interaction between different units or components to
ensure they work together as expected.
Property-Based Testing: Testing the properties of a function rather than specific inputs.
This is common in functional programming and can be done using libraries like
QuickCheck in Haskell.
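For instance, a minimal QuickCheck sketch (the property name prop_reverseTwice is illustrative) checks that reversing a list twice gives back the original list:

import Test.QuickCheck

-- Property: reversing a list twice yields the original list.
prop_reverseTwice :: [Int] -> Bool
prop_reverseTwice xs = reverse (reverse xs) == xs

-- Running quickCheck prop_reverseTwice tests the property on many random lists.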
Common Errors
Syntax Errors: Errors caused by incorrect syntax, such as missing parentheses or incorrect indentation.
o Fix: Carefully review the code and correct the syntax.
Type Errors: Errors caused by using the wrong data type, such as passing a string to a
function that expects an integer.
o Fix: Ensure that the correct data types are used and consider using type annotations.
Logic Errors: Errors caused by incorrect logic in the code, such as using the wrong
operator or incorrect loop conditions.
o Fix: Review the logic and test the code with different inputs to identify the issue.
Practice Questions
1. Write a pure function in Haskell that takes a list of integers and returns the sum of the
squares of the even numbers in the list.
2. Explain the concept of immutability in functional programming and provide an example of
how it can be beneficial in a concurrent programming environment.
3. Using higher-order functions, write a Haskell function that takes a list of strings and returns
a list of the lengths of those strings.
4. Describe the difference between syntax and semantics in programming languages. Provide
an example of a syntax error and a semantic error in Haskell.
5. Write a Haskell program that uses recursion to calculate the factorial of a given number.
Include a brief explanation of how recursion works in this context.
Module 6: Software Applications and the Internet
Learning Outcomes:
Microsoft Word: A powerful word processing application that offers a wide range of
features including templates, spell check, grammar check, and advanced formatting
options. It is part of the Microsoft Office suite and is widely used in businesses and
educational institutions.
Google Docs: A cloud-based word processor that allows for real-time collaboration.
Users can create, edit, and share documents online. It is part of the Google Workspace
and is accessible from any device with an internet connection.
Microsoft Excel: A robust spreadsheet application that offers advanced features such as
pivot tables, data visualization tools, and complex formulas. It is widely used in business
for data analysis and reporting.
Google Sheets: A cloud-based spreadsheet tool that allows for real-time collaboration. It
offers many of the same features as Excel but is accessible from any device with an
internet connection.
Graphics and Design Tools (Adobe Photoshop, Canva)
Graphics and design tools are used for creating and editing visual content such as images,
illustrations, and layouts.
Key components of the internet include:
Servers: Computers that store and deliver web pages and other content.
Clients: Devices that request and display content from servers.
Routers: Devices that direct data packets between networks.
ISPs (Internet Service Providers): Companies that provide internet access to users.
6.3 Online Resources and Tools
Google Chrome: A fast, secure, and user-friendly web browser developed by Google. It
offers a wide range of extensions and integrates well with other Google services.
Mozilla Firefox: An open-source web browser known for its speed, privacy features, and
customization options. It is developed by the Mozilla Foundation.
Microsoft Edge: A web browser developed by Microsoft, based on the Chromium
engine. It offers integration with Microsoft services and features like Collections and
vertical tabs.
Google: The most widely used search engine, known for its comprehensive index and
accurate search results. It uses complex algorithms to rank pages based on relevance and
quality.
Bing: A search engine developed by Microsoft. It offers features like image and video
search, and integrates with other Microsoft services.
Zoom: A video conferencing tool that allows for virtual meetings, webinars, and screen
sharing. It is widely used for remote work, online education, and virtual social gatherings.
Slack: A messaging platform designed for team collaboration. It offers channels for
organized communication, direct messaging, and integration with other tools and
services.
6.4 Internet Security
Use Strong Passwords: Create complex passwords and change them regularly. Use a password manager to keep track of them.
Enable Two-Factor Authentication (2FA): Add an extra layer of security by requiring a
second form of verification in addition to your password.
Keep Software Updated: Regularly update your operating system, browsers, and
applications to protect against vulnerabilities.
Be Cautious with Emails and Links: Avoid clicking on suspicious links or
downloading attachments from unknown sources.
Use Antivirus Software: Install and regularly update antivirus software to protect
against malware.
Secure Your Network: Use a strong password for your Wi-Fi network and consider
using a VPN (Virtual Private Network) for added security.
Practice Questions
1. Explain the key features of Microsoft Word and Google Docs. How do they differ in
terms of collaboration and accessibility?
2. Describe the process of data transmission over the internet. What roles do TCP/IP, HTTP,
and FTP play in this process?
3. Compare and contrast the functionalities of Adobe Photoshop and Canva. Which tool
would you recommend for a beginner in graphic design and why?
4. What are the common threats to internet security? Discuss the best practices that
individuals and organizations can adopt to protect themselves online.
5. Discuss the evolution of the internet from its inception to its current state. How has the
development of internet protocols contributed to its growth and widespread use?