Topic 1: Introduction to Algorithms & Complexity
INTRODUCTION
WHAT IS AN ALGORITHM?
FUNDAMENTALS OF ALGORITHMIC PROBLEM SOLVING
   I. Understanding the Problem
   II. Ascertaining the Capabilities of the Computational Device
   III. Choosing between Exact and Approximate Problem Solving
   IV. Algorithm Design Techniques
   V. Designing an Algorithm and Data Structures
   VI. Methods of Specifying an Algorithm
   VII. Proving an Algorithm’s Correctness
   VIII. Analyzing an Algorithm
   IX. Coding an Algorithm
INTRODUCTION
Algorithmics is more than a branch of computer science. It
is the core of computer science, and, in all fairness, can be
said to be relevant to most of science, business, and
technology. [Har92, p. 6]
WHAT IS AN ALGORITHM?
• An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time.
I. UNDERSTANDING THE PROBLEM
• From a practical perspective, the first thing you need to do before designing an algorithm is to
understand completely the problem given. Read the problem’s description carefully and ask
questions if you have any doubts about the problem, do a few small examples by hand, think
about special cases, and ask questions again if needed.
• An input to an algorithm specifies an instance of the problem the algorithm solves. It is very
important to specify exactly the set of instances the algorithm needs to handle.
• Your algorithm may work correctly for a majority of inputs but crash on some “boundary” value.
Remember that a correct algorithm is not one that works most of the time, but one that works
correctly for all legitimate inputs (see the sketch below).
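As a concrete illustration of the boundary-value point, here is a minimal Python sketch (not taken from the slides; the function name and test values are hypothetical) showing why boundary instances deserve explicit attention:

def mean(values):
    # Boundary case: an empty sequence would otherwise raise ZeroDivisionError,
    # so the precondition is made explicit instead of letting the code crash.
    if not values:
        raise ValueError("mean() requires at least one value")
    return sum(values) / len(values)

# Typical inputs work with or without the guard...
assert mean([2, 4, 6]) == 4.0
# ...but the boundary instance (an empty input) is what separates an algorithm
# that works "most of the time" from one that is correct for all legitimate inputs.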
II. ASCERTAINING THE CAPABILITIES
OF THE COMPUTATIONAL DEVICE
• Once you completely understand a problem, you need to ascertain the capabilities of the
computational device the algorithm is intended for.
• The vast majority of algorithms in use today are still destined to be programmed for a computer
closely resembling the von Neumann machine, a computer architecture outlined by the prominent
Hungarian-American mathematician John von Neumann (1903–1957), in collaboration with
A. Burks and H. Goldstine, in 1946.
• Sequential algorithms are designed for von Neumann machines.
• The central assumption of the random-access machine (RAM) model does not hold for some newer
computers that can execute operations concurrently, i.e., in parallel. Algorithms that take
advantage of this capability are called parallel algorithms.
• If you are designing an algorithm as a practical tool, whether you need to worry about the speed
and memory of the target machine depends on the problem you need to solve. Even the “slow”
computers of today are almost unimaginably fast.
• Consequently, in many situations you need not worry about a computer being too slow for the
task.
• There are important problems, however, that are very complex by their nature, or have to
process huge volumes of data, or deal with applications where the time is critical.
• In such situations, it is imperative to be aware of the speed and memory available on a
particular computer system.
III. CHOOSING BETWEEN EXACT AND
APPROXIMATE PROBLEM SOLVING
• Why would one opt for an approximation algorithm?
– there are important problems that simply cannot be solved exactly for most of their instances
(e.g., extracting square roots; see the sketch after this list);
– available algorithms for solving a problem exactly can be unacceptably slow because of the
problem’s intrinsic complexity;
– an approximation algorithm can be a part of a more sophisticated algorithm that solves a problem
exactly.
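To make the first reason concrete, consider computing square roots: √2 cannot be represented exactly, so in practice it is approximated. Below is a minimal sketch (added for illustration, not from the slides) of Newton’s method, a classic approximation algorithm; the tolerance value is an arbitrary choice.

def newton_sqrt(a, tolerance=1e-12):
    # Approximate the square root of a non-negative number a using
    # Newton's iteration: x_{k+1} = (x_k + a / x_k) / 2.
    if a < 0:
        raise ValueError("a must be non-negative")
    if a == 0:
        return 0.0
    x = a if a >= 1 else 1.0          # any positive starting guess works
    while abs(x * x - a) > tolerance:  # stop once the error estimate is small enough
        x = (x + a / x) / 2
    return x

print(newton_sqrt(2))   # ~1.4142135623730951; the exact value is irrational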
IV. ALGORITHM DESIGN TECHNIQUES
Every science is interested in classifying its principal subject, and computer science is no exception.
Algorithm design techniques make it possible to classify algorithms according to an underlying design
idea; therefore, they can serve as a natural way to both categorize and study algorithms.
V. DESIGNING AN ALGORITHM AND
DATA STRUCTURES
• One should pay close attention to choosing data structures appropriate for the operations
performed by the algorithm.
• Many years ago, an influential textbook proclaimed the fundamental importance of both
algorithms and data structures for computer programming by its very title:
Algorithms + Data Structures = Programs [Wir76].
In the new world of object-oriented programming, data structures remain crucially important for
both design and analysis of algorithms.
VI. METHODS OF SPECIFYING AN
ALGORITHM
• Using a natural language has an obvious appeal; however,
the inherent ambiguity of any natural language makes a
succinct and clear description of algorithms surprisingly
difficult. Nevertheless, being able to do this is an important
skill that you should strive to develop in the process of
learning algorithms.
• Pseudocode is a mixture of a natural language and
programming-language-like constructs.
• Pseudocode is usually more precise than natural
language, and its usage often yields more succinct
algorithm descriptions (see the example at the end of this list).
• In the earlier days of computing, the dominant
vehicle for specifying algorithms was a flowchart, a
method of expressing an algorithm by a collection of
connected geometric shapes containing descriptions
of the algorithm’s steps. This representation
technique has proved to be inconvenient for all but
very simple algorithms.
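To illustrate the contrast between these specification methods, here is Euclid’s algorithm for the greatest common divisor, a standard textbook example (not shown on the original slides), written as runnable Python with comments that mirror a typical pseudocode description:

def gcd(m, n):
    # Pseudocode:
    #   while n != 0 do
    #       r <- m mod n
    #       m <- n
    #       n <- r
    #   return m
    while n != 0:
        m, n = n, m % n
    return m

print(gcd(60, 24))   # prints 12

The pseudocode in the comments is both shorter and more precise than a prose description such as “repeatedly replace the larger number by the remainder of dividing it by the smaller one until the remainder is zero.”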
VII. PROVING AN ALGORITHM’S
CORRECTNESS
• The notion of correctness for approximation
algorithms is less straightforward than it is for exact
algorithms.
• For an approximation algorithm, we usually would
like to be able to show that the error produced by the
algorithm does not exceed a predefined limit.
VIII. ANALYZING AN ALGORITHM
• DATA STRUCTURES
• TECHNIQUES
• HARD PROBLEMS
• PARALLELISM
ALGORITHMS AS A TECHNOLOGY
• EFFICIENCY
– Different algorithms devised to solve the same problem
often differ dramatically in their efficiency. These
differences can be much more significant than
differences due to hardware and software (see the timing sketch below).
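A small, hypothetical timing experiment (added here, not part of the original slides) makes the point: two algorithms for the same problem, computing the n-th Fibonacci number, differ so much that no realistic hardware upgrade could compensate for choosing the slower one.

import time

def fib_recursive(n):
    # Exponential time: the same subproblems are recomputed over and over.
    return n if n < 2 else fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    # Linear time: each Fibonacci number is computed exactly once.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

for func in (fib_recursive, fib_iterative):
    start = time.perf_counter()
    result = func(32)
    elapsed = time.perf_counter() - start
    print(f"{func.__name__}(32) = {result}  ({elapsed:.6f} s)")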
CRITERIA OF EFFICIENCY
(COMPUTING)
• TIME COMPLEXITY
• SPACE COMPLEXITY
Time complexity ≠ Space complexity ≠ Complexity of algorithm
HOW CAN WE MEASURE COMPLEXITY?
• EMPIRICAL ANALYSIS (BENCHMARK)
• THEORETICAL ANALYSIS (ASYMPTOTIC ANALYSIS)
BENCHMARK (EMPIRICAL ANALYSIS)
BENCHMARK: VERSION #1
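The “Version #1” slide presumably contained timing code that did not survive extraction; the following is a minimal stand-in sketch of empirical analysis, timing a simple O(n) loop on inputs of growing size with time.perf_counter (the function, names, and sizes are illustrative assumptions, not the original code):

import time

def total(values):
    # A simple linear-time loop whose running time we want to measure.
    s = 0
    for v in values:
        s += v
    return s

# Empirical analysis: run the same code on larger and larger inputs and time it.
for n in (10_000, 100_000, 1_000_000):
    data = list(range(n))
    start = time.perf_counter()
    total(data)
    elapsed = time.perf_counter() - start
    print(f"n = {n:>9}: {elapsed:.6f} s")

The measured times depend on the machine, the interpreter, and the system load, which is exactly the limitation that motivates the machine-independent asymptotic analysis discussed next.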
• An asymptote, by definition, is a line that approaches a curve but never touches it; a curve and
a line that keep getting closer without ever intersecting are asymptotic to each other.
• Asymptotic analysis studies the behavior of an algorithm’s running time as the input size grows
without bound, i.e., as n → ∞.
• LINEAR SEARCH
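The original slide presumably showed an implementation; a minimal Python sketch of linear (sequential) search is given below as a stand-in:

def linear_search(items, key):
    # Scan the items left to right; return the index of the first match, or -1.
    for i, value in enumerate(items):
        if value == key:
            return i
    return -1

assert linear_search([9, 5, 2, 7], 2) == 2    # found at index 2
assert linear_search([9, 5, 2, 7], 8) == -1   # not present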
BEST, WORST, AND AVERAGE-CASE COMPLEXITY
• LINEAR SEARCH
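For linear search on an array of n elements, the three cases can be made precise. The derivation below is the standard textbook one, under the usual assumptions that a present key is equally likely to occupy any of the n positions and that p is the probability the key is present at all:

C_best(n) = 1                                   (the key is in the first position)
C_worst(n) = n                                  (the key is in the last position, or absent)
C_avg(n) = (1 + 2 + ... + n)⋅(p/n) + n⋅(1 − p) = p⋅(n + 1)/2 + n⋅(1 − p)

With p = 1 (a successful search) the average is (n + 1)/2 comparisons; with p = 0 (an unsuccessful search) it is n.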
HOW CAN WE COMPARE TWO FUNCTIONS?
• Use Asymptotic Notation
O-NOTATION
(ASYMPTOTIC UPPER BOUND)
O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c⋅g(n) for all n ≥ n0}
T(n) ∈ O(g(n))
or
T(n) = O(g(n))
Ω-NOTATION
(ASYMPTOTIC LOWER BOUND)
Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ c⋅g(n) ≤ f(n) for all n ≥ n0}
T(n) ∈ Ω(g(n))
or
T(n) = Ω(g(n))
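A short worked example (added for illustration) of applying both definitions to T(n) = 3n² + 10n:

3n² + 10n ≤ 3n² + 10n² = 13⋅n² for all n ≥ 1, so 3n² + 10n ∈ O(n²) with c = 13 and n0 = 1;
3n² + 10n ≥ 3⋅n² for all n ≥ 1, so 3n² + 10n ∈ Ω(n²) with c = 3 and n0 = 1.

Since the same g(n) = n² works as both an upper and a lower bound, the estimate is tight.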
TYPES OF ORDER
THE BIG OH COMPLEXITY FOR DIFFERENT FUNCTIONS
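The table or plot that originally appeared on this slide did not survive extraction; as a stand-in, here is a small Python sketch (functions and input sizes chosen purely for illustration) that prints how the common growth orders compare as n increases:

import math

growth_orders = [
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("2^n",     lambda n: 2 ** n),
]

for n in (10, 20, 40):
    values = ", ".join(f"{name} = {f(n):,.0f}" for name, f in growth_orders)
    print(f"n = {n:>2}: {values}")

Even at n = 40 the exponential entry dwarfs the polynomial ones, which is why the order of growth, rather than constant factors, dominates the comparison.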
OTHER ASYMPTOTIC NOTATIONS
• little-o: o(g(n)) = {f(n): for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c⋅g(n) for all n ≥ n0} (an upper bound that is not asymptotically tight)
• little-omega: ω(g(n)) = {f(n): for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c⋅g(n) < f(n) for all n ≥ n0} (a lower bound that is not asymptotically tight)
Note: Big-O is commonly used where Θ is meant, i.e., when a tight estimate is implied
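Two standard examples (added for illustration) show how the strict, “for any constant c” inequalities separate the non-tight bounds from O and Ω:

• 2n ∈ o(n²), because for every c > 0 we have 2n < c⋅n² as soon as n > 2/c;
• 2n² ∉ o(n²), because for c = 1 the inequality 2n² < n² fails for every n ≥ 1;
• symmetrically, n² ∈ ω(n), but n² ∉ ω(n²).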
• “Total system performance depends on
choosing efficient algorithms as much as on
choosing fast hardware. Just as rapid advances
are being made in other computer technologies,
they are being made in algorithms as well.”
EXERCISE
• Give an example of an application that requires algorithmic content at the application level, and
discuss the function of the algorithms involved.
SOURCES
• https://github.com/Xxa/talks
• Introduction to the Design and Analysis of Algorithms by Anany Levitin
• Introduction to Algorithms, 3rd edition, by Thomas H. Cormen et al.
• [Har92] Algorithmics: The Spirit of Computing, 2nd edition, by David Harel
• [Wir76] Algorithms + Data Structures = Programs by Niklaus Wirth