Algorithms (Rosen)
Orders of Growth
Rosen 6th ed., 3.1-3.3
Analysis of Algorithms
An algorithm is a finite set of precise instructions
for performing a computation or for solving a
problem.
What is the goal of analysis of algorithms?
To compare algorithms, mainly in terms of running time,
but also in terms of other factors (e.g., memory
requirements, programmer's effort, etc.)
Example: Searching
Problem of searching an ordered list.
Given a list L of n elements that are sorted into
a definite order (e.g., numeric, alphabetical),
And given a particular element x,
Determine whether x appears in the list, and if
so, return its index (position) in the list.
Overall:
If the number of elements is small (say, fewer than 20), Linear Search is faster.
If the number of elements is large, Binary Search is faster (see the sketch below).
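Both searches can be sketched in C as follows (a minimal illustration; the array contents, function names, and test value are made up for this example, and the binary search assumes the array is sorted in ascending order):

    #include <stdio.h>

    /* Linear search: scan left to right; works on any list, sorted or not.
       Returns the index of x in a[0..n-1], or -1 if x is absent. */
    int linear_search(const int a[], int n, int x) {
        for (int i = 0; i < n; i++)
            if (a[i] == x)
                return i;
        return -1;
    }

    /* Binary search: requires a[] sorted in ascending order.
       Each step halves the remaining interval, so it makes O(log n) comparisons. */
    int binary_search(const int a[], int n, int x) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   /* written this way to avoid overflow of lo + hi */
            if (a[mid] == x)
                return mid;
            else if (a[mid] < x)
                lo = mid + 1;
            else
                hi = mid - 1;
        }
        return -1;
    }

    int main(void) {
        int a[] = {2, 3, 5, 7, 11, 13, 17, 19};
        printf("%d %d\n", linear_search(a, 8, 13), binary_search(a, 8, 13));   /* prints "5 5" */
        return 0;
    }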
Example (# of statements)

Algorithm 1: assign each element with its own statement.
                                Cost
    arr[0] = 0;                 c1
    arr[1] = 0;                 c1
    arr[2] = 0;                 c1
    ...
    arr[N-1] = 0;               c1
                                ----------
                                c1 + c1 + ... + c1 = c1 x N

Algorithm 2: assign the elements in a loop (the loop test executes N+1 times, the body N times).
                                Cost
    for (i = 0; i < N; i++)     c2
        arr[i] = 0;             c1
                                ------------
                                (N+1) x c2 + N x c1 = (c2 + c1) x N + c2
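Both versions can be written as runnable C (a minimal sketch; the function names and the fixed N = 8 are illustrative):

    #include <stdio.h>
    #define N 8

    /* Algorithm 1: one assignment statement per element (written out for this small N). */
    void init_unrolled(int arr[]) {
        arr[0] = 0; arr[1] = 0; arr[2] = 0; arr[3] = 0;
        arr[4] = 0; arr[5] = 0; arr[6] = 0; arr[7] = 0;
    }

    /* Algorithm 2: the same work in a loop; the test runs N+1 times, the body N times. */
    void init_loop(int arr[]) {
        for (int i = 0; i < N; i++)
            arr[i] = 0;
    }

    int main(void) {
        int a[N], b[N];
        init_unrolled(a);
        init_loop(b);
        printf("%d %d\n", a[N - 1], b[N - 1]);   /* both arrays are now all zeros: prints "0 0" */
        return 0;
    }

Either way, every one of the N cells is written exactly once, so both running times grow linearly in N.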
Example
Suppose you are designing a web site to process user data (e.g., financial records).
Suppose program A takes fA(n) = 30n + 8 microseconds to process any n records, while program B takes fB(n) = n² + 1 microseconds to process the n records.
Which program would you choose, knowing you'll want to support millions of users?
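One way to see the answer is to tabulate both cost functions for growing n (a small C sketch; the sample values of n are arbitrary):

    #include <stdio.h>

    int main(void) {
        /* fA(n) = 30n + 8 and fB(n) = n^2 + 1, both in microseconds */
        long long ns[] = {1, 10, 30, 100, 1000, 1000000};
        for (int i = 0; i < 6; i++) {
            long long n = ns[i];
            printf("n=%-8lld  fA=%-14lld  fB=%lld\n", n, 30 * n + 8, n * n + 1);
        }
        /* Around n = 31 the quadratic fB(n) overtakes fA(n) and then grows much faster,
           so program A is the better choice when n is in the millions. */
        return 0;
    }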
On a graph, as you go to the right, the faster-growing function eventually becomes larger...
[Figure: plot of fA(n) = 30n + 8 and fB(n) = n² + 1; y-axis: value of function, x-axis: increasing n.]
Big-O Notation
We say fA(n) = 30n + 8 is order n, or O(n): it is, at most, roughly proportional to n.
fB(n) = n² + 1 is order n², or O(n²): it is, at most, roughly proportional to n².
In general, an O(n²) algorithm will be slower than an O(n) algorithm once the input is large enough.
Warning: an O(n²) function grows faster than an O(n) function.
More Examples
We say that n⁴ + 100n² + 10n + 50 is of the order of n⁴, or O(n⁴).
We say that 10n³ + 2n² is O(n³).
We say that n³ - n² is O(n³).
We say that 10 is O(1).
We say that 1273 is O(1).
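A one-line justification for the first claim (bounding each lower-order term by a multiple of n⁴; the constant 161 is just one convenient choice):

    n^4 + 100n^2 + 10n + 50 \;\le\; n^4 + 100n^4 + 10n^4 + 50n^4 \;=\; 161\,n^4 \quad\text{for all } n \ge 1,

so the whole polynomial is at most a constant multiple of n⁴, i.e., it is O(n⁴).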
Putting the two cost totals side by side:
    Algorithm 1: c1 + c1 + ... + c1 = c1 x N
    Algorithm 2: (N+1) x c2 + N x c1 = (c2 + c1) x N + c2
Both totals grow linearly in N, so both algorithms are O(N).
[Slide: per-statement cost table for a generic for-loop, with individual statement costs c1, c2, c2, c3.]
Examples
    i = 0;
    while (i < N) {
        X = X + Y;              // O(1)
        result = mystery(X);    // O(N), just an example...
        i++;                    // O(1)
    }
    // Total: N iterations, each costing O(N) because of mystery, so the fragment is O(N * N) = O(N²).
Examples (cont'd)
    if (i < j)
        for (i = 0; i < N; i++)
            X = X + i;          // this branch is O(N)
    else
        X = 0;                  // this branch is O(1)
    // Overall: O(N), since the worst case is the more expensive branch.
Asymptotic Notation
O notation: asymptotically "less than or equal to":
f(n) = O(g(n)) means that, up to a constant factor and for all sufficiently large n, f(n) ≤ g(n).
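Written out formally (this is the standard Rosen-style definition with witnesses C and k):

    f(n) = O(g(n)) \;\Longleftrightarrow\; \exists\, C > 0,\ k \ge 0 \ \text{such that}\ |f(n)| \le C\,|g(n)| \ \text{whenever } n > k.

For example, 30n + 8 ≤ 31n for all n > 8, so 30n + 8 = O(n) with witnesses C = 31 and k = 8 (this is exactly the picture in the visualization below).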
Big-O Visualization
[Figure: plot of 30n + 8 and a constant multiple of n; for n > k = 8, 30n + 8 stays below that multiple, illustrating 30n + 8 = O(n). x-axis: increasing n, y-axis: value of function.]
Order-of-Growth in Expressions
O(f) can be used as a term in an arithmetic expression.
E.g., we can write x² + x + 1 as x² + O(x), meaning x² plus some function that is O(x).
Formally, you can think of any such expression as denoting a set of functions:
x² + O(x) ≜ { g | ∃ f ∈ O(x) : g(x) = x² + f(x) }
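A concrete membership check under that set interpretation (the bound 2x is one valid choice):

    x^2 + x + 1 \;\in\; x^2 + O(x), \quad\text{since } f(x) = x + 1 \text{ satisfies } |f(x)| \le 2x \ \text{for } x \ge 1, \ \text{hence } f \in O(x).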
Rules for O
For f, g > 0 and constants a, b ∈ ℝ with b > 0:
a·f = O(f)
f + O(f) = O(f)
|f|^(1-b) = O(f)
(log_b |f|)^a = O(f)
g = O(fg)
fg ≠ O(g)  (in general)
a = O(f)
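A few of these rules instantiated with concrete functions (the particular functions are illustrative):

    3x^2 = O(x^2), \qquad x^2 + x = O(x^2), \qquad \sqrt{x} = O(x), \qquad \log_2 x = O(x), \qquad x = O(x \log x), \qquad x \log x \ne O(x).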
Big-Ω Visualization: [figure]
Big-Θ Visualization: [figure]
Rules for Θ
Mostly like the rules for O( ), except:
For f, g > 0 and constants a, b ∈ ℝ with b > 0:
a·f ∈ Θ(f).  (Same as with O.)
f ∉ Θ(fg) unless g = Θ(1).  (Unlike O.)
|f|^(1-b) ∉ Θ(f), and  (Unlike with O.)
(log_b |f|)^c ∉ Θ(f).  (Unlike with O.)
The functions in the latter two cases we say are strictly of lower order than Θ(f).
Example
Determine whether ∑_{i=1}^{n} i is Θ(n²).
Quick solution: ∑_{i=1}^{n} i = n(n+1)/2, which is at least n·n/2 = n²/2 and at most n·n = n² (for n ≥ 1),
so ∑_{i=1}^{n} i ∈ Θ(n²).
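The same computation written out in full (1/2 and 1 serve as the lower and upper bounding constants):

    \sum_{i=1}^{n} i \;=\; \frac{n(n+1)}{2}, \qquad \frac{n^2}{2} \;\le\; \frac{n(n+1)}{2} \;\le\; n^2 \ \ (n \ge 1), \qquad\text{hence}\quad \sum_{i=1}^{n} i \in \Theta(n^2).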
[Slide: related asymptotic notations, including o(f) ("little-o").]
f is o(g) (that is, f is strictly of lower order than g) if and only if lim_{x→∞} f(x)/g(x) = 0.
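Two quick consequences of this limit definition:

    x = o(x^2) \ \text{since} \ \lim_{x \to \infty} \frac{x}{x^2} = 0, \qquad\qquad 30x + 8 \ne o(x) \ \text{since} \ \lim_{x \to \infty} \frac{30x + 8}{x} = 30 \ne 0.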
Algorithmic Complexity
The algorithmic complexity of a computation is some measure of how difficult it is to perform that computation.
It measures some aspect of the cost of the computation (in a general sense of "cost", such as time or memory).
Problem Complexity
The complexity of a computational problem
or task is the complexity of the algorithm
with the lowest order of growth of
complexity for solving that problem or
performing that task.
E.g., the problem of searching an ordered list has at most logarithmic time complexity (its complexity is O(log n)).
Unsolvable problems
It can be shown that there exist problems that no algorithm can solve.
Turing discovered in the 1930s that there are
problems unsolvable by any algorithm.
Example: the halting problem (see page 176)
Given an arbitrary algorithm and its input, will that
algorithm eventually halt, or will it continue forever in
an infinite loop?
NP and NP-complete
NP is the set of problems for which there exists a
tractable algorithm for checking solutions to see if
they are correct.
NP-complete is a class of problems with the
property that if any one of them can be solved by a
polynomial worst-case algorithm, then all of them
can be solved by polynomial worst-case
algorithms.
Satisfiability problem: find an assignment of truth
values that makes a compound proposition true.
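To see why satisfiability is in NP: given a proposed truth assignment, checking whether it makes a CNF formula true takes only polynomial time. A minimal C sketch (the clause encoding, function names, and example formula are illustrative, not from the text):

    #include <stdio.h>
    #include <stdlib.h>

    /* A clause is a disjunction of literals: literal +v means variable v,
       literal -v means (not v); variables are numbered starting at 1. */
    typedef struct { const int *lits; int len; } Clause;

    /* Returns 1 if the assignment satisfies every clause, 0 otherwise.
       assign[v] is 1 (true) or 0 (false). Runs in time linear in the number of literals. */
    int satisfies(const Clause *clauses, int num_clauses, const int *assign) {
        for (int c = 0; c < num_clauses; c++) {
            int clause_true = 0;
            for (int j = 0; j < clauses[c].len; j++) {
                int lit = clauses[c].lits[j];
                int val = assign[abs(lit)];
                if ((lit > 0 && val) || (lit < 0 && !val)) { clause_true = 1; break; }
            }
            if (!clause_true) return 0;   /* one falsified clause falsifies the whole formula */
        }
        return 1;
    }

    int main(void) {
        /* Example formula: (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3) */
        const int c1[] = {1, -2}, c2[] = {2, 3}, c3[] = {-1, -3};
        Clause f[] = {{c1, 2}, {c2, 2}, {c3, 2}};
        int assign[] = {0, 1, 1, 0};   /* index 0 unused; x1 = true, x2 = true, x3 = false */
        printf("%s\n", satisfies(f, 3, assign) ? "satisfied" : "not satisfied");
        return 0;
    }

Finding a satisfying assignment, by contrast, has no known polynomial-time algorithm; checking a proposed one is easy, which is what puts satisfiability in NP.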
P vs. NP
We know P ⊆ NP, but the most famous unproven conjecture in computer science is that this inclusion is proper (i.e., that P ≠ NP rather than P = NP).
It is generally accepted that no NP-complete problem can be solved in polynomial time.
Whoever first proves it will be famous!
Questions
Find the best big-O notation to describe the complexity of the following algorithms:
A binary search of n elements
A linear search to find the smallest number in a list of n numbers
An algorithm that lists all ways to put the numbers 1, 2, 3, ..., n in a row
Questions (cont'd)
The worst-case analysis of a linear search of a list of size n
The number of print statements in the following:
    while (n > 1) {
        print "hello"
        n = n / 2
    }
Questions (cont'd)
The number of print statements in the following:
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j++)
            print "hello"
The number of print statements in the following:
    for (i = 1; i <= n; i++)
        for (j = 1; j <= i; j++)
            print "hello"