Unit 5
Algorithms
3. By providing specific rules for comparisons, the adversary ensures that the algorithm follows the most time-consuming path.

4. Adversary arguments are used in algorithm analysis to establish lower bounds by showing that no algorithm can perform better than a certain level of difficulty set by the adversary.

Problem Reduction:
• For example, the Euclidean minimum spanning tree problem can be reduced to the element uniqueness problem, establishing a lower bound of Ω(n log n).

By employing these different methods, we can establish lower bounds for various problems, providing insights into the inherent complexity of algorithmic tasks.

Prepared by M.V.Bhuvaneswari, Asst.Prof, CSE (AI&ML,DS) Dept
Decision Trees

Decision Trees Overview:
1. Decision trees are a visual representation of how an algorithm makes decisions based on comparisons of input elements.
• The height of the decision tree corresponds to the maximum number of comparisons needed to reach a final state.
• The number of comparisons made by the algorithm in the worst case is equal to the height of the decision tree.
2. The height h of a binary tree with l leaves is at least ⌈log2 l⌉, as determined by the inequality h ≥ ⌈log2 l⌉, which provides a lower bound on the height (or depth) of binary decision trees.
3. This inequality implies that the height of a decision tree must be at least log2 of the number of its leaves.
4. In other words, it provides a benchmark for assessing the performance of comparison-based algorithms: they cannot be more efficient than this lower bound.
Application to Sorting and Searching
Algorithms:
1. Decision trees can be used to analyze
the performance of sorting and
searching algorithms by considering
the number of comparisons they
make.
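As a concrete illustration, the decision-tree lower bound for comparison-based sorting can be computed directly: a decision tree for sorting n distinct keys must have at least n! leaves, so its height is at least ⌈log2 n!⌉ comparisons in the worst case. A minimal sketch in Python:

```python
import math

def sort_comparison_lower_bound(n):
    """Worst-case comparisons any comparison sort must make:
    the decision tree has at least n! leaves, so its height
    is at least ceil(log2(n!))."""
    return math.ceil(math.log2(math.factorial(n)))

for n in (4, 10):
    print(n, sort_comparison_lower_bound(n))  # 4 -> 5, 10 -> 22
```

Note that ⌈log2 n!⌉ grows as Θ(n log n), matching the lower bound discussed above.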
[Diagram: relationship among the complexity classes P, NP-Complete, and NP-Hard]
2. Truncation Errors: When we approximate an infinite process with a finite one, errors occur due to truncation.
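A small illustrative sketch of truncation error, using a hypothetical example (not from the text): approximating the infinite series e = Σ 1/k! by only its first n terms leaves a truncation error equal to the discarded tail, which shrinks as n grows.

```python
import math

def e_approx(n):
    """Approximate e = sum_{k>=0} 1/k! using only the first n terms;
    the discarded tail of the series is the truncation error."""
    return sum(1 / math.factorial(k) for k in range(n))

for n in (5, 10):
    print(n, abs(math.e - e_approx(n)))  # error shrinks as n grows
```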
Assignment problem instance (branch-and-bound example): cost of assigning each person A–D to one of jobs 1–4; the root of the state-space tree branches on a = 1, a = 2, a = 3, a = 4 (the job given to person A).

          Job 1   Job 2   Job 3   Job 4
    A       9       2       7       8
    B       6       4       3       7
    C       5       8       1       8
    D       7       6       2       4
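Reading the table above as a cost matrix for the assignment problem, the optimal cost that branch-and-bound must find can be verified by exhaustive search over all 4! assignments; a minimal sketch (the function name is illustrative):

```python
from itertools import permutations

# cost[i][j] = cost of assigning person (A, B, C, D)[i] to job j+1,
# taken from the table above.
cost = [
    [9, 2, 7, 8],   # A
    [6, 4, 3, 7],   # B
    [5, 8, 1, 8],   # C
    [7, 6, 2, 4],   # D
]

def min_assignment_cost(cost):
    """Brute-force check: try every one-to-one person-to-job assignment."""
    n = len(cost)
    return min(sum(cost[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

print(min_assignment_cost(cost))  # optimal total cost: 13
```

The optimum here is 13 (A→2, B→1, C→3, D→4); branch-and-bound reaches the same answer while pruning most of the 4! = 24 assignments.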
• The above picture displays the state-space tree of the best-first branch-and-bound algorithm for this instance of the knapsack problem.

1. At the root, no items have been selected as yet. Hence, both the total weight of the items already selected, w, and their total value, v, are equal to 0. The value of the upper bound computed by formula (2) is $100.

2. Node 1 represents the subsets that include item 1. The total weight and value of the items already included are 4 and $40, respectively; the value of the upper bound is 40 + (10 − 4) ∗ 6 = $76.

3. Node 2 represents the subsets that do not include item 1.

4. Accordingly, w = 0, v = $0, and ub = 0 + (10 − 0) ∗ 6 = $60. Since node 1 has a larger upper bound than node 2, it is more promising for this maximization problem, and we branch from node 1 first.

5. Its children, nodes 3 and 4, represent subsets with item 1 and with and without item 2, respectively.

6. Since the total weight w of every subset represented by node 3 exceeds the knapsack's capacity, node 3 can be terminated immediately.
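The best-first walk described above can be sketched in code. The upper-bound formula is assumed to be ub = v + (W − w) × (value-to-weight ratio of the next item), which reproduces the $100, $76, and $60 bounds quoted in the text; the item list and function names are taken from this instance.

```python
import heapq

# Items ordered by value-to-weight ratio (ratios 10, 6, 5, 4), capacity 10.
items = [(4, 40), (7, 42), (5, 25), (3, 12)]  # (weight, value)
W = 10

def upper_bound(i, w, v):
    """Assumed form of formula (2): current value plus the remaining
    capacity filled at the next item's value-to-weight ratio."""
    if i < len(items):
        wi, vi = items[i]
        return v + (W - w) * (vi / wi)
    return v

def best_first_knapsack():
    best_v = 0
    # Max-heap via negated upper bound: (-ub, next item index, w, v).
    heap = [(-upper_bound(0, 0, 0), 0, 0, 0)]
    while heap:
        neg_ub, i, w, v = heapq.heappop(heap)
        if -neg_ub <= best_v or i == len(items):
            continue                      # prune: bound no better than best
        wi, vi = items[i]
        if w + wi <= W:                   # child that includes item i
            best_v = max(best_v, v + vi)
            heapq.heappush(heap, (-upper_bound(i + 1, w + wi, v + vi),
                                  i + 1, w + wi, v + vi))
        # child that excludes item i
        heapq.heappush(heap, (-upper_bound(i + 1, w, v), i + 1, w, v))
    return best_v

print(best_first_knapsack())  # optimal value for this instance: 65
```

The root bound is 0 + 10 × 10 = $100, the with-item-1 child gets 40 + 6 × 6 = $76, and the without-item-1 child gets 0 + 10 × 6 = $60, matching the trace in the text.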
Approximation Algorithms for NP-Hard Problems

Traveling Salesman Problem (TSP)

Problem Statement: Given a set of cities and the distances between each pair of cities, find the shortest possible route that visits each city exactly once and returns to the origin city.

Approximating Solutions:
• Fast Algorithms: Instead of exact solutions, we use fast algorithms to get approximate solutions.
• Good Enough Solutions: In many practical applications, an approximate solution is sufficient.

Heuristics:
• Definition: A heuristic is a common-sense rule or strategy derived from experience.

Greedy Approach:
1. Nearest Neighbour Algorithm

Advantages:
• Simplicity: Easy to understand and implement.
• Speed: Runs in O(n²) time, where n is the number of cities.

Limitations:
• Suboptimal: May not find the shortest possible tour.
• Greedy Nature: Locally optimal choices may lead to a globally suboptimal solution.
[Figure: complete graph on cities A, B, C, D with edge lengths A–B = 1, B–C = 2, C–D = 1, D–A = 6, A–C = 3, B–D = 3]

• Sa: Approximation solution (nearest neighbour starting at A): A-B-C-D-A = 1 + 2 + 1 + 6 = length 10

• S*: Optimal solution: A-B-D-C-A = 1 + 3 + 1 + 3 = length 8
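The nearest-neighbour tour above can be reproduced with a short sketch; the edge lengths are taken from the two tours in the example.

```python
# Nearest-neighbour heuristic on the 4-city example above.
dist = {
    ('A', 'B'): 1, ('B', 'C'): 2, ('C', 'D'): 1,
    ('D', 'A'): 6, ('A', 'C'): 3, ('B', 'D'): 3,
}

def d(u, v):
    """Edge length, looked up in either direction."""
    return dist.get((u, v)) or dist[(v, u)]

def nearest_neighbour(cities, start):
    """Repeatedly move to the closest unvisited city, then return home."""
    tour, unvisited = [start], set(cities) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda c: d(tour[-1], c))
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(start)  # return to the origin city
    length = sum(d(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
    return tour, length

tour, length = nearest_neighbour('ABCD', 'A')
print(tour, length)  # ['A', 'B', 'C', 'D', 'A'] 10
```

Starting at A, the greedy choice of the cheap A-B and B-C edges forces the expensive D-A edge at the end, which is exactly how the heuristic ends up with length 10 instead of the optimal 8.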
Discrete vs. Continuous Knapsack (greedy, capacity W = 10):

    Item   Weight   Value   Value/Weight
     1       7      $42          6
     2       3      $12          4
     3       4      $40         10
     4       5      $25          5

Sorted by value-to-weight ratio, the greedy algorithm considers the items in the order 3, 1, 4, 2.

Continuous (fractional) knapsack: take all of item 3 ($40), then the 6/7 fraction of item 1 that fits (6/7 × $42 = $36).
Total weight = 10
Profit earned = $40 + $36 = $76
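The continuous-knapsack computation above can be sketched as follows; `fractional_knapsack` is an illustrative helper, not a library function.

```python
# Greedy algorithm for the continuous (fractional) knapsack,
# on the instance from the table above (capacity W = 10).
items = {1: (7, 42), 2: (3, 12), 3: (4, 40), 4: (5, 25)}  # item: (weight, value)
W = 10

def fractional_knapsack(items, W):
    # Consider items by value-to-weight ratio, best first.
    order = sorted(items.values(), key=lambda wv: wv[1] / wv[0], reverse=True)
    profit, capacity = 0.0, W
    for weight, value in order:
        if capacity == 0:
            break
        take = min(weight, capacity)   # whole item, or the fraction that fits
        profit += value * take / weight
        capacity -= take
    return profit

print(fractional_knapsack(items, W))  # 76.0 = $40 + $36
```

For the discrete knapsack the same greedy order is only a heuristic: taking fractions is not allowed, so the greedy result can fall short of the optimum.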
Approximation Algorithms for Nonlinear Equations
(methods: Bisection Method, Newton's Method, False Position Method)

Bisection Method:
1. Initial Interval: Start with an interval [a,b] where f(a) and f(b) have opposite signs.

2. Midpoint Calculation: Compute the midpoint of the interval: c = (a + b)/2.

3. Function Evaluation: Evaluate the function at the midpoint, f(c).

4. Interval Update:
• If f(a)⋅f(c) < 0, then the root lies in the interval [a,c]. Set b = c.
• If f(b)⋅f(c) < 0, then the root lies in the interval [c,b]. Set a = c.

5. Convergence Check: Repeat steps 2–4 until the interval [a,b] is sufficiently small, or until |f(c)| is below a predefined tolerance level.
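Steps 1–5 translate directly into code; a minimal sketch, using x² − 2 on [1, 2] as a hypothetical test function (its root is √2).

```python
def bisection(f, a, b, tol=1e-9, max_iter=100):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    if f(a) * f(b) >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")  # step 1
    for _ in range(max_iter):
        c = (a + b) / 2                              # step 2: midpoint
        fc = f(c)                                    # step 3: evaluate f(c)
        if abs(fc) < tol or (b - a) / 2 < tol:       # step 5: convergence check
            return c
        if f(a) * fc < 0:                            # step 4: root in [a, c]
            b = c
        else:                                        # step 4: root in [c, b]
            a = c
    return (a + b) / 2

root = bisection(lambda x: x * x - 2, 1.0, 2.0)
print(root)  # approximates sqrt(2)
```

Each iteration halves the interval, so after k iterations the root is bracketed to within (b − a)/2^k; this guaranteed but linear convergence is what the faster Newton's and false-position methods improve on.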