UNIT-IV
GENERAL METHOD
The greedy method is the most straightforward design technique. Most of the problems it is applied to have n
inputs and require us to obtain a subset that satisfies some constraints. Any subset that
satisfies these constraints is called a feasible solution. We need to find a feasible solution
that either maximizes or minimizes the objective function. A feasible solution that does this
is called an optimal solution.
The greedy method is a simple strategy of progressively building up a solution, one
element at a time, by choosing the best possible element at each stage. At each stage, a
decision is made regarding whether or not a particular input is in an optimal solution. This is
done by considering the inputs in an order determined by some selection procedure. If the
inclusion of the next input into the partially constructed optimal solution results in an
infeasible solution, then this input is not added to the partial solution. The selection
procedure itself is based on some optimization measure. Several optimization measures are
plausible for a given problem. Most of them, however, will result in algorithms that
generate sub-optimal solutions. This version of the greedy technique is called the subset paradigm.
Some problems, like knapsack, job sequencing with deadlines and minimum cost spanning
trees, are based on the subset paradigm.
For the problems that make decisions by considering the inputs in some order, each
decision is made using an optimization criterion that can be computed using decisions
already made. This version of the greedy method is called the ordering paradigm. Some problems,
like optimal storage on tapes, optimal merge patterns and single source shortest paths, are based
on the ordering paradigm.
CONTROL ABSTRACTION
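The control abstraction for the subset paradigm can be sketched as follows. This is a minimal Python rendering; passing the selection and feasibility tests in as functions (select, feasible) is an illustrative choice made only for this sketch.

from typing import Callable, List, TypeVar

T = TypeVar("T")

def greedy(inputs: List[T],
           select: Callable[[List[T]], T],
           feasible: Callable[[List[T], T], bool]) -> List[T]:
    # Generic greedy control abstraction (subset paradigm):
    # repeatedly pick the "best" remaining input according to the
    # optimization measure, and keep it only if the partial solution
    # remains feasible.
    remaining = list(inputs)
    solution: List[T] = []
    while remaining:
        x = select(remaining)       # choice driven by the optimization measure
        remaining.remove(x)
        if feasible(solution, x):   # does adding x keep the solution feasible?
            solution.append(x)
    return solution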
KNAPSACK PROBLEM
Let us apply the greedy method to solve the knapsack problem. We are given ‘n’
objects and a knapsack. Object i has a weight wi and the knapsack has a capacity m.
If a fraction xi, 0 ≤ xi ≤ 1, of object i is placed into the knapsack, then a profit of pi xi is
earned. The objective is to fill the knapsack so that the total profit earned is maximized.
Since the knapsack capacity is ‘m’, we require the total weight of all chosen objects to be at
most ‘m’. The problem is stated as:
Maximize ∑ pi xi, 1 ≤ i ≤ n
subject to ∑ wi xi ≤ m, 1 ≤ i ≤ n
and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n.
Running time:
The objects are to be sorted into non-increasing order of the pi / wi ratio. If we disregard
the time to initially sort the objects, the rest of the algorithm requires only O(n) time.
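A minimal Python sketch of this greedy procedure is given below; the function name greedy_knapsack and the argument layout are chosen only for illustration. Objects are taken in non-increasing pi / wi order, whole objects first, with at most one fractional object at the end.

def greedy_knapsack(p, w, m):
    # p[i], w[i]: profit and weight of object i; m: knapsack capacity.
    n = len(p)
    # Consider objects in non-increasing order of profit/weight ratio.
    order = sorted(range(n), key=lambda i: p[i] / w[i], reverse=True)
    x = [0.0] * n            # x[i] = fraction of object i placed in the knapsack
    remaining = m
    for i in order:
        if w[i] <= remaining:        # the whole object still fits
            x[i] = 1.0
            remaining -= w[i]
        else:                        # take only the fraction that fits, then stop
            x[i] = remaining / w[i]
            break
    return x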
Example:
Consider the following instance of the knapsack problem: n = 3, m = 20, (p1, p2, p3) =
(25, 24, 15) and (w1, w2, w3) = (18, 15, 10).
1. First, we try to fill the knapsack by selecting the objects in some order:
x1     x2     x3     ∑ wi xi                                     ∑ pi xi
1/2    1/3    1/4    18 × 1/2 + 15 × 1/3 + 10 × 1/4 = 16.5       25 × 1/2 + 24 × 1/3 + 15 × 1/4 = 24.25
2. Select the object with the maximum profit first (p = 25). So, x1 = 1 and profit earned is
25. Now, only 2 units of space are left, so select the object with the next largest profit (p = 24).
So, x2 =2/15
x1     x2     x3     ∑ wi xi                       ∑ pi xi
1      2/15   0      18 × 1 + 15 × 2/15 = 20       25 × 1 + 24 × 2/15 = 28.2
3. Consider the objects in order of non-decreasing weights wi.
x1     x2     x3     ∑ wi xi                       ∑ pi xi
0      2/3    1      15 × 2/3 + 10 × 1 = 20        24 × 2/3 + 15 × 1 = 31
4. Sort the objects in non-increasing order of the ratio pi / wi. Select the object
with the maximum pi / wi ratio; so x2 = 1 and the profit earned is 24. Now, only 5 units of
space are left, so select the object with the next largest pi / wi ratio; then x3 = 1/2 and the profit
earned is 7.5.
x1     x2     x3     ∑ wi xi                       ∑ pi xi
0      1      1/2    15 × 1 + 10 × 1/2 = 20        24 × 1 + 15 × 1/2 = 31.5
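Strategy 4, greedy by the pi / wi ratio, gives the largest total profit, 31.5, which is the optimal filling for this instance. As a quick check, the short standalone Python snippet below (variable names chosen only for this illustration) recomputes that allocation for the given data.

# Instance from the example: n = 3, m = 20.
p = [25, 24, 15]
w = [18, 15, 10]
m = 20

order = sorted(range(3), key=lambda i: p[i] / w[i], reverse=True)   # -> [1, 2, 0]
x = [0.0, 0.0, 0.0]
remaining = m
for i in order:
    take = min(1.0, remaining / w[i])   # fraction of object i that still fits
    x[i] = take
    remaining -= take * w[i]

print(x)                                    # [0.0, 1.0, 0.5]
print(sum(p[i] * x[i] for i in range(3)))   # 31.5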
JOB SEQUENCING WITH DEADLINES

We are given n jobs, each with a profit pi and a deadline di; every job takes one unit of time and only one job can be processed at a time. A feasible solution is a subset of jobs that can all be completed by their deadlines, and its value is the sum of their profits.

Example:
Let n = 4, (p1, p2, p3, p4) = (100, 10, 15, 27) and (d1, d2, d3, d4) = (2, 1, 2, 1). The
feasible solutions and their values are:
     Feasible solution    Processing sequence    Value
1    1, 2                 2, 1                   110
2    1, 3                 1, 3 or 3, 1           115
3    1, 4                 4, 1                   127   OPTIMAL
4    2, 3                 2, 3                   25
5    3, 4                 4, 3                   42
6    1                    1                      100
7    2                    2                      10
8    3                    3                      15
9    4                    4                      27
Algorithm:
The algorithm constructs an optimal set J of jobs that can be processed by their deadlines.
Algorithm GreedyJob(d, J, n)
// J is a set of jobs that can be completed by their deadlines.
{
    J := {1};
    for i := 2 to n do
    {
        if (all jobs in J U {i} can be completed by their deadlines) then
            J := J U {i};
    }
}
The greedy algorithm is used to obtain an optimal solution.
We must formulate an optimization measure to determine how the next job is chosen.
Algorithm JS(d, j, n)
// d[] holds the deadlines, j[] the jobs in the solution, n the total number of jobs.
// d[i] ≥ 1, 1 ≤ i ≤ n, are the deadlines.
// The jobs are ordered such that p[1] ≥ p[2] ≥ ... ≥ p[n].
// j[i] is the ith job in the optimal solution, 1 ≤ i ≤ k.
{
    d[0] := j[0] := 0;   // sentinel
    j[1] := 1;           // include job 1
    k := 1;
    for i := 2 to n do
    {
        // Consider jobs in non-increasing order of p[i]; find position r
        // for job i and check the feasibility of inserting it.
        r := k;
        while ((d[j[r]] > d[i]) and (d[j[r]] ≠ r)) do r := r - 1;
        if ((d[j[r]] ≤ d[i]) and (d[i] > r)) then
        {
            // insert i into j[] at position r + 1
            for q := k to (r + 1) step -1 do j[q + 1] := j[q];
            j[r + 1] := i;
            k := k + 1;
        }
    }
    return k;
}
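The same greedy idea can be written in Python. The sketch below uses a slot-based variant (place each job in the latest free time slot not after its deadline) rather than the insertion bookkeeping of JS above; the function name and index conventions are assumptions made only for this illustration. Running it on the example instance reproduces the optimal value 127.

def job_sequencing(profits, deadlines):
    # Greedy job sequencing with deadlines (unit-time jobs).
    # Returns the selected jobs (1-based indices) and the total profit.
    n = len(profits)
    # Consider jobs in non-increasing order of profit.
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)
    max_deadline = max(deadlines)
    slot = [None] * (max_deadline + 1)     # slot[t] = job scheduled in time slot t
    for i in order:
        # Try the latest free slot not after job i's deadline.
        t = min(deadlines[i], max_deadline)
        while t >= 1 and slot[t] is not None:
            t -= 1
        if t >= 1:
            slot[t] = i
    chosen = [i + 1 for i in slot[1:] if i is not None]
    value = sum(profits[i - 1] for i in chosen)
    return chosen, value

print(job_sequencing([100, 10, 15, 27], [2, 1, 2, 1]))
# -> ([4, 1], 127): job 4 in slot 1, job 1 in slot 2, total profit 127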
SINGLE SOURCE SHORTEST PATHS

In the previously studied graphs, the edge labels are called costs, but here we think
of them as lengths. In a labeled graph, the length of a path is defined to be the sum of the
lengths of its edges.

In the single source, all destinations, shortest path problem, we must find a shortest
path from a given source vertex to each of the vertices (called destinations) in the
graph to which there is a path.

Dijkstra's algorithm is similar to Prim's algorithm for finding minimal spanning trees.
Dijkstra's algorithm takes a labeled graph and a pair of vertices P and Q, and finds the
shortest path between them (or one of the shortest paths, if there is more than one). The
principle of optimality is the basis for Dijkstra's algorithm. Dijkstra's algorithm does
not work for graphs with negative edge weights.
The figure lists the shortest paths from vertex 1 for a five-vertex weighted digraph.

[Figure: the weighted digraph ("Graph") and its shortest paths from vertex 1 ("Shortest Paths").]
Algorithm:
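A minimal Python sketch of Dijkstra's algorithm follows; the adjacency-dictionary representation, the function name, and the pred array (which plays the role of Next in the worked example below) are choices made only for this illustration.

import math

def dijkstra(graph, source):
    # graph: dict mapping each vertex to a dict {neighbour: edge length}.
    # Returns (dist, pred): shortest distances from source and predecessors.
    dist = {v: math.inf for v in graph}
    pred = {v: None for v in graph}
    final = {v: False for v in graph}      # plays the role of Status in the example
    dist[source] = 0
    for _ in range(len(graph)):
        # Pick the non-final vertex with the smallest tentative distance.
        u = min((v for v in graph if not final[v]), key=lambda v: dist[v])
        final[u] = True
        # Relax every edge leaving u.
        for v, length in graph[u].items():
            if not final[v] and dist[u] + length < dist[v]:
                dist[v] = dist[u] + length
                pred[v] = u
    return dist, pred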
Running time: O(n²) for a graph with n vertices, since each of the n iterations scans all vertices to find the minimum.
Example 1:
Find the shortest paths from vertex A to every other vertex in the following graph:
[Figure: weighted graph on vertices A, B, C, D, E, F, G; edge lengths as used in the steps below.]
For each vertex v (with v0 = A taken as the source vertex):

Status[v] will be either '0', meaning that the shortest path from v to v0 has
definitely been found, or '1', meaning that it hasn't.

Dist[v] will be a number, representing the length of the shortest path from v to v0
found so far.

Next[v] will be the first vertex on the way to v0 along the shortest path found so far
from v to v0.
Step 1:

Vertex   A   B   C   D   E   F   G
Status   0   1   1   1   1   1   1
Dist.    0   3   6   ∞   ∞   ∞   ∞
Next     *   A   A   A   A   A   A
Step 2:

Vertex   A   B   C   D   E   F   G
Status   0   0   1   1   1   1   1
Dist.    0   3   5   7   ∞   ∞   ∞
Next     *   A   B   B   A   A   A
Step 3:

Vertex   A   B   C   D   E   F   G
Status   0   0   0   1   1   1   1
Dist.    0   3   5   6   9   7   ∞
Next     *   A   B   C   C   C   A
Step 4:

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   1   1   1
Dist.    0   3   5   6   8   7   10
Next     *   A   B   C   D   C   D
Step 5:

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   1   0   1
Dist.    0   3   5   6   8   7   8
Next     *   A   B   C   D   C   F
Step 6:

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   0   0   1
Dist.    0   3   5   6   8   7   8
Next     *   A   B   C   D   C   F
Step 7:

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   0   0   0
Dist.    0   3   5   6   8   7   8
Next     *   A   B   C   D   C   F
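As a cross-check, the dijkstra sketch given under "Algorithm:" above can be run on this graph. The edge lengths below are read off the figure and the relaxation steps (any edge that does not influence the shortest paths may be missing from this reconstruction); the output matches the final Dist and Next rows.

# Edge lengths reconstructed from the figure and the step tables above
# (treated as an undirected graph).
edges = [("A", "B", 3), ("A", "C", 6), ("B", "C", 2), ("B", "D", 4),
         ("C", "D", 1), ("C", "E", 4), ("C", "F", 2), ("D", "E", 2),
         ("D", "G", 4), ("F", "G", 1)]

graph = {v: {} for v in "ABCDEFG"}
for u, v, length in edges:
    graph[u][v] = length
    graph[v][u] = length

dist, pred = dijkstra(graph, "A")   # dijkstra as sketched above
print(dist)   # {'A': 0, 'B': 3, 'C': 5, 'D': 6, 'E': 8, 'F': 7, 'G': 8}
print(pred)   # {'A': None, 'B': 'A', 'C': 'B', 'D': 'C', 'E': 'D', 'F': 'C', 'G': 'F'}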