
ADA UNIT IV

Chapter 2 discusses the Greedy Method and Dynamic Programming, focusing on the Greedy Method's general approach to solving optimization problems through feasible and optimal solutions. It covers specific applications such as the Knapsack Problem, Job Sequencing with Deadlines, and Single Source Shortest Paths, detailing algorithms and examples for each. The chapter emphasizes the importance of selection criteria and ordering in achieving optimal solutions using greedy strategies.

CHAPTER 2: GREEDY METHOD AND DYNAMIC PROGRAMMING

2.1. Greedy Method


2.1.1. The General Method
2.1.2. Job Sequencing With Deadlines
2.1.3. Knapsack Problem
2.1.4. Minimum Cost Spanning Trees
2.1.5. Single Source Shortest Paths


GENERAL METHOD
Greedy is the most straightforward design technique. Most of these problems have n
inputs and require us to obtain a subset that satisfies some constraints. Any subset that
satisfies the constraints is called a feasible solution. We need to find a feasible solution
that either maximizes or minimizes a given objective function. A feasible solution that does this
is called an optimal solution.
The greedy method is a simple strategy of progressively building up a solution, one
element at a time, by choosing the best possible element at each stage. At each stage, a
decision is made regarding whether or not a particular input is in an optimal solution. This is
done by considering the inputs in an order determined by some selection procedure. If the
inclusion of the next input into the partially constructed optimal solution would result in an
infeasible solution, then this input is not added to the partial solution. The selection
procedure itself is based on some optimization measure. Several optimization measures may be
plausible for a given problem; most of them, however, will result in algorithms that
generate sub-optimal solutions. This version of the greedy technique is called the subset paradigm.
Problems such as the knapsack problem, job sequencing with deadlines, and minimum-cost spanning
trees are based on the subset paradigm.
For problems that make decisions by considering the inputs in some order, each
decision is made using an optimization criterion that can be computed from the decisions
already made. This version of the greedy method is the ordering paradigm. Problems such as
optimal storage on tapes, optimal merge patterns, and single-source shortest paths are based
on the ordering paradigm.

CONTROL ABSTRACTION

Algorithm Greedy(a, n)
// a[1 : n] contains the 'n' inputs
{
    solution := ∅;          // initialize the solution to be empty
    for i := 1 to n do
    {
        x := Select(a);
        if Feasible(solution, x) then
            solution := Union(solution, x);
    }
    return solution;
}
Procedure Greedy describes the essential way that a greedy-based algorithm will look,
once a particular problem is chosen and the functions Select, Feasible, and Union are properly
implemented.
The function Select selects an input from 'a', removes it, and assigns its value to 'x'.
Feasible is a Boolean-valued function that determines whether 'x' can be included in the
solution vector. The function Union combines 'x' with the solution and updates the objective
function.
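The control abstraction above can be sketched in runnable Python. Here `select_key` and `feasible` are hypothetical stand-ins for the problem-specific Select and Feasible routines, and the running-total constraint in the usage example is invented purely for illustration:

```python
def greedy(inputs, select_key, feasible):
    """Generic greedy control abstraction: consider the inputs in the
    order given by select_key, keeping each one that preserves
    feasibility of the partial solution."""
    solution = []
    # Select: consider inputs in order of the optimization measure.
    for x in sorted(inputs, key=select_key):
        if feasible(solution, x):        # can x join the partial solution?
            solution = solution + [x]    # Union(solution, x)
    return solution

# Hypothetical example: pick numbers whose running total stays <= 10,
# considering smaller numbers first.
chosen = greedy([4, 7, 2, 5, 1],
                select_key=lambda x: x,
                feasible=lambda sol, x: sum(sol) + x <= 10)
print(chosen)   # [1, 2, 4]
```

As with the pseudocode, the quality of the result depends entirely on the optimization measure encoded in `select_key`.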
KNAPSACK PROBLEM
Let us apply the greedy method to solve the knapsack problem. We are given 'n'
objects and a knapsack. Object 'i' has a weight wi and the knapsack has a capacity 'm'.
If a fraction xi, 0 ≤ xi ≤ 1, of object i is placed into the knapsack, then a profit of pi xi is
earned. The objective is to fill the knapsack so as to maximize the total profit earned.
Since the knapsack capacity is 'm', we require the total weight of all chosen objects to be at
most 'm'. The problem is stated as:

Maximize    Σ (1 ≤ i ≤ n) pi xi

subject to  Σ (1 ≤ i ≤ n) wi xi ≤ m,   0 ≤ xi ≤ 1, 1 ≤ i ≤ n.

The profits and weights are positive numbers.


Algorithm
If the objects have already been sorted into non-increasing order of p[i] / w[i], then the
algorithm given below obtains solutions corresponding to this strategy.

Algorithm GreedyKnapsack(m, n)
// p[1 : n] and w[1 : n] contain the profits and weights, respectively, of the
// n objects ordered so that p[i] / w[i] ≥ p[i + 1] / w[i + 1].
// m is the knapsack size and x[1 : n] is the solution vector.
{
    for i := 1 to n do
        x[i] := 0.0;        // initialize the solution vector
    U := m;
    for i := 1 to n do
    {
        if (w[i] > U) then break;
        x[i] := 1.0;
        U := U - w[i];
    }
    if (i ≤ n) then x[i] := U / w[i];
}

Running time:
The objects are to be sorted into non-increasing order of the pi / wi ratio. If we disregard
the time needed to initially sort the objects, the algorithm requires only O(n) time.

Example:
Consider the following instance of the knapsack problem: n = 3, m = 20, (p1, p2, p3) =
(25, 24, 15) and (w1, w2, w3) = (18, 15, 10).

1. First, we try to fill the knapsack by selecting the objects in some order:

x1     x2     x3     Σ wi xi                                  Σ pi xi
1/2    1/3    1/4    18 × 1/2 + 15 × 1/3 + 10 × 1/4 = 16.5    25 × 1/2 + 24 × 1/3 + 15 × 1/4 = 24.25

2. Select the object with the maximum profit first (p = 25). So x1 = 1 and the profit earned is
25. Now only 2 units of space are left, so select the object with the next largest profit (p = 24).
So x2 = 2/15.

x1    x2      x3    Σ wi xi                    Σ pi xi
1     2/15    0     18 × 1 + 15 × 2/15 = 20    25 × 1 + 24 × 2/15 = 28.2

3. Considering the objects in order of non-decreasing weights wi:

x1    x2     x3    Σ wi xi                   Σ pi xi
0     2/3    1     15 × 2/3 + 10 × 1 = 20    24 × 2/3 + 15 × 1 = 31

4. Considering the objects in order of non-increasing ratio pi / wi:

p1/w1 = 25/18 ≈ 1.39    p2/w2 = 24/15 = 1.6    p3/w3 = 15/10 = 1.5

Select the object with the maximum pi / wi ratio, so x2 = 1 and the profit earned is 24.
Now only 5 units of space are left, so select the object with the next largest pi / wi
ratio; then x3 = 1/2 and the profit earned is 7.5.

x1    x2    x3     Σ wi xi                   Σ pi xi
0     1     1/2    15 × 1 + 10 × 1/2 = 20    24 × 1 + 15 × 1/2 = 31.5

This solution is the optimal solution.
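As a cross-check on the example, here is a minimal Python sketch of the GreedyKnapsack strategy; unlike the pseudocode above, it sorts by profit density itself rather than assuming pre-sorted input:

```python
def greedy_knapsack(m, p, w):
    """Fractional knapsack: take objects whole in non-increasing
    p[i]/w[i] order, then a fraction of the first object that no
    longer fits."""
    x = [0.0] * len(p)
    U = m                                    # remaining capacity
    for i in sorted(range(len(p)), key=lambda i: p[i] / w[i], reverse=True):
        if w[i] > U:                         # object i does not fit whole:
            x[i] = U / w[i]                  # take the fitting fraction, stop
            break
        x[i] = 1.0
        U -= w[i]
    return x, sum(pi * xi for pi, xi in zip(p, x))

# The instance from the text: n = 3, m = 20
x, profit = greedy_knapsack(20, [25, 24, 15], [18, 15, 10])
print(x, profit)   # [0.0, 1.0, 0.5] 31.5
```

This reproduces the optimal solution x = (0, 1, 1/2) with profit 31.5 found in step 4 above.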

JOB SEQUENCING WITH DEADLINES


We are given a set of 'n' jobs. Associated with each job i is a deadline di > 0 and a profit pi > 0. For
any job 'i' the profit pi is earned iff the job is completed by its deadline. Only one machine is
available for processing jobs, and each job takes one unit of processing time. An optimal solution
is the feasible solution with maximum profit.
Consider the jobs in non-increasing order of their p-values. The array d[1 : n] stores the
deadlines. The set J of selected jobs is kept ordered by deadline, so that
d(J[1]) ≤ d(J[2]) ≤ . . . ≤ d(J[k]). To test whether J ∪ {i} is feasible, we have
only to insert i into J preserving the deadline ordering and then verify that d(J[r]) ≥ r,
1 ≤ r ≤ k + 1.

Example:
Let n = 4, (p1, p2, p3, p4) = (100, 10, 15, 27) and (d1, d2, d3, d4) = (2, 1, 2, 1). The
feasible solutions and their values are:

Sl.No    Feasible Solution    Processing Sequence    Value    Remarks
1        1, 2                 2, 1                   110
2        1, 3                 1, 3 or 3, 1           115
3        1, 4                 4, 1                   127      OPTIMAL
4        2, 3                 2, 3                   25
5        3, 4                 4, 3                   42
6        1                    1                      100
7        2                    2                      10
8        3                    3                      15
9        4                    4                      27

Algorithm:
The algorithm constructs an optimal set J of jobs that can be processed by their deadlines.
Algorithm GreedyJob(d, J, n)
// J is a set of jobs that can be completed by their deadlines.
{
    J := {1};
    for i := 2 to n do
    {
        if (all jobs in J ∪ {i} can be completed by their deadlines) then
            J := J ∪ {i};
    }
}
The greedy algorithm is used to obtain an optimal solution.
We must formulate an optimization measure to determine how the next job is chosen.

Algorithm JS(d, j, n)
// d: deadlines; j: subset of jobs; n: total number of jobs.
// d[i] ≥ 1, 1 ≤ i ≤ n, are the deadlines.
// The jobs are ordered such that p[1] ≥ p[2] ≥ . . . ≥ p[n].
// j[i] is the ith job in the optimal solution, 1 ≤ i ≤ k.
{
    d[0] := j[0] := 0;      // sentinel
    j[1] := 1;              // include job 1
    k := 1;
    for i := 2 to n do
    {
        r := k;
        while ((d[j[r]] > d[i]) and (d[j[r]] ≠ r)) do r := r - 1;
        if ((d[j[r]] ≤ d[i]) and (d[i] > r)) then
        {
            // insert i into j[], shifting later jobs right
            for q := k to (r + 1) step -1 do j[q + 1] := j[q];
            j[r + 1] := i;
            k := k + 1;
        }
    }
    return k;
}
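The example's optimal value can also be reproduced with a short Python sketch that uses the simpler insert-and-verify feasibility test (slot r must satisfy d(J[r]) ≥ r) rather than the faster shifting logic of Algorithm JS; it is an O(n²) illustration, not the algorithm above:

```python
def job_sequencing(p, d):
    """Greedy job sequencing with deadlines (unit-time jobs, one machine).
    p[i], d[i]: profit and deadline of job i (0-based indices).
    Returns the selected jobs in a deadline-feasible processing order."""
    J = []                                        # selected jobs, by deadline
    for i in sorted(range(len(p)), key=lambda k: p[k], reverse=True):
        # tentatively insert i, keeping deadlines non-decreasing
        trial = sorted(J + [i], key=lambda k: d[k])
        # feasible iff the job in slot r (1-based) has deadline >= r
        if all(d[job] >= r + 1 for r, job in enumerate(trial)):
            J = trial
    return J

# The instance from the text: p = (100, 10, 15, 27), d = (2, 1, 2, 1)
J = job_sequencing([100, 10, 15, 27], [2, 1, 2, 1])
print([j + 1 for j in J], sum([100, 10, 15, 27][j] for j in J))   # [4, 1] 127
```

This selects jobs {1, 4} in processing sequence 4, 1 with value 127, matching the optimal row of the table above.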

The Single-Source Shortest-Path Problem: DIJKSTRA'S ALGORITHM

In the previously studied graphs the edge labels were called costs, but here we think
of them as lengths. In a labeled graph, the length of a path is defined to be the sum of the
lengths of its edges.

In the single-source, all-destinations shortest-path problem, we must find a shortest
path from a given source vertex to each of the vertices (called destinations) in the
graph to which there is a path.

Dijkstra's algorithm is similar to Prim's algorithm for finding minimal spanning trees.
Dijkstra's algorithm takes a labeled graph and a pair of vertices P and Q, and finds the
shortest path between them (or one of the shortest paths, if there is more than one). The
principle of optimality is the basis for Dijkstra's algorithm. Dijkstra's algorithm does
not work when there are negative edges.
The figure lists the shortest paths from vertex 1 for a five-vertex weighted digraph.

[Figure: a five-vertex weighted digraph (left) and the shortest paths from vertex 1 (right).]

Algorithm:

Algorithm ShortestPaths(v, cost, dist, n)
// dist[j], 1 ≤ j ≤ n, is set to the length of the shortest path
// from vertex v to vertex j in the digraph G with n vertices.
// dist[v] is set to zero. G is represented by its
// cost adjacency matrix cost[1 : n, 1 : n].
{
    for i := 1 to n do
    {
        S[i] := false;              // Initialize S.
        dist[i] := cost[v, i];
    }
    S[v] := true; dist[v] := 0.0;   // Put v in S.
    for num := 2 to n - 1 do
    {
        // Determine n - 1 paths from v.
        Choose u from among those vertices not in S such that dist[u] is minimum;
        S[u] := true;               // Put u in S.
        for (each w adjacent to u with S[w] = false) do
            if (dist[w] > dist[u] + cost[u, w]) then    // Update distances.
                dist[w] := dist[u] + cost[u, w];
    }
}

Running time:

The running time depends on the implementation of the data structure for dist.

 Build a structure with n elements: A
 At most m = |E| times, decrease the value of an item: mB
 n times, select the smallest value: nC
 For an array: A = O(n), B = O(1), C = O(n), which gives O(n²) total.
 For a heap: A = O(n), B = O(log n), C = O(log n), which gives O((n + m) log n) total.

Example 1:

Consider the graph:

[Figure: an undirected weighted graph on vertices A–G. As recoverable from the step
tables below, its edges are A–B = 3, A–C = 6, B–C = 2, B–D = 4, C–D = 1, C–E = 4,
C–F = 2, D–E = 2, D–G = 4, E–G = 1, F–G = 1.]

The problem is solved by considering the following information:

 Status[v] will be either ‘0’, meaning that the shortest path from v to v0 has
definitely been found, or ‘1’, meaning that it hasn’t.

 Dist[v] will be a number, representing the length of the shortest path from v to v0
found so far.

 Next[v] will be the first vertex on the way to v0 along the shortest path found so far
from v to v0.

The progress of Dijkstra’s algorithm on the graph shown above is as follows:

Step 1 (vertex A selected):

Vertex   A   B   C   D   E   F   G
Status   0   1   1   1   1   1   1
Dist.    0   3   6   ∞   ∞   ∞   ∞
Next     *   A   A   A   A   A   A

Step 2 (vertex B selected):

Vertex   A   B   C   D   E   F   G
Status   0   0   1   1   1   1   1
Dist.    0   3   5   7   ∞   ∞   ∞
Next     *   A   B   B   A   A   A

Step 3 (vertex C selected):

Vertex   A   B   C   D   E   F   G
Status   0   0   0   1   1   1   1
Dist.    0   3   5   6   9   7   ∞
Next     *   A   B   C   C   C   A

Step 4 (vertex D selected):

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   1   1   1
Dist.    0   3   5   6   8   7   10
Next     *   A   B   C   D   C   D

Step 5 (vertex F selected):

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   1   0   1
Dist.    0   3   5   6   8   7   8
Next     *   A   B   C   D   C   F

Step 6 (vertex E selected):

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   0   0   1
Dist.    0   3   5   6   8   7   8
Next     *   A   B   C   D   C   F

Step 7 (vertex G selected):

Vertex   A   B   C   D   E   F   G
Status   0   0   0   0   0   0   0
Dist.    0   3   5   6   8   7   8
Next     *   A   B   C   D   C   F
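The final Dist row can be reproduced with a heap-based Python implementation (the O((n + m) log n) variant mentioned under the running time). The edge list below is reconstructed from the relaxations visible in the step tables, so treat it as an assumption about the original drawing:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths with a binary heap. graph maps each
    vertex to a list of (neighbor, edge_length) pairs."""
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                     # stale entry; u already settled
        for w, cost in graph[u]:
            if dist[w] > d + cost:       # update distances
                dist[w] = d + cost
                heapq.heappush(heap, (dist[w], w))
    return dist

# Undirected edge list reconstructed from the step tables (an assumption).
edges = [('A', 'B', 3), ('A', 'C', 6), ('B', 'C', 2), ('B', 'D', 4),
         ('C', 'D', 1), ('C', 'E', 4), ('C', 'F', 2), ('D', 'E', 2),
         ('D', 'G', 4), ('E', 'G', 1), ('F', 'G', 1)]
graph = {v: [] for v in 'ABCDEFG'}
for u, v, c in edges:
    graph[u].append((v, c))
    graph[v].append((u, c))

print(dijkstra(graph, 'A'))
# {'A': 0, 'B': 3, 'C': 5, 'D': 6, 'E': 8, 'F': 7, 'G': 8}
```

The output matches the Dist row of step 7 above.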

