
Iterative Deepening and IDA*

Alan Mackworth

UBC CS 322 – Search 6


January 21, 2013

Textbook § 3.7.3
Lecture Overview

•  Recap from last week

•  Iterative Deepening

Search with Costs
•  Sometimes there are costs associated with arcs.

Def.: The cost of a path is the sum of the costs of its arcs:

$\mathrm{cost}(\langle n_0, \ldots, n_k \rangle) = \sum_{i=1}^{k} \mathrm{cost}(n_{i-1}, n_i)$
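For example, with made-up arc costs cost(n0, n1) = 2 and cost(n1, n2) = 3, the path ⟨n0, n1, n2⟩ has path cost 2 + 3 = 5.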

•  In this setting we often don't just want to find any solution
   –  we usually want to find the solution that minimizes cost

Def.: A search algorithm is optimal if, when it finds a solution, it is the best one: it has the lowest path cost.

Lowest-Cost-First Search (LCFS)
•  Expands the path with the lowest cost on the frontier (see the sketch below).
•  The frontier is implemented as a priority queue ordered by path cost.
•  How does LCFS differ from Dijkstra’s shortest path algorithm?
   -  The two algorithms are very similar
   -  But Dijkstra’s algorithm
      -  computes the shortest distance from one node to all other nodes
      -  works with nodes, not with paths
      -  stores one bit per node (infeasible for infinite/very large graphs)
      -  checks for cycles

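The slides give no code, but a minimal Python sketch of LCFS may help. The names `start`, `is_goal`, and `neighbors` are assumptions, not from the slides; `neighbors(n)` is assumed to yield `(successor, arc_cost)` pairs.

```python
import heapq
from itertools import count

def lowest_cost_first_search(start, is_goal, neighbors):
    """LCFS: always expand the frontier path with the lowest path cost."""
    tie = count()  # tiebreaker so the heap never has to compare paths
    frontier = [(0, next(tie), [start])]  # priority queue ordered by path cost
    while frontier:
        cost, _, path = heapq.heappop(frontier)
        if is_goal(path[-1]):
            return path, cost
        for succ, arc_cost in neighbors(path[-1]):
            heapq.heappush(frontier, (cost + arc_cost, next(tie), path + [succ]))
    return None  # frontier exhausted: no solution
```

Note that whole paths, not nodes, live on the frontier, which is exactly the contrast with Dijkstra’s algorithm drawn above.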
Heuristic search
Def.: A search heuristic h(n) is an estimate of the cost of the optimal (cheapest) path from node n to a goal node.

[Figure: three frontier nodes n1, n2, n3, each labeled with its estimate h(n1), h(n2), h(n3).]

Best-First Search
•  Expands the path with the lowest h value on the frontier (see the sketch below).
•  The frontier is implemented as a priority queue ordered by h.
•  Greedy: expands the path that appears to lead to the goal quickest
   -  Can get trapped
   -  Can yield arbitrarily poor solutions
   -  But with a perfect heuristic, it moves straight to the goal

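For comparison, a minimal sketch of greedy best-first search, under the same assumed interface as the LCFS sketch above plus a heuristic function `h` on nodes; only the priority changes, to h alone, with the cost so far ignored.

```python
import heapq
from itertools import count

def greedy_best_first_search(start, is_goal, neighbors, h):
    """Best-first: expand the frontier path whose end node has the lowest h.
    Not complete or optimal in general, since the cost so far is ignored."""
    tie = count()
    frontier = [(h(start), next(tie), [start])]
    while frontier:
        _, _, path = heapq.heappop(frontier)
        if is_goal(path[-1]):
            return path
        for succ, _arc_cost in neighbors(path[-1]):
            heapq.heappush(frontier, (h(succ), next(tie), path + [succ]))
    return None
```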
A*
•  Expands the path with the lowest cost + h value on the frontier (see the sketch below).
•  The frontier is implemented as a priority queue ordered by f(p) = cost(p) + h(p).

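A minimal A* sketch in the same style (same assumed interface as the earlier sketches): the only change is ordering the frontier by f(p) = cost(p) + h(p).

```python
import heapq
from itertools import count

def a_star_search(start, is_goal, neighbors, h):
    """A*: expand the frontier path p with the lowest f(p) = cost(p) + h(p)."""
    tie = count()
    frontier = [(h(start), 0, next(tie), [start])]  # (f, cost so far, tie, path)
    while frontier:
        _f, cost, _, path = heapq.heappop(frontier)
        if is_goal(path[-1]):
            return path, cost
        for succ, arc_cost in neighbors(path[-1]):
            g = cost + arc_cost
            heapq.heappush(frontier, (g + h(succ), g, next(tie), path + [succ]))
    return None
```

With h ≡ 0 this reduces to LCFS, and dropping the cost term gives the greedy search above.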
Admissibility of a heuristic
Def.: Let c(n) denote the cost of the optimal path from node n to any goal node. A search heuristic h(n) is called admissible if h(n) ≤ c(n) for all nodes n, i.e. if for all nodes it is an underestimate of the cost to any goal.

•  E.g. Euclidean distance in routing networks
•  General construction of heuristics: relax the problem, i.e. ignore some constraints
   -  Can only make the problem easier
   -  Saw lots of examples on Wednesday: routing network, grid world (see the sketch below), 8-puzzle, Infinite Mario

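As a concrete instance of the relaxation idea for the grid-world example, here is a sketch of a Manhattan-distance heuristic; representing nodes as (x, y) pairs with unit arc costs is an assumption for illustration.

```python
def manhattan(node, goal):
    """Admissible for a 4-connected grid world with unit arc costs:
    ignoring obstacles can only make the problem easier, so this
    relaxed-problem cost never overestimates the true cost."""
    (x1, y1), (x2, y2) = node, goal
    return abs(x1 - x2) + abs(y1 - y2)
```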
Admissibility of A*
•  A* is complete (finds a solution, if one exists) and optimal (finds the optimal path to a goal) if:
   -  the branching factor is finite
   -  arc costs are > ε > 0
   -  h is admissible
•  This property of A* is called the admissibility of A*.

Why is A* admissible: complete
If there is a solution, A* finds it:
-  fmin := cost of the optimal solution path s (unknown but finite)
-  Lemmas for any prefix pr of s (exercise: prove at home)
   -  pr has cost f(pr) ≤ fmin (due to admissibility)
   -  There is always one such pr on the frontier (prove by induction)
-  A* only expands paths with f(p) ≤ fmin
   -  It expands paths p with minimal f(p)
   -  There is always a pr on the frontier with f(pr) ≤ fmin
   -  It terminates when expanding s
-  The number of paths p with cost f(p) ≤ fmin is finite
   -  Let cmin > 0 be the minimal cost of any arc
   -  k := fmin / cmin. All paths with length > k have cost > fmin
   -  There are only b^k paths of length k. Finite b ⇒ finitely many such paths
Why is A* admissible: optimal
New proof (by contradiction):
-  Assume the hypothesis (for contradiction): the first solution s’ that A* expands is suboptimal, i.e. cost(s’) > fmin
-  Since s’ is a goal, h(s’) = 0, so f(s’) = cost(s’) > fmin
-  A* selected s’ ⇒ all other paths p on the frontier had f(p) ≥ f(s’) > fmin
-  But we know that a prefix pr of the optimal solution path s is on the frontier, with f(pr) ≤ fmin ⇒ Contradiction!
-  QED

Summary: some prefix of the optimal solution path is expanded before a suboptimal solution would be expanded.
Learning Goals for last week

•  Select the most appropriate algorithm for specific problems
   -  Depth-First Search vs. Breadth-First Search vs. Lowest-Cost-First Search vs. Best-First Search vs. A*
•  Define/read/write/trace/debug different search algorithms
   -  With/without cost
   -  Informed/uninformed
•  Construct heuristic functions for specific search problems
•  Formally prove A* completeness and optimality
   -  Define optimal efficiency
Learning Goals for last week, continued
•  Apply basic properties of search algorithms:
   -  completeness, optimality, time and space complexity

Algorithm                       Complete                      Optimal                       Time      Space
DFS                             N (Y if finite & no cycles)   N                             O(b^m)    O(mb)
BFS                             Y                             Y                             O(b^m)    O(b^m)
LCFS (arc costs available)      Y (costs > 0)                 Y (costs ≥ 0)                 O(b^m)    O(b^m)
Best First (h available)        N                             N                             O(b^m)    O(b^m)
A* (arc costs & h available)    Y (costs > 0, h admissible)   Y (costs ≥ 0, h admissible)   O(b^m)    O(b^m)
Lecture Overview

•  Recap from last week

•  Iterative Deepening

Iterative Deepening DFS (IDS): Motivation
Want low space complexity but also completeness and optimality.
Key Idea: re-compute elements of the frontier rather than saving them.
Algorithm                       Complete                      Optimal                       Time      Space
DFS                             N (Y if finite & no cycles)   N                             O(b^m)    O(mb)
BFS                             Y                             Y                             O(b^m)    O(b^m)
LCFS (arc costs available)      Y (costs > 0)                 Y (costs ≥ 0)                 O(b^m)    O(b^m)
Best First (h available)        N                             N                             O(b^m)    O(b^m)
A* (arc costs & h available)    Y (costs > 0, h admissible)   Y (costs ≥ 0, h admissible)   O(b^m)    O(b^m)
Iterative Deepening DFS (IDS) in a Nutshell
•  Use DFS to look for solutions at depth 1, then 2, then 3, etc. (see the sketch below)
   -  For depth bound D, ignore any paths longer than D
   -  Depth-bounded depth-first search

[Figure: the same search tree explored with depth bound 1, then depth 2, then depth 3, ...]
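A minimal sketch of depth-bounded DFS and the iterative-deepening loop around it, under the same assumed `neighbors` interface as the earlier sketches (arc costs are ignored here):

```python
from itertools import count

def depth_bounded_dfs(path, is_goal, neighbors, bound):
    """Plain DFS, but ignore any path with more than `bound` arcs."""
    if is_goal(path[-1]):
        return path
    if len(path) - 1 >= bound:  # path length measured in arcs
        return None
    for succ, _arc_cost in neighbors(path[-1]):
        found = depth_bounded_dfs(path + [succ], is_goal, neighbors, bound)
        if found is not None:
            return found
    return None

def iterative_deepening_search(start, is_goal, neighbors):
    """Re-run depth-bounded DFS with bound 1, 2, 3, ..., re-computing
    frontier elements each round instead of saving them (space O(mb))."""
    for bound in count(1):
        found = depth_bounded_dfs([start], is_goal, neighbors, bound)
        if found is not None:
            return found
```

As with IDS itself, the outer loop runs forever if the graph is infinite and no solution exists.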
(Time) Complexity of IDS
•  That sounds wasteful!
•  Let’s analyze the time complexity
•  For a solution at depth m with branching factor b

Depth   # paths at that level   # times created by BFS (or DFS)   # times created by IDS   Total # paths for IDS
1       b                       1                                 m                        m·b
2       b^2                     1                                 m−1                      (m−1)·b^2
...     ...                     ...                               ...                      ...
m−1     b^(m−1)                 1                                 2                        2·b^(m−1)
m       b^m                     1                                 1                        b^m
(Time) Complexity of IDS
Solution at depth m, branching factor b
Total # of paths generated:

$b^m + 2b^{m-1} + 3b^{m-2} + \cdots + mb = b^m \left( 1\,b^0 + 2\,b^{-1} + 3\,b^{-2} + \cdots + m\,b^{1-m} \right)$

$= b^m \sum_{i=1}^{m} i\,b^{1-i} = b^m \sum_{i=1}^{m} i\,(b^{-1})^{i-1} \le b^m \sum_{i=0}^{\infty} i\,(b^{-1})^{i-1} = b^m \left( \frac{1}{1-b^{-1}} \right)^2 = b^m \left( \frac{b}{b-1} \right)^2 \in O(b^m)$

Geometric series: for $|r| < 1$, $\sum_{i=0}^{\infty} r^i = \frac{1}{1-r}$, and differentiating term by term, $\frac{d}{dr} \sum_{i=0}^{\infty} r^i = \sum_{i=0}^{\infty} i\,r^{i-1} = \frac{1}{(1-r)^2}$.
Further Analysis of Iterative Deepening DFS (IDS)
•  Space complexity: O(mb)
   -  DFS scheme: only explores one branch at a time
•  Complete? Yes
   -  Only a finite number of paths up to depth m; doesn’t explore longer paths
•  Optimal? Yes
   -  Proof by contradiction
Search methods so far
Algorithm                       Complete                      Optimal                       Time      Space
DFS                             N (Y if finite & no cycles)   N                             O(b^m)    O(mb)
BFS                             Y                             Y                             O(b^m)    O(b^m)
IDS                             Y                             Y                             O(b^m)    O(mb)
LCFS (arc costs available)      Y (costs > 0)                 Y (costs ≥ 0)                 O(b^m)    O(b^m)
Best First (h available)        N                             N                             O(b^m)    O(b^m)
A* (arc costs & h available)    Y (costs > 0, h admissible)   Y (costs ≥ 0, h admissible)   O(b^m)    O(b^m)
(Heuristic) Iterative Deepening: IDA*

•  Like Iterative Deepening DFS


–  But the depth bound is measured in terms of the f value

•  If you don’t find a solution at a given depth


–  Increase the depth bound:
to the minimum of the f-values that exceeded the previous bound

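A minimal IDA* sketch under the same assumed interface, implementing the bound-update rule just described:

```python
def ida_star(start, is_goal, neighbors, h):
    """IDA*: depth-bounded DFS where the bound is on f = cost + h.
    After a failed round, the bound becomes the smallest f value
    that exceeded the previous bound."""
    bound = h(start)
    while True:
        next_bound = float("inf")

        def dfs(path, cost):
            nonlocal next_bound
            f = cost + h(path[-1])
            if f > bound:
                next_bound = min(next_bound, f)  # candidate for the next round
                return None
            if is_goal(path[-1]):
                return path, cost
            for succ, arc_cost in neighbors(path[-1]):
                found = dfs(path + [succ], cost + arc_cost)
                if found is not None:
                    return found
            return None

        found = dfs([start], 0)
        if found is not None:
            return found
        if next_bound == float("inf"):
            return None  # no f value exceeded the bound: no solution exists
        bound = next_bound
```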
Analysis of Iterative Deepening A* (IDA*)
•  Complete and optimal? Yes, under the same conditions as A*:
   -  h is admissible
   -  all arc costs are > ε > 0
   -  finite branching factor
•  Time complexity: O(b^m)
•  Space complexity: O(mb)
   -  Same argument as for Iterative Deepening DFS
Search methods so far
Algorithm                       Complete                      Optimal                       Time      Space
DFS                             N (Y if finite & no cycles)   N                             O(b^m)    O(mb)
BFS                             Y                             Y                             O(b^m)    O(b^m)
IDS                             Y                             Y                             O(b^m)    O(mb)
LCFS (arc costs available)      Y (costs > 0)                 Y (costs ≥ 0)                 O(b^m)    O(b^m)
Best First (h available)        N                             N                             O(b^m)    O(b^m)
A* (arc costs & h available)    Y (costs > 0, h admissible)   Y (costs ≥ 0, h admissible)   O(b^m)    O(b^m)
IDA*                            Y (same conditions as A*)     Y                             O(b^m)    O(mb)
Learning Goals for today’s class
•  Define/read/write/trace/debug different search algorithms
-  New: Iterative Deepening, Iterative Deepening A*
•  Apply basic properties of search algorithms:
–  completeness, optimality, time and space complexity

Announcements:
–  Practice exercises on course home page
•  Heuristic search
•  Please use these! (Only takes 5 min. if you understood things…)
–  Assignment 1 is out: see Connect.

