
Lecture 1

An Introduction to Optimization
Classification and Case Study

An Introduction to Metaheuristics: Produced by Qiangfu Zhao (Since 2012), All rights reserved (C)


Unconstrained Optimization

In general, an optimization problem has an objective function f(x).
The problem is
min (max) f(x), over all x
This is called an unconstrained optimization problem.

Unconstrained Optimization

Usually, x is an N-dimensional real vector, and the problem domain is R^N.
In general, f(x) is an M-dimensional real vector, and the range of f(x) is R^M.
In this course, we study only the case in which M = 1. That is, we have only one objective to optimize.


Constrained Optimization

The domain can be a sub-space D of R^N. We then have a constrained optimization problem:
min (max) f(x)
s.t. x ∈ D
D in turn can be defined by some functions, e.g.
xi > 0, i = 1, 2, ...
gj(x) > 0, j = 1, 2, ...

Linear Programming

If both f(x) and the gj(x) are linear functions, we have a linear optimization problem, usually called linear programming (LP).
For LP, very efficient algorithms already exist, and meta-heuristics are not needed.


Non-linear Programming

If f(x) or any gj(x) is non-linear, we have a non-linear optimization problem, often called non-linear programming (NLP).
Many methods have been proposed to solve this class of problems. However, conventional methods usually find only locally optimal solutions. Meta-heuristic methods are useful for finding global solutions.

Local Optima and Global Optima

For a minimization problem:
A solution x* is locally optimal if f(x*) < f(x) for all x in the ε-neighborhood of x*, where ε > 0 is a real number giving the radius of the neighborhood.
A solution x* is globally optimal if f(x*) < f(x) for all x in the search space (problem domain).

To obtain globally optimal solutions efficiently for non-linear programming, meta-heuristics can be used.
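The distinction can be illustrated with a small sketch (the function, search range, and multi-start scheme below are illustrative assumptions, not from the slides): a single local search may stop at whichever local optimum is nearest its starting point, while restarting from many points improves the chance of reaching the global one.

```python
# Multi-start local search on a 1-D function with several local minima.
# The function sin(3x) + 0.1 x^2 has one global minimum near x ~ -0.51.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return np.sin(3 * x[0]) + 0.1 * x[0]**2

# Run a local search (BFGS) from a grid of starting points and keep the best.
starts = np.linspace(-5, 5, 21)
best = min((minimize(f, x0=[s]) for s in starts), key=lambda r: r.fun)
print(best.x, best.fun)  # close to the global minimum near x ~ -0.51
```

Each individual run only finds the local optimum of its own basin; the outer `min` over restarts is what recovers the global optimum, which is the role meta-heuristics play in a more principled way.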


Example 1: Linear Programming

Two materials are used for making two products. The amount of material used per unit of each product:

             Product 1   Product 2
  Material 1     7           5
  Material 2     4           8

The prices of the products are 25 and 31 (in million yen), and those of the materials are 0.5 and 0.8 (in million yen).
Suppose that we produce x1 units of Product 1 and x2 units of Product 2.
We can get 25*x1 + 31*x2 million yen by selling the products.
On the other hand, we must pay (7*x1 + 5*x2)*0.5 + (4*x1 + 8*x2)*0.8 million yen to buy the materials.

Example 1: Linear Programming

The problem can be formulated as follows:
max f(x1, x2) = 18.3x1 + 22.1x2
s.t. x1 > 0; x2 > 0;
     6.7x1 + 8.9x2 <= B
Here 18.3 = 25 - 7*0.5 - 4*0.8 and 22.1 = 31 - 5*0.5 - 8*0.8 are the net profits per unit, and 6.7 and 8.9 are the material costs per unit.
The first two constraints mean that both products should be produced to satisfy social needs; the third constraint is the budget limitation (B is the budget).
This is a typical linear programming problem, and can be solved efficiently using the well-known simplex algorithm.
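The formulation above can be sketched with SciPy's `linprog` solver; the budget B = 100 is an assumed value, since B is left symbolic on the slide.

```python
# Solving the LP above with scipy.optimize.linprog, assuming B = 100.
from scipy.optimize import linprog

B = 100.0                  # assumed budget (million yen)
c = [-18.3, -22.1]         # linprog minimizes, so negate the profit to maximize
A_ub = [[6.7, 8.9]]        # budget constraint: 6.7 x1 + 8.9 x2 <= B
b_ub = [B]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)     # optimal production plan and maximum profit
```

Note that the solver uses the weak bounds x1, x2 >= 0 rather than the strict inequalities on the slide; with these data the profit-per-budget ratio favors Product 1, so the optimum spends the whole budget on it.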


Example 2: Non-linear Programming

Given N observations (x1, p(x1)), (x2, p(x2)), ..., (xN, p(xN)) of an unknown function p(x),
find a polynomial q(x) = a0 + a1x + a2x^2 such that

min f(a0, a1, a2) = Σ_{i=1}^{N} |p(xi) − q(xi)| + λR(q)

Note that in this problem q(x) is also a function of a0, a1, and a2.
The first term is the approximation error, and the second term, λR(q), is a regularization factor that can make the solution better (e.g. smoother).

Combinatorial Optimization Problems

If f(x) or gj(x) cannot be given analytically (in closed form), we have a combinatorial problem.
For example, if x takes n discrete values (e.g. integers), and if there are N variables, the number of all possible solutions will be n^N.
It is difficult to check all possible solutions in order to find the best one(s).
In such cases, meta-heuristics can provide efficient ways of obtaining good solutions using limited resources (e.g. time and memory space).
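The n^N growth can be made concrete with a brute-force enumeration sketch (the objective function here is an illustrative toy, not from the slides):

```python
# Exhaustive search over all n**N candidate solutions.
from itertools import product

n, N = 4, 6                       # 4 values per variable, 6 variables -> 4**6 = 4096 candidates
values = range(n)

def f(x):                         # toy objective to minimize
    return sum((xi - 2)**2 for xi in x)

best = min(product(values, repeat=N), key=f)
print(best)                       # -> (2, 2, 2, 2, 2, 2)
```

Here 4096 candidates are trivial to enumerate, but the count is n**N: at n = 10, N = 20 it is already 10^20, far beyond exhaustive search, which is why meta-heuristics settle for good (not provably best) solutions.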

Example 3: Traveling Salesman Problem (TSP)

Given N users located in N different places, the problem is to find a route such that the salesman can visit every user once (and only once), starting from and returning to his own place (i.e., finding a Hamiltonian cycle).

(Figure from Wikipedia)


Example 3: Traveling Salesman Problem (TSP)

In TSP, we have a route map which can be represented by a graph. Each node is a user, and the edge between each pair of nodes has a cost (distance or time). The evaluation function to be minimized is the total cost of the route.

(Figure from Wikipedia)

For TSP, the number of all possible solutions is N!, and this is a well-known NP-hard combinatorial problem.
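For a tiny instance, the N! routes can still be checked exhaustively; the 4-city distance matrix below is an illustrative assumption:

```python
# Brute-force TSP on a 4-city instance: try every ordering of the other cities.
from itertools import permutations

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
N = len(dist)

def tour_cost(order):
    # Start at city 0, visit the others in the given order, return to city 0.
    route = (0,) + order + (0,)
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

best = min(permutations(range(1, N)), key=tour_cost)
print(best, tour_cost(best))   # -> (1, 3, 2) 18
```

With N fixed-start cities there are (N-1)! orderings to try, so this approach collapses quickly as N grows; meta-heuristics instead search only a small, promising fraction of the route space.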

NP-hard and NP-complete

An optimization problem is called NP-hard if its associated decision problem is NP-complete.
Any optimization problem can be reduced to a decision problem (a problem whose answer is only yes or no).
NP-complete problems are a class of decision problems that can be solved by a non-deterministic algorithm in polynomial time, and to which every other such problem can be reduced in polynomial time.
Problems that can be solved by a deterministic algorithm in polynomial time form the class P.
If we can find a deterministic polynomial-time algorithm for solving one of the NP-complete problems, we can solve all other NP-complete problems in polynomial time.

Example 4: The Knapsack Problem

The knapsack problem is another NP-complete problem, defined by:
There are N objects;
Each object has a weight and a value;
The knapsack has a capacity;
The user has a quota (minimum desired value).

The problem is to find a sub-set of the objects that can be put into the knapsack (without exceeding its capacity) and that maximizes the total value.

Example 4: The Knapsack Problem

This is a non-deterministic algorithm: each time we run the program, we may get a different answer, and by chance we may get the best answer.

KNAPSACK (in  OS : set of objects; QUOTA : number; CAPACITY : number;
          out S : set of objects; FOUND : boolean)
begin
  S := empty;
  total_value := 0;
  total_weight := 0;
  FOUND := false;
  pick an order L over the objects;
  loop
    choose an object O in L;
    add O to S;
    total_value := total_value + O.value;
    total_weight := total_weight + O.weight;
    if total_weight > CAPACITY then
      fail
    else if total_value >= QUOTA then
      FOUND := true;
      succeed;
    end
    delete all objects up to O from L;
  end
end
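The non-deterministic "choose" step can be simulated with random picks, which is a minimal sketch of the procedure above (the object data, quota, and capacity are illustrative assumptions):

```python
# A randomized simulation of the non-deterministic KNAPSACK procedure.
import random

def knapsack_nd(objects, quota, capacity, rng):
    """One randomized run; returns (S, found). Objects are (value, weight) pairs."""
    S, total_value, total_weight = [], 0, 0
    L = list(objects)
    rng.shuffle(L)                      # "pick an order L over the objects"
    while L:
        i = rng.randrange(len(L))       # "choose an object O in L"
        O = L[i]
        S.append(O)
        total_value += O[0]
        total_weight += O[1]
        if total_weight > capacity:
            return S, False             # fail: capacity exceeded
        if total_value >= quota:
            return S, True              # succeed: quota reached
        del L[:i + 1]                   # "delete all objects up to O from L"
    return S, False

objects = [(10, 5), (7, 3), (4, 2), (9, 6)]   # (value, weight)
rng = random.Random(0)
results = [knapsack_nd(objects, quota=15, capacity=10, rng=rng) for _ in range(5)]
print(results)
```

Repeated runs can return different subsets, and only some of them succeed; by chance, one run may hit the best answer, which is the behavior the slide describes.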

Example 5: Learning Problems

Many optimization problems related to inductive learning (learning from a given set of training data) are NP-hard or NP-complete.
Examples include:
Finding the smallest feature sub-set;
Finding the most informative training data set;
Finding the smallest decision tree;
Finding the best clusters;
Finding the best neural network;
Interpreting a learned neural network.


Homework

Find some other (at least two) examples of optimization problems on the Internet.
Tell whether each problem is NP-hard or in P.
Provide a solution (not necessarily the best one) for each of the problems.

