Lecture 2: History

ENG 507

Mathematical Optimization
(and Computational Intelligence)
Lecture Notes

A Brief History
of optimization
or of mathematical programming

Dr. Raed I. Bourisli


Mechanical Engineering Dept.
Kuwait University
History
• An optimization problem consists of maximizing or minimizing a real
function by systematically choosing input values from within an allowed
set (domain) and computing the value of the function.
• More generally, optimization means finding the "best available" value of some objective function over a defined domain, possibly subject to a set of constraints; both the objective function and the domain can take many different forms (a generic standard form is written out below).
• Pierre de Fermat and Joseph-Louis Lagrange found calculus-based
formulae for identifying optima, while Sir Isaac Newton and Carl Friedrich
Gauss proposed iterative methods for moving towards an optimum.
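As a hedged illustration (the symbols f, gj, hk, x, and X are generic placeholders, not notation taken from these slides), the generic problem described above can be stated as:

Minimize:   f(x),  x ∈ X ⊆ R^n
Subject to: gj(x) ≤ 0,  j = 1...m
            hk(x) = 0,  k = 1...p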
History
• Much of the theory had been introduced
by Leonid Kantorovich in 1939 (Military
Engineering-Technical University, Russia).
• The first term for the field was "linear programming," due to George B. Dantzig (Stanford), who published the Simplex algorithm in 1947; the name itself is credited to the economist Tjalling C. Koopmans.
• John von Neumann developed the theory of duality
in the same year.

The term programming in this context does not refer to computer programming.
Rather, the term comes from the use of program by the United States military to refer
to proposed training and logistics schedules, which were the problems Dantzig
studied at that time.
General Order of Complexity
• Unconstrained Optimization (Least Squares problems, etc.)
• Unconstrained Optimization (derivatives can usually be calculated):
– Line search methods
– Newton methods (a minimal sketch follows this list)
– Conjugate gradient methods
– “Nonlinear” methods
• Constrained Optimization: Linear Programming (LP) problems
– The Simplex Method
– Interior-point methods
– Quadratic Programming (QP)
– Penalty and Augmented Lagrangian Methods
– Sequential Quadratic Programming
• Constrained Optimization: Nonlinear Programming, Convex Optimization
– Lagrange and Kuhn-Tucker multipliers
– Penalty functions
– Barrier functions
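As a concrete illustration of the unconstrained Newton methods listed above, here is a minimal Python sketch; the quadratic test function, its gradient, and its Hessian are assumptions chosen purely for demonstration.

import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50):
    # Minimal Newton iteration: solve H(x) p = -grad(x) and take the full step.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:       # gradient ~ 0: stationary point reached
            break
        p = np.linalg.solve(hess(x), -g)  # Newton direction
        x = x + p                         # no line search or safeguards in this sketch
    return x

# Hypothetical test function: f(x, y) = (x - 1)^2 + 10*(y + 2)^2, minimum at (1, -2).
grad = lambda v: np.array([2.0 * (v[0] - 1.0), 20.0 * (v[1] + 2.0)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 20.0]])
print(newton_minimize(grad, hess, x0=[5.0, 5.0]))   # prints approximately [ 1. -2.]

For a quadratic objective like this one, a single Newton step already lands on the minimizer; for general nonlinear functions a line search is added, as in the methods named in the list above.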
Optimization and Engineering Practice
• Most of the mathematical models you are familiar with have been descriptive models; they have been derived to simulate the behavior of an engineering device or system.
• One level up from purely descriptive models are the predictive models, which help you forecast the future performance and results of a system or process.
• In contrast, optimization typically deals with finding the "best result," or optimum solution, of a problem. Thus, in the context of modeling, optimization models are often termed prescriptive models, since they can be used to prescribe a course of action or the best design.

(Figure: Types of "Analytics")
Examples of optimization problems

Optimization in Engineering
• Design aircraft for minimum weight and maximum strength.
• Optimal trajectories of space vehicles.
• Design civil engineering structures for minimum cost.
• Design water-resource projects like dams to mitigate flood damage while yielding maximum
hydropower.
• Material-cutting strategy for minimum cost.
• Design pump and heat transfer equipment for maximum efficiency.
• Maximize power output of electrical networks and machinery while minimizing heat
generation.
• Shortest route of salesperson visiting various cities during one sales trip.
• Optimal planning and scheduling.
• Statistical analysis and models with minimum error.
• Optimal pipeline networks.
• Inventory control.
• Maintenance planning to minimize cost.
• Minimize waiting and idling times.
• Design waste treatment systems to meet water-quality standards at least cost.
Examples of optimization problems

Numerical Optimization Problems


• Problems that are defined over problem spaces which are subspaces of (uncountably infinite) numerical spaces, such as the real or complex vectors, X ⊆ R^n or X ⊆ C^n.
– Examples include functional optimization, engineering design optimization tasks, and classification and data mining tasks.
– They can often be solved efficiently with Evolution Strategies, Differential Evolution, Evolutionary Programming, Estimation of Distribution Algorithms, or Particle Swarm Optimization (a small Differential Evolution sketch follows).
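A minimal sketch of one such population-based solver in Python, assuming SciPy is available; the sphere objective and the bounds are illustrative choices, not taken from the slides.

import numpy as np
from scipy.optimize import differential_evolution

# Illustrative objective: the sphere function f(x) = sum(x_i^2), with its minimum at the origin.
def sphere(x):
    return float(np.sum(np.square(x)))

# Search domain X ⊆ R^3, given to the solver as per-coordinate bounds.
bounds = [(-5.0, 5.0)] * 3

result = differential_evolution(sphere, bounds, seed=0)
print(result.x, result.fun)   # expect a point near [0, 0, 0] with an objective value near 0

Here differential_evolution is SciPy's implementation of Differential Evolution; Evolution Strategies or Particle Swarm Optimization would be used analogously through other libraries.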
Examples of optimization problems

Function Root and Minimization


• Find the roots of a function g(x) by recasting the root-finding problem as a minimization:

Minimize:
f1 (x) = |0 − g(x)| = |g(x)|
or,
f2 (x) = [0 − g(x)]2 = [g(x)]2
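A minimal Python sketch of the second formulation, assuming an illustrative g(x) = x^3 − 2x − 5 (the slide's own definition of g did not carry over, so this choice is purely for demonstration):

from scipy.optimize import minimize_scalar

# Illustrative g(x): any root of g is a global minimizer of f2(x) = [g(x)]^2 with value 0.
def g(x):
    return x**3 - 2.0*x - 5.0

def f2(x):
    return g(x)**2

res = minimize_scalar(f2, bounds=(0.0, 5.0), method="bounded")   # search the interval [0, 5]
print(res.x, g(res.x))   # x ≈ 2.0946, where g(x) ≈ 0

Note that f1(x) = |g(x)| has the same minimizers but is not differentiable at the roots, which is why the squared form f2 is often preferred by gradient-based solvers.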
Examples of optimization problems

Lightest, Most Stable Beams


• The thicknesses di of the i = 1...n beams are the design variables: the goal is the most stable truss that does not exceed a maximum volume of material V (a hedged formulation sketch follows).
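One hedged way to write this as a constrained problem; the stability measure f, the beam lengths li, and the area function A(di) are assumptions for illustration, since the slides do not fix them:

Minimize:   f(d1, ..., dn)   (e.g., the maximum deflection of the truss under load)
Subject to: Σ li · A(di) ≤ V,   di > 0,   i = 1...n

where A(di) is the cross-sectional area of beam i and li is its length.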
Examples of optimization problems

Combinatorial Optimization Problems


• Problems which are defined over a finite (or countably infinite) discrete problem space X and whose candidate solutions can be expressed as structures built from elements of finite sets.
– Examples include the Traveling Salesman Problem (TSP),
Vehicle Routing Problems, graph coloring, graph
partitioning, scheduling, packing, and satisfiability
problems.
– Algorithms suitable for such problems are, amongst
others, Genetic Algorithms, Simulated Annealing, Tabu
Search, and Extremal Optimization.
Examples of optimization problems

Traveling Salesman Problem


• Find the shortest closed tour that visits each city exactly once and returns to the starting city (a brute-force sketch follows the figure).

(Figure: four cities, numbered 1 to 4, with candidate routes between them)
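A minimal brute-force sketch in Python; the city coordinates are hypothetical stand-ins for the four cities in the figure.

import itertools
import math

# Hypothetical coordinates for the four cities in the figure.
cities = {1: (0.0, 0.0), 2: (1.0, 0.0), 3: (1.0, 1.0), 4: (0.0, 2.0)}

def tour_length(order):
    # Total length of the closed tour that visits the cities in `order` and returns to the start.
    total = 0.0
    for a, b in zip(order, order[1:] + order[:1]):
        (x1, y1), (x2, y2) = cities[a], cities[b]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

# Brute force over all permutations: fine for 4 cities, hopeless for large n,
# which is why metaheuristics such as Simulated Annealing or Genetic Algorithms
# from the previous slide are used instead.
best = min((list(p) for p in itertools.permutations(sorted(cities))), key=tour_length)
print(best, tour_length(best))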
Examples of optimization problems

Job Shop Scheduling


• JSS is one of the hardest classes of scheduling problems (it is NP-complete); the goal is to distribute a set of tasks among machines in such a way that all deadlines are met.
– In a JSS, a set T of jobs ti ∈ T is given, where each job ti consists of a number of sub-tasks ti,j. A partial order is defined on the sub-tasks ti,j of each job ti, describing their precedence, i.e., constraints on the order in which they must be processed.

– A beverage factory has to process 10,000 bottles of cola (t1 . . . t10,000) and 5000
bottles of lemonade (t10,001 . . . t15,000).
– For the cola bottles, each job consists of three sub-jobs, ti,1 to ti,3 for i ∈ 1..10,000, since the bottles first need to be labeled, then filled with cola, and finally closed. It is clear that a bottle cannot be closed before it is filled, ti,2 ≺ ti,3, whereas the labeling can take place at any time.
– For the lemonade, all production steps are the same as the ones for the cola
bottles, except that step ti,2 ∀i ∈ 10,001..15,000 is replaced with filling the
bottles with lemonade.
– It is obvious that the different sub-jobs have to be performed by different
machines, mi ∈ M. Each of the machines mi can only process a single job at a
time.
– Finally, each of the jobs has a deadline until which it must be finished.
Examples of optimization problems

Job Shop Scheduling


• The optimization algorithm has to find a schedule S which defines for each
job when it has to be executed and on what machine, while adhering to
the order and deadline constraints.
• Usually, the optimization process starts with random configurations which violate some of the constraints (typically the deadlines) and reduces the violations step by step until results are found that fulfill all requirements (a minimal sketch of counting such violations follows).
• The JSS can be considered a combinatorial problem, since the optimizer creates a sequence and assignment of jobs, and the set of sensible starting times is also finite.
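A minimal sketch, in Python, of how a candidate schedule and its constraint violations could be represented; the job names, durations, and deadlines are hypothetical, and machine-conflict constraints are omitted for brevity.

import random

# Hypothetical data: two jobs, each a list of ordered sub-task durations, plus a deadline per job.
jobs = {
    "cola_1":     {"durations": [2, 3, 1], "deadline": 10},
    "lemonade_1": {"durations": [2, 4, 1], "deadline": 9},
}

def random_schedule(jobs, machines=3, horizon=10):
    # Assign every sub-task (job, k) a random machine and start time; usually infeasible at first.
    return {(j, k): (random.randrange(machines), random.randrange(horizon))
            for j, job in jobs.items() for k in range(len(job["durations"]))}

def count_violations(schedule, jobs):
    # Count precedence and deadline violations; an optimizer would drive this count to zero.
    finish = {(j, k): start + jobs[j]["durations"][k]
              for (j, k), (_, start) in schedule.items()}
    violations = 0
    for (j, k), (_, start) in schedule.items():
        if k > 0 and start < finish[(j, k - 1)]:      # sub-task k starts before k-1 has finished
            violations += 1
        if finish[(j, k)] > jobs[j]["deadline"]:      # the job's deadline is exceeded
            violations += 1
    return violations

random.seed(0)
schedule = random_schedule(jobs)
print(count_violations(schedule, jobs))   # a step-by-step optimizer would reduce this toward 0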
Problem Type Relations

“The area of Global Optimization is a living and breathing research discipline.”


Thomas Weise
