
TNK104 Applied Optimization

Review Session

Lecture 11

Tatiana Polishchuk,

Associate Professor,

Linköping University, KTS

https://www.itn.liu.se/~tatpo46/


TNK104 Applied Optimization I

Topological Sort

Given a directed acyclic graph G = (V, E), visit and print the vertices in an order such that each vertex v is printed before any vertex u with (v, u) ∈ E

Observation: at least one vertex has in-degree 0 in a directed acyclic graph

A possible solution is v4, v8, v3, v2, v1, v5, v7, v6


Topological Sort (cont’d)


Time complexity?


Topological Sort (cont’d)


Time complexity?

Linear O(V+E)
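The O(V+E) bound is achieved by Kahn's algorithm, which repeatedly removes a vertex of in-degree 0 (the observation above guarantees one exists). A minimal Python sketch, with a made-up example graph rather than the one on the slides:

```python
from collections import deque

def topological_sort(graph):
    """Kahn's algorithm: repeatedly output a vertex with in-degree 0.
    graph: dict mapping each vertex to a list of its successors. O(V + E)."""
    indeg = {v: 0 for v in graph}
    for v in graph:
        for u in graph[v]:
            indeg[u] += 1
    queue = deque(v for v in graph if indeg[v] == 0)
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for u in graph[v]:
            indeg[u] -= 1
            if indeg[u] == 0:          # all predecessors printed, v may go next
                queue.append(u)
    if len(order) != len(graph):
        raise ValueError("graph has a cycle")
    return order
```

Every vertex and edge is touched a constant number of times, which is where the linear bound comes from.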


Shortest Path


  • Find a path from one vertex (aka source) to another vertex (aka destination) with minimum total weight (length, cost…)
  • Find minimum-weight paths from the source to all other vertices
  • Typically defined on directed (and connected) graphs
  • Special case: unweighted shortest path (length = number of edges)
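For the unweighted special case, breadth-first search already computes shortest paths. A minimal Python sketch (the adjacency-dict encoding is an assumption for illustration, not from the slides):

```python
from collections import deque

def bfs_shortest_path(graph, source):
    """Unweighted shortest paths: BFS distances counted in number of edges.
    graph: dict mapping each vertex to a list of neighbours (every vertex
    must appear as a key). Returns a dict of distances from source."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for u in graph[v]:
            if u not in dist:          # first visit = shortest distance
                dist[u] = dist[v] + 1
                queue.append(u)
    return dist
```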


Shortest Path Tree


Consider the problem of finding shortest paths from v5 to all other vertices for the example graph

By the way: Is the solution always a tree?

Solution:


Solution Representation of the Shortest Path Tree



Dijkstra’s Algorithm


Let’s “decode” by executing it …

Time complexity?


Dijkstra’s Algorithm


Greedy algorithm (detailed in Lecture 3)

Efficient: complexity O(V^2) with a simple array; faster with a heap-based priority queue (the exact bound depends on the priority-queue implementation)

But! Does not work with negative weights (see examples in the self-study material)

Alternative: Bellman-Ford (dynamic programming approach): robust, but less efficient, O(VE)
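As an illustration of the greedy step (a sketch, not the lecture's pseudocode): a heap-based Dijkstra in Python, which runs in O((V+E) log V) with a binary heap, versus O(V^2) for the plain array implementation:

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's algorithm with a binary-heap priority queue.
    graph: dict mapping each vertex to a list of (neighbour, weight) pairs;
    all weights must be nonnegative. Returns (dist, pred), the distance
    labels and the predecessor map encoding the shortest-path tree."""
    dist = {source: 0}
    pred = {source: None}
    heap = [(0, source)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float("inf")):
            continue                   # stale queue entry, already improved
        for u, w in graph[v]:
            nd = d + w
            if nd < dist.get(u, float("inf")):
                dist[u] = nd           # greedy label update
                pred[u] = v
                heapq.heappush(heap, (nd, u))
    return dist, pred
```

With negative weights the greedy argument breaks down, which is exactly why Bellman-Ford is the fallback.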

Output: graph with the shortest path highlighted

Code example (MATLAB):

shortest_path = digraph(arcs_to_last(:,1), arcs_to_last(:,2));
highlight(p, shortest_path);


IP (integer) and MIP (mixed-integer) Model


  • Some variables are required to be integers
  • Non-convex; much harder to tackle than LP
  • LP relaxation: relax the integrality condition → gives a bound (lower bound for minimization, upper bound for maximization)
  • Integrality gap: the difference (if any) between the IP optimum and the LP-relaxation optimum


The Convex Hull


Convex hull: the smallest convex set containing all feasible integer solutions

  • All extreme points of the convex hull are integral

Solving the ILP = solving the LP over the convex hull!

  • Sometimes the convex hull = the feasible region of the LP relaxation
  • In general, an explicit description of the facets defining the convex hull remains a dream



The Convex Hull (cont’d)


  • The structure of the convex hull can be very complex
  • However, even partial knowledge of the convex hull is useful
  • Valid inequality (aka cut)
    • Valid = does not cut off any part of the convex hull
    • Cut generation (aka separation) is often hard

(Figure: the objective direction, the LP optimum, and the new LP optimum after a cut is added.)


Example of Structured Valid Inequality: Cover Inequalities

Acknowledgement: numerical example from L. Wolsey, Integer Programming, John Wiley & Sons, 1998

Consider the set of solutions defined by


Cover Inequalities: A General Definition



Separation Problem


We are only interested in finding useful valid inequalities

Does the current fractional point (the LP optimum) violate some valid inequality?

That is, is there a valid inequality separating the point from the convex hull?
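For knapsack-type constraints this question becomes concrete. A cover C (a set of items whose sizes sum to more than the capacity) yields the valid inequality Σ_{j∈C} x_j ≤ |C| − 1, and a fractional point x* violates it exactly when Σ_{j∈C} (1 − x*_j) < 1. A Python sketch with made-up numbers (not the Wolsey example used on the slides):

```python
def cover_violated(cover, x_star):
    """The cover inequality sum_{j in C} x_j <= |C| - 1 is violated by x*
    iff sum_{j in C} (1 - x*_j) < 1."""
    return sum(1 - x_star[j] for j in cover) < 1

# Hypothetical knapsack constraint 3*x1 + 4*x2 + 5*x3 <= 8:
# {1, 2, 3} is a cover (3 + 4 + 5 > 8), giving the cut x1 + x2 + x3 <= 2.
x_star = {1: 1.0, 2: 1.0, 3: 0.2}   # fractional LP point with x1+x2+x3 = 2.2 > 2
```

Finding the most violated cover for a general point is itself an optimization problem, which is why separation is often hard.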


Separation Problem (cont’d)



Inexact Algorithms: A Classification


  • Approximation algorithms
    • Not exact, but the solution has a performance guarantee, e.g., at most 50% worse than optimum
    • Problem-specific
  • Heuristics (Greek: heurisko = “I find”): Designed to give an ad hoc (but hopefully good) solution; no performance guarantee
    • Construction/greedy heuristics
    • Local search
    • Metaheuristics (simulated annealing, tabu search, genetic algorithms, GRASP, memetic algorithms, ant colony optimization …)
    • Integer-programming-based heuristics


RE: Greedy Algorithm – A General Definition


  • A greedy algorithm is an algorithm that follows the problem-solving heuristic of making the locally optimal choice at each stage [1] with the hope of finding a global optimum.
  • A greedy strategy does not in general produce an optimal solution.
  • A greedy heuristic may nonetheless yield locally optimal solutions that approximate a globally optimal solution in reasonable time.

Note that the following algorithms are greedy:

    • Dijkstra’s (Shortest path)
    • Prim’s (MST)
    • Kruskal’s (MST)

Each of them builds its solution by making locally optimal choices, and for these particular problems the greedy strategy provably reaches the global optimum!

[1] Black, Paul E. (2 February 2005). "greedy algorithm". Dictionary of Algorithms and Data Structures. U.S. National Institute of Standards and Technology (NIST). Retrieved 17 August 2012.


Greedy Algorithm for Binary Knapsack


  • Consider the binary knapsack problem
  • How to apply the greedy-construction principle?

Example 1: Knapsack capacity W = 25

Item    A    B    C    D
Price   12   9    9    5
Size    24   10   10   7

  • Optimal: B and C. Size=10+10=20. Price=9+9=18
  • Possible greedy approach: take the largest-price item first → A alone fills the knapsack (Price=12), not optimal
  • Possible greedy approach: take the smallest-size item first → D plus one of B/C (Price=5+9=14), not optimal


Greedy Algorithm for Binary Knapsack (cont’d)


Does the algorithm guarantee optimality?

Example 2: Knapsack capacity W = 30

Item    A    B    C
Price   50   140  60
Size    5    20   10
Ratio   10   7    6

Possible greedy approach: take the largest ratio (price/size) first. Solution: A and B. Size=5+20=25. Price=50+140=190

Optimal: B and C. Size=20+10=30. Price=140+60=200

What is the lesson we learned?


Greedy Algorithm for Binary Knapsack (cont’d)


Does the algorithm guarantee optimality?

Example 2: Knapsack capacity W = 30

Item    A    B    C
Price   50   140  60
Size    5    20   10
Ratio   10   7    6

Possible greedy approach: take the largest ratio (price/size) first. Solution: A and B. Size=5+20=25. Price=50+140=190

Optimal: B and C. Size=20+10=30. Price=140+60=200

What is the lesson we learned?

Greedy does not guarantee optimality for the 0-1 knapsack problem!
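The lesson can be checked in a few lines of Python: the greedy ratio rule against a brute-force optimum on Example 2 (a sketch; the tuple encoding (name, price, size) is an assumption for illustration):

```python
from itertools import combinations

def greedy_by_ratio(items, capacity):
    """Take items in decreasing price/size ratio while they still fit."""
    total, room = 0, capacity
    for name, price, size in sorted(items, key=lambda t: t[1] / t[2], reverse=True):
        if size <= room:
            room -= size
            total += price
    return total

def optimal(items, capacity):
    """Brute force over all subsets; fine for tiny instances like these."""
    best = 0
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            if sum(s for _, _, s in subset) <= capacity:
                best = max(best, sum(p for _, p, _ in subset))
    return best

items = [("A", 50, 5), ("B", 140, 20), ("C", 60, 10)]   # Example 2
```

Greedy returns 190 (A and B) while the optimum is 200 (B and C), reproducing the table above.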


Metaheuristics

Nikolaos Pappas, ITN, LiU


  • Construction heuristic: very fast, returns one solution
  • Local search: needs more computation; examines alternative solutions in a local neighbourhood; returns a local optimum
  • Metaheuristics: even more computation, aiming at better solutions than a single local optimum

Examples of metaheuristics:

    • Simulated annealing
    • Tabu search
    • Genetic algorithms
    • GRASP
  • Meta = a general concept applicable to various problems


Local Optima


A local optimum can be very far from the global optimum in objective value

(Figures: an objective function in one dimension and in two dimensions, each with several local optima.)


Tabu Search


  • Invented by Glover (1986), tabu search has become a very popular metaheuristic for hard problems
  • Local search + memory
  • The algorithm “remembers” solutions previously visited
  • The memory is also known as the tabu list
  • The list is consulted in selecting the next move
  • Moves going back to earlier solutions are forbidden (= tabu status)

→ No cycles

→ Avoid moves leading back to a visited local optimum
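These ideas can be sketched as a generic skeleton (a hedged illustration, not the exact algorithm on the next slide). As in the simple variant described above, the tabu list here stores whole visited solutions:

```python
def tabu_search(initial, neighbours, cost, tabu_len=7, iters=100):
    """Generic tabu-search skeleton: always move to the best non-tabu
    neighbour, even when it is worse than the current solution, and keep
    the best solution seen overall."""
    current = best = initial
    tabu = [initial]                   # short-term memory of visited solutions
    for _ in range(iters):
        candidates = [n for n in neighbours(current) if n not in tabu]
        if not candidates:
            break                      # every move is tabu; stop
        current = min(candidates, key=cost)
        tabu.append(current)
        if len(tabu) > tabu_len:
            tabu.pop(0)                # forget the oldest entry
        if cost(current) < cost(best):
            best = current
    return best
```

Even on a toy problem such as minimizing (x-3)^2 over the integers with moves x±1, the tabu list forces the search past already-visited points instead of cycling.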


The Tabu Search Algorithm



Crucial Aspects of Tabu Search


  • What move attribute(s) should the tabu list contain?

(2-exchange for TSP: edges deleted, edges added, or both?)

  • How long should the list be (i.e., for how many iterations should a solution attribute remain tabu)?
  • Too short: the algorithm behaves like plain local search

→ may easily get stuck in a local optimum, or start cycling!

  • Too long: many potential new solutions stay banned for a long time

→ may fail to find good solutions


An Illustration of Tabu Search for k-Cardinality Tree


k-cardinality tree: Find a minimum-cost tree of cardinality k (i.e., k links).

Neighbourhood definition:

  • Delete the link of a leaf (the remaining links form a (k-1)-cardinality tree)
  • Add another link (minimum cost) to obtain a new tree of k links
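The neighbourhood above can be sketched in Python (an illustration with assumed data structures: a tree is a set of frozenset edges, and `cost` maps each edge to its weight):

```python
from itertools import chain

def leaf_edges(tree):
    """Edges of the tree incident to a degree-1 node (i.e., to a leaf)."""
    deg = {}
    for e in tree:
        for v in e:
            deg[v] = deg.get(v, 0) + 1
    return [e for e in tree if any(deg[v] == 1 for v in e)]

def best_neighbour(tree, edges, cost):
    """One move of the neighbourhood on this slide: drop a leaf edge, then
    add the cheapest edge that grows the remaining (k-1)-edge tree back to
    a different tree with k edges. Edges are frozensets of two nodes."""
    best, best_cost = None, float("inf")
    for leaf_edge in leaf_edges(tree):
        rest = tree - {leaf_edge}
        nodes = set(chain.from_iterable(rest))
        for e in edges:
            if e in tree:
                continue               # never re-add the edge just removed
            a, b = tuple(e)
            if (a in nodes) != (b in nodes):   # exactly one endpoint inside
                c = sum(cost[x] for x in rest) + cost[e]
                if c < best_cost:
                    best, best_cost = rest | {e}, c
    return best
```

Requiring exactly one endpoint inside the remaining tree keeps the result connected and acyclic, so every neighbour is again a k-cardinality tree.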


An Illustration of Tabu Search for k-Cardinality Tree (cont’d)


Consider tabu search with tabu list length = 1 (simplification)


An Illustration of Tabu Search for k-Cardinality Tree (cont’d)


An Illustration of Tabu Search for k-Cardinality Tree (cont’d)


An Illustration of Tabu Search for k-Cardinality Tree (cont’d)



Lagrangian Relaxation



Lagrangian Relaxation (cont’d)


Lagrangian Relaxation for Knapsack (cont’d)



Subgradient Optimization



Subgradient Optimization (cont’d)



Questions?


Good luck!