
Configuring Anytime algorithms

Predicting performance of Anytime algorithms

Rina Dechter

6/3/2016

Figaro Desiderata

We'd like to finalize the API by which the optimizer communicates with solvers. Here is what we propose:

1. We call analyzeProblem(factors)

2. We call getPerformanceEstimate(memoryBound)

3. Algorithms return one or more performance estimates, given the memory bound.

A performance estimate includes:

• A coarse time estimate, given as one of six categories: Very Fast (order of less than one second), Fast (order of seconds), Medium (order of minutes), Slow (order of hours), Very Slow (order of days), or Infeasible.

• We will define reference tests that mark the dividing lines between adjacent categories.

For example, we might say that VE on problem X is the dividing line between Fast and Medium.

This will serve as a guide when choosing the category of a solver on a new problem: if you think the solver is faster than VE on X, it will be Fast or Very Fast; if you think it is slower, it will be Medium or slower.

• An accuracy estimate.

• This is chosen from a prespecified set of accuracies, e.g. Exact, 0.2% Error, 1% Error, 5% Error, or No Guarantees.

• An algorithm may return more than one time estimate with different accuracy estimates.

4. Our optimizer chooses the solver to use.

5. We call getSolution for the appropriate solver, passing in a desired accuracy.
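As a rough illustration of this protocol, here is a minimal Scala sketch. Only the method names analyzeProblem, getPerformanceEstimate, and getSolution come from the proposal above; the surrounding trait and type names are assumptions.

```scala
// Hypothetical sketch of the proposed optimizer/solver protocol. Only the
// three method names come from the proposal; everything else is assumed.

sealed trait TimeCategory
object TimeCategory {
  case object VeryFast   extends TimeCategory // order of less than one second
  case object Fast       extends TimeCategory // order of seconds
  case object Medium     extends TimeCategory // order of minutes
  case object Slow       extends TimeCategory // order of hours
  case object VerySlow   extends TimeCategory // order of days
  case object Infeasible extends TimeCategory
}

sealed trait Accuracy
object Accuracy {
  case object Exact                   extends Accuracy
  final case class Error(pct: Double) extends Accuracy // e.g. Error(0.2), Error(1), Error(5)
  case object NoGuarantees            extends Accuracy
}

// One (time, accuracy) pair; step 3 allows a solver to return several of these.
final case class PerformanceEstimate(time: TimeCategory, accuracy: Accuracy)

trait Factor   // placeholder; in Figaro this would be the library's own factor class
trait Solution // placeholder for whatever the solver returns

trait Solver {
  // Step 1: the solver inspects the factors of the problem.
  def analyzeProblem(factors: Seq[Factor]): Unit

  // Steps 2-3: given a memory bound, return one or more performance estimates.
  def getPerformanceEstimate(memoryBound: Long): Seq[PerformanceEstimate]

  // Step 5: run the chosen solver, asking for a desired accuracy.
  def getSolution(desiredAccuracy: Accuracy): Solution
}
```

Step 4, choosing among the returned estimates, stays in the optimizer itself and is not part of the solver interface.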

The problem

A = a parameterized anytime algorithm, p = a problem instance, q = quality of a solution, δ = error on solution quality.

• Given an anytime algorithm A, an instance p, and an error δ, configure A to achieve error bounded by δ with minimal running time.

Surrogates:

• Prediction: given a set of parameters for A, predict the time to get an error below δ.

• Given a set of parameters for A, can we get an error below δ in time less than t?

• Given an anytime algorithm A, an instance p, and time t, configure A so that the solution at time t has the smallest error δ.

• Prediction: given a set of parameters and time t, predict the error δ at time t.
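The two surrogate prediction problems can be written as method signatures; this is a sketch only, and none of the names below appear in the slides.

```scala
// Hypothetical signatures for the two surrogate prediction problems (assumed names).
trait AnytimePredictor[Params] {
  // Surrogate 1: given parameters for A and instance p, predict the time
  // needed to drive the error below delta.
  def timeToReachError(p: ProblemInstance, params: Params, delta: Double): Double

  // Decision form: can an error below delta be reached in time less than t?
  def reachableWithin(p: ProblemInstance, params: Params, delta: Double, t: Double): Boolean =
    timeToReachError(p, params, delta) < t

  // Surrogate 2: given parameters and a deadline t, predict the error at time t.
  def errorAtTime(p: ProblemInstance, params: Params, t: Double): Double
}

trait ProblemInstance // placeholder for the instance p
```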

Methodologies

• Algorithm configuration for SAT (Holder et al.), using supervised learning.

• Work on anytime algorithms by Zilberstein (1995)

• S. Zilberstein. Using Anytime Algorithms in Intelligent Systems. AI Magazine, 17(3):73-83, 1996.

• S. Zilberstein. Ph.D. dissertation, Computer Science Division, University of California at Berkeley, 1993.

• Prediction of number of nodes by sampling (Chen/Knuth)
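The Chen/Knuth-style estimate in the last bullet can be sketched as a random probe down the search tree. Below is a minimal version of Knuth's estimator, assuming a hypothetical `children` successor function.

```scala
import scala.util.Random

// Minimal sketch of Knuth-style tree-size estimation by random probing.
// `children` is a hypothetical successor function for the search tree.
def estimateTreeSize[Node](root: Node,
                           children: Node => Seq[Node],
                           probes: Int,
                           rng: Random = new Random()): Double = {

  // One probe: walk a uniformly random root-to-leaf path, accumulating the
  // product of branching factors; the sum of these products (plus the root)
  // is an unbiased estimate of the number of nodes in the tree.
  def oneProbe(): Double = {
    var node = root
    var branchProduct = 1.0
    var estimate = 1.0 // counts the root
    var kids = children(node)
    while (kids.nonEmpty) {
      branchProduct *= kids.size
      estimate += branchProduct
      node = kids(rng.nextInt(kids.size))
      kids = children(node)
    }
    estimate
  }

  // Average independent probes to reduce variance.
  (1 to probes).map(_ => oneProbe()).sum / probes
}
```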

Anytime algorithms

Shlomo Zilberstein, AI Magazine survey article

Performance Profiles (PP)

ID for anytime components for BRAOBB

[Diagram: the anytime components of BRAOBB as a pipeline. Given a problem, a time budget, and a memory bound, the stages run in sequence, each with its own time parameter: SLS (SLS-Time), variable ordering CVO (CVO-Time, yielding d and w*), FGLP (FGLP-Time, yielding UB_2), JGLP (JGLP-Time with J-bound, yielding UB_3), MBE-MM (MBE-MM-Time, yielding UB_4), and BRAOBB search (BRAOBB-Time, yielding LB_5). The number of search nodes is predicted by regression and by sampling (prediction time t_4, # nodes). Solution quality is measured against the optimal cost C*, e.g. LB_5 - LB_0, LB_5 - UB_4, C* - LB_5, and LB_0.]
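Configuring this pipeline amounts to deciding how much of the overall time (and memory) budget each stage receives. The sketch below records those per-stage budgets as a hypothetical configuration object; the stage names come from the diagram, the class itself does not.

```scala
// Hypothetical record of the per-stage time budgets shown in the diagram.
final case class BraobbConfig(
  slsTime: Double,    // SLS-Time: stochastic local search
  cvoTime: Double,    // CVO-Time: variable ordering, yielding d and w*
  fglpTime: Double,   // FGLP-Time: yields UB_2
  jglpTime: Double,   // JGLP-Time (with J-bound): yields UB_3
  mbeMmTime: Double,  // MBE-MM-Time: yields UB_4
  braobbTime: Double  // BRAOBB-Time: the search itself, yields LB_5
) {
  // Total time consumed by the pipeline; must fit the overall time budget.
  def totalTime: Double =
    slsTime + cvoTime + fglpTime + jglpTime + mbeMmTime + braobbTime
}
```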

Example: TSP

Conditional Performance Profiles

TSP CPPs

CPP for path planning

Input quality: from vision system

Compilation of Vision and Planning

Categories of Performance Profiles

Categories (continued)

Conditional Performance Profiles

A general simulation method can be used. It is based on gathering statistics on the performance of the algorithm on randomly generated problem instances. Ideally, the statistics are gathered for the same population of instances as will appear when the algorithm is deployed. This can be ensured by learning the profiles during actual operation.
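As a minimal sketch of this simulation method, assume a hypothetical `run(instance, t)` function that reports solution quality after running the anytime algorithm for time t on one instance:

```scala
// Minimal sketch of building a performance profile by simulation: run the
// algorithm on sampled instances, record quality at fixed time points, and
// average. `run(instance, t)` is a hypothetical quality-after-time-t function.
def performanceProfile[I](instances: Seq[I],
                          timePoints: Seq[Double],
                          run: (I, Double) => Double): Map[Double, Double] =
  timePoints.map { t =>
    val qualities = instances.map(inst => run(inst, t))
    t -> (qualities.sum / qualities.size)
  }.toMap
```

A conditional performance profile would additionally index these statistics by input quality, and a dynamic profile would keep updating them during actual operation.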

Dynamic Performance Profile
