
Page 1: Application of Quasi-Newton Algorithms in Optimal Design

Sebastian Ueckert, Joakim Nyberg, Andrew C. Hooker

Application of Quasi-Newton Algorithms in Optimal Design

Pharmacometrics Research Group, Department of Pharmaceutical Biosciences, Uppsala University, Sweden

Page 2: Application of Quasi-Newton Algorithms in Optimal Design

Outline
1. Optimizing Designs
2. Introduction: Quasi-Newton Methods (QNMs)
3. Performance of QNMs
4. Advantages of QNMs
5. Laplace Approximation for Global Optimal Design
6. Using QNMs in the Laplace Approximation

Page 3: Application of Quasi-Newton Algorithms in Optimal Design

Optimizing a Design

$j(x) = f(F(x, \alpha))$

$x^* = \arg\max_{x \in X} j(x)$

e.g. D-optimal design: $j(x) = |F(x, \alpha)|$

Parameters $\alpha$, design variables $x$

[Diagram: Model, Data, Estimation]
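A minimal Python sketch of such a criterion, assuming a hypothetical user-supplied fim(x, alpha) function that returns the Fisher information matrix for design x and parameters alpha:

```python
import numpy as np

def d_objective(x, alpha, fim):
    """D-optimality criterion: log-determinant of the Fisher information matrix.

    x     -- vector of design variables (e.g. sampling times)
    alpha -- vector of model parameters
    fim   -- user-supplied function returning F(x, alpha) as a square ndarray
    """
    F = fim(x, alpha)
    # log|F| computed via slogdet for numerical stability
    sign, logdet = np.linalg.slogdet(F)
    return logdet if sign > 0 else -np.inf
```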

Page 4: Application of Quasi-Newton Algorithms in Optimal Design

Optimization
• Interval methods
  - True global optimizers
  - Hard to implement
  - Still under development
• Stochastic methods – Simulated Annealing (SA), Ant Colony Optimization, Genetic Algorithm (GA)
  - Easy to implement (SA)
  - Marketing effective (GA)
  - Slow
  - No information about the solution
  - Heuristic

Page 5: Application of Quasi-Newton Algorithms in Optimal Design

Optimization
• Derivative-free methods – Downhill Simplex Method
  - No derivatives necessary
  - Robust
  - Slow
  - Local
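A minimal sketch of how the downhill simplex method could be applied to a design criterion via SciPy's Nelder-Mead implementation; the objective below is a stand-in placeholder (not the PKPD model used later in the slides), and since SciPy minimizes, $-j(x)$ is passed:

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in criterion with a known optimum at x = (1, 4); a real application
# would evaluate -log|FIM(x, alpha)| here instead.
def neg_criterion(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 4.0) ** 2

x0 = np.array([0.5, 2.0])                     # initial sampling times (illustrative)
res = minimize(neg_criterion, x0, method="Nelder-Mead")
print(res.x)                                  # ~ [1.0, 4.0]
```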

Page 6: Application of Quasi-Newton Algorithms in Optimal Design

Gradient Based Methods


Page 7: Application of Quasi-Newton Algorithms in Optimal Design

Gradient Based Methods


Page 8: Application of Quasi-Newton Algorithms in Optimal Design

Gradient Based Methods


Page 9: Application of Quasi-Newton Algorithms in Optimal Design

Gradient Based Methods


Page 10: Application of Quasi-Newton Algorithms in Optimal Design

Gradient Based Methods


Page 11: Application of Quasi-Newton Algorithms in Optimal Design

Gradient Based Methods


Page 12: Application of Quasi-Newton Algorithms in Optimal Design

Gradient Based Methods
  - Mathematically well understood
  - Fast (if the OFV calculation is not too expensive)
  - Only local
  - Complicated to implement

• Steepest Descent
• Conjugate Gradient

Page 13: Application of Quasi-Newton Algorithms in Optimal Design

Newton Method

Goal: find $x^* = \arg\min_{x \in \mathbb{R}^n} f(x)$

Local quadratic model around $x_k$:
$f(x_k + p) \approx f_k + \nabla f_k^T p + \tfrac{1}{2} p^T \nabla^2 f_k\, p =: m_k(p)$
$\nabla m_k(p) = \nabla f_k + \nabla^2 f_k\, p = 0 \;\Rightarrow\; p_k^* = -(\nabla^2 f_k)^{-1} \nabla f_k$

Algorithm:
1. Set $x_k = x_0$
2. Determine the search direction $p_k^* = -(\nabla^2 f_k)^{-1} \nabla f_k$
3. Do a line search along $p^*$ to find the minimizing $x_{k+1}$
4. Set $x_k = x_{k+1}$ and go to 2

Page 14: Application of Quasi-Newton Algorithms in Optimal Design

Newton Method

Goal: find $x^* = \arg\min_{x \in \mathbb{R}^n} f(x)$

$f(x_k + p) \approx f_k + \nabla f_k^T p + \tfrac{1}{2} p^T \nabla^2 f_k\, p =: m_k(p)$
$\nabla m_k(p) = \nabla f_k + \nabla^2 f_k\, p = 0 \;\Rightarrow\; p_k^* = -(\nabla^2 f_k)^{-1} \nabla f_k$

Algorithm:
1. Set $x_k = x_0$
2. Determine the search direction $p_k^* = -(\nabla^2 f_k)^{-1} \nabla f_k$  ← Calculate Hessian
3. Do a line search along $p^*$ to find the minimizing $x_{k+1}$
4. Set $x_k = x_{k+1}$ and go to 2
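A minimal Python sketch of the Newton iteration outlined above, assuming callables for the objective, gradient and Hessian and a positive definite Hessian near the solution; a simple backtracking rule stands in for the line search:

```python
import numpy as np

def newton_minimize(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Newton's method: p_k = -(hess f_k)^-1 grad f_k, with backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(hess(x), g)                  # search direction (step 2)
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):   # crude backtracking (step 3)
            t *= 0.5
        x = x + t * p                                     # step 4
    return x
```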

Page 15: Application of Quasi-Newton Algorithms in Optimal Design

Quasi-Newton Methods

Problem: Calculation of the Hessian is computationally expensive
Approach: Use an approximate Hessian $B_k$ and build it up during the search

$f(x_k + p) \approx f_k + \nabla f_k^T p + \tfrac{1}{2} p^T B_k\, p =: m_k(p)$
$\nabla m_k(p) = \nabla f_k + B_k\, p = 0 \;\Rightarrow\; p_k^* = -B_k^{-1} \nabla f_k$

Algorithm:
1. Set $x_k = x_0$, $B_k = I$
2. Determine the search direction $p_k^* = -B_k^{-1} \nabla f_k$
3. Do a line search along $p^*$ to find the minimizing $x_{k+1}$
4. Set $x_k = x_{k+1}$, $B_k = B_k + U_k$ and go to 2

Page 16: Application of Quasi-Newton Algorithms in Optimal Design

Quasi-Newton Methods
• Different methods for different updating formulas:

  - Davidon–Fletcher–Powell (DFP)
    $B_{k+1} = \left(I - \frac{y_k s_k^T}{y_k^T s_k}\right) B_k \left(I - \frac{s_k y_k^T}{y_k^T s_k}\right) + \frac{y_k y_k^T}{y_k^T s_k}$

  - Broyden–Fletcher–Goldfarb–Shanno (BFGS)
    $B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k}$

  with $y_k = \nabla f_{k+1} - \nabla f_k$ and $s_k = x_{k+1} - x_k$
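A minimal sketch of the BFGS update of the Hessian approximation $B_k$ as written above; the curvature check $y_k^T s_k > 0$ is a standard safeguard and an addition of this sketch, not part of the slide:

```python
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of the Hessian approximation B, given
    s = x_{k+1} - x_k and y = grad f_{k+1} - grad f_k."""
    if y @ s <= 1e-12:          # curvature condition; skip the update if violated
        return B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```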

Page 17: Application of Quasi-Newton Algorithms in Optimal Design

Constraints
• Experiments usually come with practical constraints, e.g.:
  - The administered dose has to be smaller than X mg
  - Sampling times can only be taken until 8 h after dosing

Box constraints of the form $u_i \le x_i \le b_i$ are handled by BFGS-B

Page 18: Application of Quasi-Newton Algorithms in Optimal Design

BFGS-B

Algorithm:
1. Set $x_k = x_0$, $B_k = I$
2. Determine the search direction $p_k^* = -B_k^{-1} \nabla f_k$
3. Project the search direction vector onto the feasible region
4. Do a line search along $p^*$ to find the minimizing $x_{k+1}$, respecting the bounds
5. Set $x_k = x_{k+1}$, $B_k = B_k + U_k$ and go to 2

$f(x_k + p) \approx f_k + \nabla f_k^T p + \tfrac{1}{2} p^T B_k\, p =: m_k(p)$
$\nabla m_k(p) = \nabla f_k + B_k\, p = 0 \;\Rightarrow\; p_k^* = -B_k^{-1} \nabla f_k$
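Box constraints of this kind map directly onto bound-constrained quasi-Newton implementations such as SciPy's L-BFGS-B (reference 5). A minimal sketch with a placeholder objective and an illustrative 0-8 h sampling window; the bounds, starting values and objective are assumptions, not the presentation's actual design problem:

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in for -j(x); the unconstrained optimum of the last coordinate (10 h)
# lies outside the allowed sampling window, so that bound becomes active.
def neg_criterion(x):
    return (x[0] - 0.5) ** 2 + (x[1] - 3.0) ** 2 + (x[2] - 10.0) ** 2

x0 = np.array([1.0, 2.0, 6.0])               # initial sampling times [h] (illustrative)
bounds = [(0.0, 8.0)] * 3                    # u_i <= x_i <= b_i: samples within 0-8 h
res = minimize(neg_criterion, x0, method="L-BFGS-B", bounds=bounds)
print(res.x)                                 # ~ [0.5, 3.0, 8.0]
```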

Page 19: Application of Quasi-Newton Algorithms in Optimal Design

Comparison
• Test Scenario
  - Model:
    • PKPD (1-compartment oral absorption; IMAX drug effect)
    • All parameters (ka, CL, V, IC50, E0, IMAX) with log-normal IIV, 30% CV
    • PK parameters fixed
    • Combined error
  - Design:
    • 3 groups (40, 30, 30 subjects)
    • 1 PK and 1 PD sample per subject
• Approach:
  - Generate random initial values
  - Optimize with steepest descent and BFGS

Page 20: Application of Quasi-Newton Algorithms in Optimal Design

Results

[Bar chart: runtime [s] per optimization: BFGS 15.03, Steepest Descent 60.84]
[Histogram: frequency [%] of optimized OFV (×10^10) for BFGS and Steepest Descent]

Page 21: Application of Quasi-Newton Algorithms in Optimal Design

Design Sensitivity
• The approximate Hessian matrix can be used to assess the sensitivity of the design (at no additional computational cost)
  - Diagonal of the inverse of the Hessian
  - Use approximate efficiency

$Eff(x) = \frac{j(x)}{j(x^*)}$

$j(x^* + a) \approx j(x^*) + \nabla j(x^*)^T a + \tfrac{1}{2} a^T B\, a$

$Eff(x^* + a) \approx \frac{j(x^*) + \nabla j(x^*)^T a + \tfrac{1}{2} a^T B\, a}{j(x^*)}$
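A minimal sketch of the approximate efficiency above, computed from the criterion value, gradient and (approximate) Hessian at the optimized design; all argument names are illustrative:

```python
import numpy as np

def approx_efficiency(a, j_star, grad_star, B):
    """Second-order approximation of Eff(x* + a) = j(x* + a) / j(x*).

    a         -- perturbation of the design vector
    j_star    -- criterion value at the optimum, j(x*)
    grad_star -- gradient of j at x* (close to zero at an optimum)
    B         -- (approximate) Hessian of j at x*, e.g. from BFGS
    """
    return (j_star + grad_star @ a + 0.5 * a @ B @ a) / j_star
```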

Page 22: Application of Quasi-Newton Algorithms in Optimal Design

Design Sensitivity - Visual

[Figure: three panels (Group 2 PD, Group 1 PK, Group 1 PD) showing approximate efficiency (0.90-1.02) versus the sampling time of the respective design variable]

Page 23: Application of Quasi-Newton Algorithms in Optimal Design

Design Sensitivity - Numerical

          PK Sample               PD Sample
Group 1   7.12 [0.35; 13.9]       8.38 [5.28; 11.38]
Group 2   1.26 [0; 3.74]          1.79 [1.03; 2.55]
Group 3   9.22 [-1.31E; +1.31E]   0 [0; 0.0025]

[Figure: three panels (Group 2 PD, Group 1 PK, Group 3 PK) showing efficiency in the range 80-100%]

Page 24: Application of Quasi-Newton Algorithms in Optimal Design

LAPLACE APPROXIMATION


Page 25: Application of Quasi-Newton Algorithms in Optimal Design

Global Optimal Design
• An integral has to be evaluated
• The FIM occurs in the integrand
• For example, ED-optimal design:

$j_{ED}(x) = \int p(\alpha)\,|F(x, \alpha)|\; d\alpha$

• Usually evaluated with Monte Carlo integration
  - Computationally intensive or imprecise
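For comparison with the Laplace approach that follows, a minimal sketch of the Monte Carlo evaluation of $j_{ED}(x)$, assuming a hypothetical fim(x, alpha) function and a log-normal parameter prior (roughly in the spirit of the test scenario later in the slides):

```python
import numpy as np

def ed_criterion_mc(x, fim, alpha_mean, alpha_cov, n_samples=500, seed=1):
    """Monte Carlo approximation of j_ED(x) = E_alpha[ |F(x, alpha)| ]
    for a log-normal parameter prior."""
    rng = np.random.default_rng(seed)
    # sample on the log scale, then transform back to the parameter scale
    log_samples = rng.multivariate_normal(np.log(alpha_mean), alpha_cov, n_samples)
    dets = [np.linalg.det(fim(x, np.exp(la))) for la in log_samples]
    return np.mean(dets)
```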

Page 26: Application of Quasi-Newton Algorithms in Optimal Design

Laplace Approximation

$j_{ED}(x) = \int p(\alpha)\,|F(x, \alpha)|\; d\alpha = \int e^{-k(\alpha, x)}\, d\alpha$

with $k(\alpha, x) := -\log\big(p(\alpha)\,|F(x, \alpha)|\big)$

$j_{ED}(x) \approx (2\pi)^{n/2}\, \big|\nabla^2_\alpha k(\alpha_m, x)\big|^{-1/2}\, e^{-k(\alpha_m, x)}$, where $\alpha_m = \arg\min_\alpha k(\alpha, x)$

Page 27: Application of Quasi-Newton Algorithms in Optimal Design

Laplace Approximation

Algorithm:
1. Minimize $k(\alpha, x) := -\log\big(p(\alpha)\,|F(x, \alpha)|\big)$ to obtain $\alpha_m$
2. Calculate the Hessian $\nabla^2_\alpha k(\alpha_m, x)$
3. Evaluate $j_{ED}(x) \approx (2\pi)^{n/2}\, \big|\nabla^2_\alpha k(\alpha_m, x)\big|^{-1/2}\, e^{-k(\alpha_m, x)}$
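A minimal numerical sketch of these three steps, assuming a callable k(alpha, x) for the negative log-integrand; SciPy's BFGS is used for step 1 and a central-difference Hessian for step 2 (the original work may compute the Hessian differently):

```python
import numpy as np
from scipy.optimize import minimize

def laplace_ed(x, k, alpha0, h=1e-4):
    """Laplace approximation of j_ED(x) = integral of exp(-k(alpha, x)) d alpha.

    k      -- callable k(alpha, x) = -log( p(alpha) * |F(x, alpha)| )
    alpha0 -- starting point for the inner minimization
    """
    # Step 1: locate the mode alpha_m of exp(-k)
    res = minimize(lambda a: k(a, x), alpha0, method="BFGS")
    am, n = res.x, len(res.x)

    # Step 2: central-difference Hessian of k at alpha_m
    H = np.zeros((n, n))
    e = np.eye(n) * h
    for i in range(n):
        for j in range(n):
            H[i, j] = (k(am + e[i] + e[j], x) - k(am + e[i] - e[j], x)
                       - k(am - e[i] + e[j], x) + k(am - e[i] - e[j], x)) / (4 * h**2)

    # Step 3: evaluate the approximation (assumes H is positive definite)
    return (2 * np.pi) ** (n / 2) * np.linalg.det(H) ** (-0.5) * np.exp(-res.fun)
```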

Page 28: Application of Quasi-Newton Algorithms in Optimal Design

Laplace-BFGS Approximation

Algorithm:
1. Minimize $k(\alpha, x) := -\log\big(p(\alpha)\,|F(x, \alpha)|\big)$ using the BFGS algorithm
2. Evaluate $j_{ED}(x) \approx (2\pi)^{n/2}\, |B|^{-1/2}\, e^{-k(\alpha_m, x)}$, where $B$ is the approximate Hessian built up by BFGS
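A minimal sketch of the LAPLACE-BFGS idea: SciPy's BFGS minimizer returns an approximation of the inverse Hessian (res.hess_inv), and since $|B|^{-1/2} = |B^{-1}|^{1/2}$, the explicit Hessian calculation can be skipped; the callable k(alpha, x) is assumed as before:

```python
import numpy as np
from scipy.optimize import minimize

def laplace_bfgs_ed(x, k, alpha0):
    """Laplace approximation reusing the BFGS Hessian approximation (no step 2)."""
    res = minimize(lambda a: k(a, x), alpha0, method="BFGS")
    n = len(res.x)
    # res.hess_inv approximates the inverse Hessian of k at the minimum,
    # so |B|^{-1/2} = det(res.hess_inv)^{1/2}
    return (2 * np.pi) ** (n / 2) * np.linalg.det(res.hess_inv) ** 0.5 * np.exp(-res.fun)
```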

Page 29: Application of Quasi-Newton Algorithms in Optimal Design

Laplace-BFGS – Random Effects

Problem:  For a variance parameter $\alpha \ge 0$, the minimization $\alpha_m = \arg\min_\alpha k(\alpha, x)$ becomes a constrained problem
Approach: Perform the optimization on the log-domain, $\alpha = g(\phi) = e^{\phi}$

Algorithm:
1. Minimize $k(\phi, x) := -\log\big(p(g(\phi))\,|F(g(\phi), x)|\big)$ using the BFGS algorithm
2. Rescale the approximate Hessian: $\tilde{B} = (\nabla g)^{-T} B\, (\nabla g)^{-1}$
3. Evaluate $j_{ED}(x) \approx (2\pi)^{n/2}\, |\tilde{B}|^{-1/2}\, e^{-k(g(\phi_m), x)}$
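A minimal sketch of the Hessian rescaling in step 2, assuming the elementwise transformation $\alpha = e^{\phi}$ so that the Jacobian $\nabla g$ is diagonal:

```python
import numpy as np

def rescale_hessian_log_domain(B_phi, phi_m):
    """Rescale a Hessian approximation from the log-domain (phi) back to the
    original domain (alpha = exp(phi)):  B_alpha = J^{-T} B_phi J^{-1},
    where J = diag(exp(phi_m)) is the Jacobian of the transformation at phi_m."""
    J_inv = np.diag(np.exp(-np.asarray(phi_m)))
    return J_inv.T @ B_phi @ J_inv
```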

Page 30: Application of Quasi-Newton Algorithms in Optimal Design

Comparison
• Comparison of 4 algorithms:
  1. Monte Carlo integration with random sampling (MC-RS)
  2. Monte Carlo integration with Latin hypercube sampling (MC-LHS)
  3. Laplace integral approximation (LAPLACE)
  4. Laplace integral approximation with BFGS Hessian (LAPLACE-BFGS)
• Testing the MC methods with 50 and 500 random samples

Page 31: Application of Quasi-Newton Algorithms in Optimal Design

Comparison
• Test Scenario
  - Model:
    • 1-compartment IV bolus
    • CL, V with log-normal IIV
    • Additive error
  - Design:
    • 20 subjects
    • 2 samples per subject
  - Parameter distribution:
    • Log-normal on all parameters (fixed-effect variance = 0.05; random-effect variance = 0.09)

Page 32: Application of Quasi-Newton Algorithms in Optimal Design

Results - OFV

Method           Mean OFV ·10^21 [95% CI]
MC-RS 100,000    3.24
MC-RS 50         3.27 [2.2-5.0]
MC-RS 500        3.33 [2.8-3.8]
MC-LHS 50        3.24 [2.2-4.6]
MC-LHS 500       3.22 [2.9-3.7]
LAPLACE          2.95
LAPLACE-BFGS     3.01

Mean OFV and non-parametric confidence intervals for the different integration methods, from 100 evaluations.

Page 33: Application of Quasi-Newton Algorithms in Optimal Design

Results - Design

[Figure: optimized designs (sampling times) obtained with MC-RS 50, MC-LHS 50, LAPLACE, LAPLACE-BFGS, MC-RS 500 and MC-LHS 500]

Page 34: Application of Quasi-Newton Algorithms in Optimal Design

Results – Runtimes

[Bar chart: runtime [s] per method: MC-LHS 50: 0.35, MC-RS 50: 0.37, LAPLACE-BFGS: 0.46, LAPLACE: 0.63, MC-LHS 500: 3.53, MC-RS 500: 3.67]

Page 35: Application of Quasi-Newton Algorithms in Optimal Design

Conclusions
• Quasi-Newton methods constitute a fast alternative for continuous design variable optimization
• Information about design sensitivity can be obtained at no additional cost
• Global Optimal Design:
  - Monte Carlo methods are easy and flexible but need a high number of samples to give stable results
  - The Laplace approximation constitutes a fast alternative for priors with a continuous probability distribution function
  - The Laplace integral approximation with BFGS Hessian gave the same sampling times with approximately 30% shorter runtimes

Page 36: Application of Quasi-Newton Algorithms in Optimal Design

THANK YOU!


Page 37: Application of Quasi-Newton Algorithms in Optimal Design

References
1) C.G. Broyden, “The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations,” IMA J Appl Math, vol. 6, Mar. 1970, pp. 76-90.
2) R. Fletcher, “A new approach to variable metric algorithms,” The Computer Journal, vol. 13, 1970, p. 317.
3) D. Goldfarb, “A family of variable-metric methods derived by variational means,” Mathematics of Computation, 1970, pp. 23-26.
4) D.F. Shanno, “Conditioning of quasi-Newton methods for function minimization,” Mathematics of Computation, 1970, pp. 647-656.
5) R.H. Byrd, P. Lu, J. Nocedal, and C. Zhu, “A limited memory algorithm for bound constrained optimization,” SIAM J. Sci. Comput., vol. 16, 1995, pp. 1190-1208.
6) M. Dodds, A. Hooker, and P. Vicini, “Robust Population Pharmacokinetic Experiment Design,” Journal of Pharmacokinetics and Pharmacodynamics, vol. 32, Feb. 2005, pp. 33-64.