Optimization: principles and algorithms, by Michel Bierlaire
newtonTrustRegion.m
%> \file
%> Algorithm 12.4: Newton's method with trust region. Implementation of algorithm 12.4 of \cite Bier15-book
%>
%> @author Michel Bierlaire
%> @date Sat Mar 21 16:04:27 2015
%> @ingroup Algorithms
%> @ingroup chap12

%> Applies Newton's algorithm with trust region to solve \f$\min_x f(x)\f$ where \f$f:\mathbb{R}^n\to\mathbb{R}\f$. The parameters of the method are taken from \cite ConnGoulToin00 (p. 117).

%> @note Tested with \ref run0508tr.m
%> @note Tested with \ref runRosenbrockTrustRegion.m
%> @note Calls \ref dogleg
%> @note Calls \ref truncatedConjugateGradient

%> @param obj the name of the Octave function defining f(x) and its derivatives
%> @param x0 the starting point
%> @param delta0 radius of the initial trust region
%> @param eps the algorithm stops when \f$\|\nabla f(x)\| \leq \varepsilon\f$.
%> @param tr method used to solve the trust region subproblem: if 0, the dogleg method is used; otherwise, the truncated conjugate gradient (default: 0).
%> @param maxiter maximum number of iterations (default: 100)
%> @return [solution,iteres,niter]
%> @return solution: local minimum of the function
%> @return iteres: sequence of iterates generated by the algorithm, with n+2 columns. Columns 1:n contain the current iterate, column n+1 the value of the objective function, and column n+2 the norm of its gradient. The matrix has maxiter+1 rows, but only the first niter+1 rows (including the starting point) are meaningful.
%> @return niter: total number of iterations
function [solution,iteres,niter] = newtonTrustRegion(obj,x0,delta0,eps,tr=0,maxiter=100)

  n = size(x0,1) ;
  iteres = zeros(1+maxiter,n+2) ;

  % Parameters of the trust region update, from \cite ConnGoulToin00 (p. 117)
  eta1 = 0.01 ;
  eta2 = 0.9 ;
  k = 0 ;
  xk = x0 ;
  [f,g,H] = feval(obj,xk) ;
  iteres(1,:) = [xk' f norm(g)] ;
  delta = delta0 ;
  do
    k = k + 1 ;
    % Solve (approximately) the trust region subproblem
    if (tr == 0)
      [step,type] = dogleg(g,H,delta) ;
    else
      [step,type] = truncatedConjugateGradient(g,H,delta) ;
    endif
    [fc,gc,Hc] = feval(obj,xk+step) ;
    % Ratio between the actual reduction and the reduction
    % predicted by the quadratic model
    num = f - fc ;
    denom = -step'*g - 0.5 * step' * H * step ;
    rho = num / denom ;
    if (rho < eta1)
      % Failure: reject the step and shrink the trust region
      delta = norm(step) / 2.0 ;
      status = "- " ;
    else
      % Accept the step
      xk = xk + step ;
      f = fc ;
      g = gc ;
      H = Hc ;
      if (rho >= eta2)
        % Very successful iteration: enlarge the trust region
        delta = 2 * delta ;
        status = "++" ;
      else
        status = "+ " ;
      endif
    endif
    iteres(k+1,:) = [xk' f norm(g)] ;
  until (norm(g) <= eps || k >= maxiter)
  solution = xk ;
  niter = k ;
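
As a usage sketch, the objective passed as `obj` must return the value, the gradient, and the Hessian at a point, in the interface expected by the `feval` calls above. The function name `exRosenbrock` and the starting point below are illustrative (the test script \ref runRosenbrockTrustRegion.m referenced above presumably does something similar), not taken from the book:

```matlab
% Hypothetical objective: the Rosenbrock function, with its
% gradient and Hessian, in the [f,g,H] interface used by feval.
function [f,g,H] = exRosenbrock(x)
  f = 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2 ;
  g = [ -400 * x(1) * (x(2) - x(1)^2) - 2 * (1 - x(1)) ;
         200 * (x(2) - x(1)^2) ] ;
  H = [ -400 * (x(2) - 3 * x(1)^2) + 2 , -400 * x(1) ;
        -400 * x(1)                    ,  200 ] ;
endfunction

% Illustrative call: dogleg subproblem solver (tr = 0 by default)
x0 = [-1.5 ; 1.5] ;
[sol, iteres, niter] = newtonTrustRegion(@exRosenbrock, x0, 10, 1.0e-6) ;
```

Passing `tr` different from 0 switches the subproblem solver to the truncated conjugate gradient without changing anything else in the call.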
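
In the notation of the listing (step \f$d_k\f$, gradient \f$g_k\f$, Hessian \f$H_k\f$, radius \f$\Delta_k\f$), the acceptance test and radius update implemented in the loop can be written as:

```latex
\rho_k = \frac{f(x_k) - f(x_k + d_k)}{-g_k^\top d_k - \tfrac{1}{2} d_k^\top H_k d_k},
\qquad
\Delta_{k+1} =
\begin{cases}
\|d_k\| / 2 & \text{if } \rho_k < \eta_1 \text{ (step rejected)},\\
\Delta_k & \text{if } \eta_1 \le \rho_k < \eta_2,\\
2\,\Delta_k & \text{if } \rho_k \ge \eta_2,
\end{cases}
```

with \f$\eta_1 = 0.01\f$ and \f$\eta_2 = 0.9\f$ as set in the code.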
Copyright 2015-2018 Michel Bierlaire