Optimization: principles and algorithms, by Michel Bierlaire
symmetricRankOne.m
%> \file
%> Algorithm 13.2: SR1 method with trust region. Implementation of Algorithm 13.2 of \cite Bier15-book
%>
%> @author Michel Bierlaire
%> @date Sat Mar 21 16:35:22 2015
%> @ingroup Algorithms
%> @ingroup chap13

%> Applies the SR1 algorithm with trust region to solve \f$\min_x f(x)\f$ where \f$f:\mathbb{R}^n\to\mathbb{R}\f$. The parameters of the method are taken from \cite ConnGoulToin00 (p. 117).

%> @note Tested with \ref run0508sr1.m
%> @note Tested with \ref runRosenbrockSr1.m
%> @note Calls \ref dogleg
%> @note Calls \ref truncatedConjugateGradient

%> @param obj the name of the Octave function defining f(x) and its derivatives
%> @param x0 the starting point
%> @param delta0 radius of the initial trust region
%> @param eps the algorithm stops if \f$\|\nabla f(x)\| \leq \varepsilon\f$.
%> @param tr method to solve the trust region subproblem. If 0, the dogleg method is used. If different from 0, the truncated conjugate gradient method is used (default: 0).
%> @param maxiter maximum number of iterations (default: 100)
%> @return [solution,iteres,niter]
%> @return solution: local minimum of the function
%> @return iteres: sequence of iterates generated by the algorithm. It contains n+2 columns. Columns 1:n contain the current iterate. Column n+1 contains the value of the objective function. Column n+2 contains the norm of the gradient. It contains 1+maxiter rows, but only the first niter+1 rows are meaningful.
%> @return niter: total number of iterations
function [solution,iteres,niter] = symmetricRankOne(obj,x0,delta0,eps,tr=0,maxiter=100)
  addpath("../chap12") ;
  STSR1 = "" ;
  n = size(x0,1) ;
  % One row per iterate: columns 1:n hold x, column n+1 holds f(x), column n+2 holds ||g(x)||
  iteres = zeros(1+maxiter,n+2) ;

  % Trust region parameters from \cite ConnGoulToin00 (p. 117)
  eta1 = 0.01 ;
  eta2 = 0.9 ;
  k = 0 ;
  xk = x0 ;
  H = eye(n,n) ;
  [f,g] = feval(obj,xk) ;
  iteres(1,:) = [xk' f norm(g)] ;
  delta = delta0 ;

  # Store the radius of the trust region to generate figures like Fig 13.8.
  [fff,msg] = fopen("radius.dat","w") ;
  fprintf(fff,"%d %e\n",k,delta) ;
  do
    k = k + 1 ;

    % Solve the trust region subproblem for the quadratic model defined by g and H
    if (tr == 0)
      [step,type] = dogleg(g,H,delta) ;
    else
      [step,type] = truncatedConjugateGradient(g,H,delta) ;
    endif
    [fc,gc] = feval(obj,xk+step) ;
    y = gc - g ;
    % rho compares the actual reduction of f with the reduction predicted by the model
    num = f - fc ;
    denom = -step'*g - 0.5 * step' * H * step ;
    rho = num / denom ;
    fprintf(fff,"%e %e %e ",num,denom,rho) ;
    if (rho < eta1)
      % Failure: reject the step and shrink the trust region
      delta = norm(step) / 2.0 ;
      status = "- " ;
    else
      % Success: accept the step
      xk = xk + step ;
      f = fc ;
      g = gc ;
      if (rho >= eta2)
        % Very successful iteration: enlarge the trust region
        delta = 2 * delta ;
        status = "++" ;
      else
        status = "+ " ;
      endif
      % SR1 update: H + (y - H s)(y - H s)' / (s'(y - H s)),
      % skipped when the denominator is too small relative to ||s|| ||y - H s||
      term = y - H * step ;
      sr1denom = step' * term ;
      STSR1 = "" ;
      if (abs(sr1denom) >= 1.0e-8 * norm(step) * norm(term))
        STSR1 = "SR1" ;
        H = H + (term * term') / sr1denom ;
      endif
    endif

    iteres(k+1,:) = [xk' f norm(g)] ;
    fprintf(fff,"%d %e\n",k,delta) ;
  until (norm(g) <= eps || k >= maxiter)

  fclose(fff) ;
  solution = xk ;
  niter = k ;
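
A minimal usage sketch (not part of the file above), assuming a hypothetical helper rosenbrockExample.m saved next to this file and that ../chap12 provides dogleg.m and truncatedConjugateGradient.m; the starting point, trust region radius, and tolerance below are illustrative choices, not values prescribed by the book.

% rosenbrockExample.m -- hypothetical helper: value and gradient of the Rosenbrock function
function [f,g] = rosenbrockExample(x)
  f = 100 * (x(2) - x(1)^2)^2 + (1 - x(1))^2 ;
  g = [ -400 * x(1) * (x(2) - x(1)^2) - 2 * (1 - x(1)) ;
        200 * (x(2) - x(1)^2) ] ;
endfunction

% From the Octave prompt: dogleg subproblem solver (tr = 0) and at most 100 iterations (the defaults)
[solution,iteres,niter] = symmetricRankOne('rosenbrockExample',[-1.5;1.5],10,1.0e-6)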
Copyright 2015-2018 Michel Bierlaire