Optimization: principles and algorithms, by Michel Bierlaire
newtonLineSearch.m
%> \file
%> Algorithm 11.8: Newton's method with line search. Implementation of algorithm 11.8 of \cite Bier15-book
%>
%> @author <a href="http://people.epfl.ch/michel.bierlaire">Michel Bierlaire</a>
%> @date Sat Mar 21 12:34:40 2015
%> @ingroup Algorithms
%> @ingroup chap11

%> \note Tested with \ref run0508newton.m
%> \note Tested with \ref runRosenbrockNewtonLineSearch.m
%> \note Calls \ref modifiedCholesky
%> \note Calls \ref lineSearch

%> Applies Newton's algorithm with line search to solve \f$\min_x f(x)\f$ where \f$f:\mathbb{R}^n\to\mathbb{R}\f$
%> @param obj the name of the Octave function defining f(x) and its derivatives
%> @param x0 the starting point
%> @param eps the algorithm stops when \f$\|\nabla f(x)\| \leq \varepsilon\f$
%> @param printlevel If 1, information about the iterations is printed. If 0, the algorithm is silent. (Default: 1)
%> @param maxiter maximum number of iterations (Default: 100)
%> @return solution local minimum of the function
%> @return iteres sequence of iterates generated by the algorithm. It contains n+2 columns: columns 1:n contain the current iterate, column n+1 the value of the objective function, and column n+2 the norm of its gradient. It contains 1+maxiter rows, but only the first niter+1 rows are meaningful.
%> @return niter total number of iterations
function [solution,iteres,niter] = newtonLineSearch(obj,x0,eps,printlevel=1,maxiter=100)
  % One row per iterate: the iterate itself, f(x) and ||grad f(x)||,
  % that is, n+2 columns where n is the dimension of x0.
  iteres = zeros(1+maxiter,size(x0,1)+2) ;
  xk = x0 ;
  [f,g,H] = feval(obj,xk) ;
  iteres(1,:) = [xk' f norm(g)] ;
  k = 0 ;
  if (printlevel != 0)
    printf("f\t\t||g||\t\talpha\t\ttau\n");
    printf("%e\t%e\n",f,norm(g)) ;
  endif
  % Parameters of the line search (Wolfe conditions).
  alpha0 = 1.0 ;
  beta1 = 1.0e-4 ;
  beta2 = 0.99 ;
  lambda = 2 ;
  lsprintlevel = printlevel ;
  do
    % Modified Cholesky factorization of the Hessian: L*L' = H + tau*I,
    % with tau >= 0 chosen so that the factorization exists.
    [L,tau] = modifiedCholesky(H) ;
    % Newton direction: solve (L*L') d = -g by two triangular solves.
    z = L \ g ;
    d = - L' \ z ;
    % Step along d satisfying the Wolfe conditions.
    alpha = lineSearch(obj,xk,d,alpha0,beta1,beta2,lambda,lsprintlevel) ;
    xk = xk + alpha * d ;
    [f,g,H] = feval(obj,xk) ;
    if (printlevel != 0)
      printf("%e\t%e\t%f\t%e\n",f,norm(g),alpha,tau) ;
    endif
    k = k + 1 ;
    iteres(k+1,:) = [xk' f norm(g)] ;
  until (norm(g) <= eps || k >= maxiter)
  solution = xk ;
  niter = k ;
endfunction
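As an illustration, here is a minimal sketch of how the function might be called. The objective exQuadratic below is hypothetical, written only for this sketch (it is not part of the distribution, whose own tests are run0508newton.m and runRosenbrockNewtonLineSearch.m); it follows the [f,g,H] interface expected by feval in the code above.

% Hypothetical objective for illustration only: f(x) = 0.5*x'*Q*x.
% Save it as exQuadratic.m so that feval can call it by name.
function [f,g,H] = exQuadratic(x)
  Q = [2 1 ; 1 3] ;
  f = 0.5 * x' * Q * x ;
  g = Q * x ;
  H = Q ;
endfunction

x0 = [5 ; 5] ;
[solution, iteres, niter] = newtonLineSearch('exQuadratic', x0, 1.0e-6) ;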
function modifiedCholesky(in A)
Given a symmetric matrix A, provides a lower triangular matrix L and a real tau such that ...
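The factor returned by modifiedCholesky is used in the loop above to compute the Newton direction with two triangular solves. A minimal stand-alone sketch of that step, using the built-in chol as a stand-in for modifiedCholesky on an already positive definite Hessian (values chosen only for illustration):

H = [2 1 ; 1 3] ;   % a positive definite Hessian, so no shift tau is needed here
g = [1 ; -1] ;      % the current gradient
L = chol(H)' ;      % chol returns the upper factor R; L = R' is lower triangular with L*L' = H
z = L \ g ;         % forward substitution: L*z = g
d = - (L' \ z) ;    % backward substitution: L'*d = -z, hence (L*L')*d = -g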
function lineSearch(in obj, in x, in d, in alpha0, in beta1, in beta2, in lambda, in printlevel)
Applies the line search algorithm to find a step alpha along a direction d that verifies the Wolfe conditions...
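As a rough illustration of what the returned step satisfies, the sketch below checks the two Wolfe conditions for a given alpha. The name checkWolfe is hypothetical (not part of the distribution), and obj is assumed to follow the same [f,g,H] calling convention as above.

function ok = checkWolfe(obj, x, d, alpha, beta1, beta2)
  [f0, g0] = feval(obj, x) ;              % value and gradient at the current point
  [fa, ga] = feval(obj, x + alpha * d) ;  % value and gradient at the trial point
  sufficientDecrease = (fa <= f0 + alpha * beta1 * g0' * d) ;  % first Wolfe condition
  curvature = (ga' * d >= beta2 * g0' * d) ;                   % second Wolfe condition
  ok = sufficientDecrease && curvature ;
endfunction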
Copyright 2015-2018 Michel Bierlaire