Optimization: principles and algorithms, by Michel Bierlaire
newtonLocal.m
%> \file
%> Algorithm 10.1: Local Newton for optimization. Implementation of algorithm 10.1 of \cite Bier15-book
%>
%> @author <a href="http://people.epfl.ch/michel.bierlaire">Michel Bierlaire</a>
%> @date Fri Mar 20 15:41:01 2015
%> @ingroup Algorithms
%> @ingroup chap10

%> @note Tested with \ref run0508.m

%> Applies the local Newton algorithm to solve \f$\nabla f(x)=0\f$, where \f$\nabla f:\mathbb{R}^n\to\mathbb{R}^n \f$ is the gradient of the objective function.
%> @param obj the name of the Octave function defining \f$\nabla f(x)\f$ and the Hessian \f$\nabla^2 f(x)\f$
%> @param x0 the starting point
%> @param eps the algorithm stops if \f$\|\nabla f(x)\| \leq \varepsilon \f$.
%> @param maxiter maximum number of iterations (Default: 100)
%> @return [solution,g]
%> @return solution: point where the gradient vanishes
%> @return g: value of the gradient \f$\nabla f\f$ at the solution
function [solution,g] = newtonLocal(obj,x0,eps,maxiter=100)
  xk = x0 ;
  n = size(x0,1) ;
  [g,H] = feval(obj,xk) ;
  k = 0 ;
  % Report the iteration count, the iterate, the gradient and its norm,
  % one coordinate per row.
  printf("%d\t%e\t%e\t%e\n",k,xk(1),g(1),norm(g))
  for i=2:n
    printf("\t%e\t%e\t\n",xk(i),g(i))
  endfor
  do
    % Newton step: xk is updated by -d, where d solves the linear system H d = g.
    xk = xk - H \ g ;
    [g,H] = feval(obj,xk) ;
    k = k + 1 ;
    printf("%d\t%e\t%e\t%e\n",k,xk(1),g(1),norm(g))
    for i=2:n
      printf("\t%e\t%e\t\n",xk(i),g(i))
    endfor
  until (norm(g) <= eps || k >= maxiter)
  solution = xk ;
endfunction
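
A minimal usage sketch, not part of the book's distribution: the objective is the Rosenbrock function f(x) = 100 (x2 - x1^2)^2 + (1 - x1)^2, and rosenbrockDeriv is a hypothetical helper, stored in its own file rosenbrockDeriv.m, that returns the gradient and Hessian in the format expected by newtonLocal. The starting point and tolerance are chosen for illustration; since the algorithm is only locally convergent, convergence from starting points far from the minimizer is not guaranteed.

% rosenbrockDeriv.m (hypothetical example, not part of the package):
% gradient and Hessian of f(x) = 100 * (x2 - x1^2)^2 + (1 - x1)^2.
function [g,H] = rosenbrockDeriv(x)
  g = [ -400 * x(1) * (x(2) - x(1)^2) - 2 * (1 - x(1)) ;
        200 * (x(2) - x(1)^2) ] ;
  H = [ 1200 * x(1)^2 - 400 * x(2) + 2 , -400 * x(1) ;
        -400 * x(1) , 200 ] ;
endfunction

% Call, e.g. from a run script, with the classical starting point (-1.2, 1):
x0 = [-1.2 ; 1] ;
[solution, g] = newtonLocal('rosenbrockDeriv', x0, 1.0e-8)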
Copyright 2015-2018 Michel Bierlaire