Optimization: principles and algorithms, by Michel Bierlaire
constrainedNewton.m
%> \file
%> Algorithm 17.2: preconditioned projected gradient, or constrained Newton. Implementation of Algorithm 17.2 of \cite Bier15-book
%>
%> @note Tested with \ref run1702constrainedNewton.m
%>
%> @author Michel Bierlaire
%> @date Sun Mar 22 16:25:39 2015
%> @ingroup Algorithms
%> @ingroup chap17

%> Applies the preconditioned projected gradient method to solve \f$\min_x f(x) \f$ subject to \f$Ax = b\f$
%> @param obj the name of the Octave function defining \f$f(x)\f$, \f$\nabla f(x)\f$ and \f$\nabla^2 f(x)\f$.
%> @param A matrix of the constraint
%> @param b right-hand side of the constraint
%> @param x0 starting point
%> @param eps algorithm stops if \f$\|d_k\| \leq \varepsilon \f$.
%> @param gamma parameter > 0 (default: 1)
%> @param maxiter maximum number of iterations (default: 100)
%> @return [solution,iteres,niter]
%> @return solution: local minimum of the function
%> @return iteres: sequence of iterates generated by the algorithm. It contains n+2 columns. Columns 1:n contain the value of the current iterate, column n+1 the value of the objective function, and column n+2 the norm of the direction \f$d_k\f$. It contains maxiter+1 rows, but only the first niter rows are meaningful.
%> @return niter: total number of iterations
function [solution, iteres, niter] = constrainedNewton(obj, A, b, x0, eps, gamma=1, maxiter=100)
  xk = x0 ;
  n = length(x0) ;
  [f, g, H] = feval(obj, xk) ;
  iteres = zeros(maxiter + 1, n + 2) ;
  k = 0 ;
  do
    [L, tau] = modifiedCholesky(H) ;
    dk = qp(xk, L * L', gamma * g, A, b - A * xk) ;
    iteres(k+1, :) = [xk' f norm(dk)] ;
    if (norm(dk) > eps)
      alpha = lineSearch(obj, xk, dk, 1, 1.0e-4, 0.99, 2, 0) ;
      xk = xk + alpha * dk ;
      [f, g, H] = feval(obj, xk) ;
      k = k + 1 ;
    endif
  until (norm(dk) <= eps || k >= maxiter)
  solution = xk ;
  niter = k + 1 ;
endfunction
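As a minimal illustration of the iteration implemented above, here is a hedged Python sketch, not the author's code: the direction subproblem is solved via the KKT system of the quadratic program instead of Octave's qp() with a modified Cholesky factor, and the full Newton step is taken with no line search. The problem instance and all names (constrained_newton, grad, hess) are hypothetical.

```python
import numpy as np

def constrained_newton(grad, hess, A, b, x0, eps=1e-8, maxiter=100):
    """Sketch of the constrained Newton iteration: at each iterate,
    the direction d solves the KKT system of
        min_d  0.5 d' H d + g' d   s.t.  A (x + d) = b.
    """
    x = np.asarray(x0, dtype=float)
    m, n = A.shape
    for k in range(maxiter):
        g, H = grad(x), hess(x)
        # KKT system: [H A'; A 0] [d; lam] = [-g; b - A x]
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])
        rhs = np.concatenate([-g, b - A @ x])
        d = np.linalg.solve(K, rhs)[:n]
        if np.linalg.norm(d) <= eps:
            break
        x = x + d  # full Newton step (no line search in this sketch)
    return x, k

# Hypothetical example: min 0.5 ||x||^2  s.t.  x1 + x2 = 1
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
grad = lambda x: x
hess = lambda x: np.eye(2)
x_star, niter = constrained_newton(grad, hess, A, b, np.array([5.0, -3.0]))
print(x_star)  # → [0.5 0.5]
```

For this quadratic objective the method converges in one Newton step, as expected; the iterate is feasible (A x = b) after the first projection.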
function modifiedCholesky(in A)
Given a symmetric matrix \f$A\f$, provide a lower triangular matrix \f$L\f$ and a real \f$\tau \geq 0\f$ such that \f$A + \tau I = L L^T\f$.
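A common way to realize such a factorization is to inflate the diagonal until a standard Cholesky factorization succeeds. The following Python sketch illustrates that idea; it is not the modifiedCholesky.m implementation, and the parameter names (tau0, beta) are hypothetical.

```python
import numpy as np

def modified_cholesky(A, tau0=1e-3, beta=2.0):
    """Find tau >= 0 such that A + tau*I is positive definite and
    return its Cholesky factor L, i.e. L L' = A + tau*I."""
    tau = 0.0
    while True:
        try:
            L = np.linalg.cholesky(A + tau * np.eye(A.shape[0]))
            return L, tau
        except np.linalg.LinAlgError:
            # A + tau*I not positive definite: increase the shift
            tau = max(tau0, beta * tau)

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])   # indefinite (eigenvalues 3 and -1)
L, tau = modified_cholesky(A)
```

Here a strictly positive shift is required, and the returned factor satisfies L L' = A + tau I by construction.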
function constrainedNewton(in obj, in A, in b, in x0, in eps, in gamma)
Applies the preconditioned projected gradient (constrained Newton) method to solve \f$\min_x f(x)\f$ subject to \f$Ax = b\f$.
function lineSearch(in obj, in x, in d, in alpha0, in beta1, in beta2, in lambda, in printlevel)
Applies the line search algorithm to find a step along direction \f$d\f$ that verifies the Wolfe conditions.
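To illustrate what such a routine computes, here is a Python sketch of an expansion/bisection Wolfe line search; the parameters mirror the call lineSearch(obj,xk,dk,1,1.0e-4,0.99,2,0) in the listing above, but the actual lineSearch.m may differ in detail, and all names here are hypothetical.

```python
import numpy as np

def line_search_wolfe(fobj, x, d, alpha0=1.0, beta1=1e-4, beta2=0.99, lam=2.0):
    """Find a step alpha satisfying the Wolfe conditions along d.
    fobj(x) must return (f(x), grad f(x))."""
    f0, g0 = fobj(x)
    slope = g0 @ d                       # directional derivative at x
    lo, hi, alpha = 0.0, np.inf, alpha0
    while True:
        f, g = fobj(x + alpha * d)
        if f > f0 + beta1 * alpha * slope:
            hi = alpha                   # sufficient decrease violated: shrink
        elif g @ d < beta2 * slope:
            lo = alpha                   # curvature condition violated: grow
        else:
            return alpha                 # both Wolfe conditions hold
        alpha = (lo + hi) / 2 if np.isfinite(hi) else lam * alpha

# Hypothetical example: f(x) = 0.5 ||x||^2, descent direction d = -x
f = lambda z: (0.5 * z @ z, z)
alpha = line_search_wolfe(f, np.array([1.0, 0.0]), np.array([-1.0, 0.0]))
```

For this quadratic, the unit step already satisfies both conditions, so the initial alpha0 = 1 is accepted.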