Optimization: principles and algorithms, by Michel Bierlaire
ex1905.m
%> \file
%> \f[\min f(x)=2(x_1^2+x_2^2-1)-x_1\f] subject to \f[x_1^2 + x_2^2 = 1 \f]
%> @author <a href="http://people.epfl.ch/michel.bierlaire">Michel Bierlaire</a>
%> @date Mon Mar 23 15:42:51 2015
%> @ingroup Examples
%> @ingroup chap19

%> @param x value of the variables
%> @param index If 0, the objective function is evaluated. Otherwise, constraint number index is evaluated.
%> @return f value of the function
%> @return g value of the gradient
%> @return H value of the Hessian
function [f,g,H] = ex1905(x,index)
  if (index == 0)
    f = 2.0 * (x(1) * x(1) + x(2) * x(2) - 1.0) - x(1) ;
    g = [ 4.0 * x(1) - 1.0 ; 4.0 * x(2) ] ;
    H = [ 4 0 ; 0 4 ] ;
    return ;
  endif
  if (index == 1)
    f = x(1) * x(1) + x(2) * x(2) - 1.0 ;
    g = [ 2.0 * x(1) ; 2.0 * x(2) ] ;
    H = [ 2 0 ; 0 2 ] ;
    return ;
  endif
  error("There is only one constraint") ;
endfunction
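As a usage sketch (an assumption, not part of the original file: it presumes ex1905.m is on the Octave path), the function can be evaluated at a point on the unit circle such as x = (1, 0), using index 0 for the objective and index 1 for the single constraint:

% Usage sketch (assumption: ex1905.m is available on the Octave path).
% index = 0 evaluates the objective, index = 1 the single constraint.
x = [ 1.0 ; 0.0 ] ;               % a point on the unit circle
[f, g, H] = ex1905(x, 0) ;        % objective: f = -1, g = [3 ; 0], H = [4 0 ; 0 4]
[c, gc, Hc] = ex1905(x, 1) ;      % constraint: c = 0, so x is feasible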
Copyright 2015-2018 Michel Bierlaire