- minimize(func, x0, gradient=None, hessian=None, algorithm='default', **args)
- This function is an interface to a variety of algorithms for computing
the minimum of a function of several variables.
INPUT:
- ``func`` -- either a symbolic function or a Python function whose
argument is a tuple with `n` components
- ``x0`` -- initial point for finding the minimum
- ``gradient`` -- Optional gradient function. This will be computed
automatically for symbolic functions. For Python functions, it allows
the use of algorithms requiring derivatives. It should accept a
tuple of arguments and return a NumPy array containing the partial
derivatives at that point.
- ``hessian`` -- Optional Hessian function. This will be computed
automatically for symbolic functions. For Python functions, it allows
the use of algorithms requiring derivatives. It should accept a tuple
of arguments and return a NumPy array containing the second partial
derivatives of the function at that point.
- ``algorithm`` -- String specifying the algorithm to use. The default
is the simplex method for Python functions and BFGS for symbolic
functions. Options are:
- ``'default'``
- ``'simplex'``
- ``'powell'``
- ``'bfgs'`` -- (Broyden-Fletcher-Goldfarb-Shanno) requires
``gradient``
- ``'cg'`` -- (conjugate gradient) requires ``gradient``
- ``'ncg'`` -- (Newton conjugate gradient) requires ``gradient`` and ``hessian``
EXAMPLES::
sage: vars = var('x y z')
sage: f = 100*(y-x^2)^2 + (1-x)^2 + 100*(z-y^2)^2 + (1-y)^2
sage: minimize(f, [.1,.3,.4], disp=0)
(1.00..., 1.00..., 1.00...)
sage: minimize(f, [.1,.3,.4], algorithm="ncg", disp=0)
(0.9999999..., 0.999999..., 0.999999...)
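The conjugate gradient algorithm can be selected in the same way. Since
the printed digits can vary between SciPy versions, this sketch captures
the result and checks it against the known minimum at `(1, 1, 1)`; the
tolerance is illustrative::
sage: sol = minimize(f, [.1,.3,.4], algorithm="cg", disp=0)
sage: all(abs(c - 1) < 1e-4 for c in sol)
True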
Same example with just Python functions::
sage: def rosen(x): # The Rosenbrock function
....:     return sum(100.0r*(x[1r:]-x[:-1r]**2.0r)**2.0r + (1r-x[:-1r])**2.0r)
sage: minimize(rosen, [.1,.3,.4], disp=0)
(1.00..., 1.00..., 1.00...)
Same example with a pure Python function and a Python function to
compute the gradient::
sage: def rosen(x): # The Rosenbrock function
....:     return sum(100.0r*(x[1r:]-x[:-1r]**2.0r)**2.0r + (1r-x[:-1r])**2.0r)
sage: from numpy import zeros
sage: def rosen_der(x):
....:     xm = x[1r:-1r]
....:     xm_m1 = x[:-2r]
....:     xm_p1 = x[2r:]
....:     der = zeros(x.shape, dtype=float)
....:     der[1r:-1r] = 200r*(xm-xm_m1**2r) - 400r*(xm_p1-xm**2r)*xm - 2r*(1r-xm)
....:     der[0r] = -400r*x[0r]*(x[1r]-x[0r]**2r) - 2r*(1r-x[0r])
....:     der[-1r] = 200r*(x[-1r]-x[-2r]**2r)
....:     return der
sage: minimize(rosen, [.1,.3,.4], gradient=rosen_der, algorithm="bfgs", disp=0)
(1.00..., 1.00..., 1.00...)
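Same example again, now also supplying a Python function for the
``hessian`` so that the ``'ncg'`` algorithm can use exact second
derivatives. This is a sketch: the helper name ``rosen_hess`` is
illustrative (it builds the standard tridiagonal Hessian of the
Rosenbrock function), and the result is checked against the known
minimum rather than matched against exact doctest output::
sage: from numpy import diag, zeros_like
sage: def rosen_hess(x):
....:     H = diag(-400.0r*x[:-1r], 1r) + diag(-400.0r*x[:-1r], -1r)
....:     d = zeros_like(x)
....:     d[0r] = 1200.0r*x[0r]**2r - 400.0r*x[1r] + 2.0r
....:     d[-1r] = 200.0r
....:     d[1r:-1r] = 202.0r + 1200.0r*x[1r:-1r]**2r - 400.0r*x[2r:]
....:     return H + diag(d)
sage: sol = minimize(rosen, [.1,.3,.4], gradient=rosen_der, hessian=rosen_hess, algorithm="ncg", disp=0)
sage: all(abs(c - 1) < 1e-4 for c in sol)
True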