Copyright | (c) Masahiro Sakai 2023 |
---|---|
License | BSD-style |
Maintainer | masahiro.sakai@gmail.com |
Stability | provisional |
Portability | non-portable |
Safe Haskell | Safe-Inferred |
Language | Haskell2010 |
This module aims to provide a unified interface to various numerical optimization algorithms, like scipy.optimize in Python.
In this module you need to explicitly provide the functions to calculate gradients, but you can use numeric-optimization-ad or numeric-optimization-backprop to define them using automatic differentiation.
Synopsis
- minimize :: forall prob. (IsProblem prob, Optionally (HasGrad prob), Optionally (HasHessian prob)) => Method -> Params (Vector Double) -> prob -> Vector Double -> IO (Result (Vector Double))
- class IsProblem prob where
- class IsProblem prob => HasGrad prob where
- class IsProblem prob => HasHessian prob where
- data Constraint
- boundsUnconstrained :: Int -> Vector (Double, Double)
- isUnconstainedBounds :: Vector (Double, Double) -> Bool
- data WithGrad prob = WithGrad prob (Vector Double -> Vector Double)
- data WithHessian prob = WithHessian prob (Vector Double -> Matrix Double)
- data WithBounds prob = WithBounds prob (Vector (Double, Double))
- data WithConstraints prob = WithConstraints prob [Constraint]
- data Method
- isSupportedMethod :: Method -> Bool
- data Params a = Params {}
- data Result a = Result {
- resultSuccess :: Bool
- resultMessage :: String
- resultSolution :: a
- resultValue :: Double
- resultGrad :: Maybe a
- resultHessian :: Maybe (Matrix Double)
- resultHessianInv :: Maybe (Matrix Double)
- resultStatistics :: Statistics
- data Statistics = Statistics {}
- data OptimizationException
- class Default a where
- def :: a
- class Optionally c where
- optionalDict :: Maybe (Dict c)
- hasOptionalDict :: c => Maybe (Dict c)
Main function
:: forall prob. (IsProblem prob, Optionally (HasGrad prob), Optionally (HasHessian prob)) | |
=> Method | Numerical optimization algorithm to use |
-> Params (Vector Double) | Parameters for optimization algorithms. Use def as the default. |
-> prob | Optimization problem to solve |
-> Vector Double | Initial value |
-> IO (Result (Vector Double)) |
Minimization of scalar function of one or more variables.
This function is intended to provide functionality similar to Python's scipy.optimize.minimize.
Example:
{-# LANGUAGE OverloadedLists #-}
import Data.Vector.Storable (Vector)
import Numeric.Optimization

main :: IO ()
main = do
  result <- minimize LBFGS def (WithGrad rosenbrock rosenbrock') [-3,-4]
  print (resultSuccess result)   -- True
  print (resultSolution result)  -- [0.999999999009131,0.9999999981094296]
  print (resultValue result)     -- 1.8129771632403013e-18

-- https://en.wikipedia.org/wiki/Rosenbrock_function
rosenbrock :: Vector Double -> Double
rosenbrock [x,y] = sq (1 - x) + 100 * sq (y - sq x)

rosenbrock' :: Vector Double -> Vector Double
rosenbrock' [x,y] =
  [ 2 * (1 - x) * (-1) + 100 * 2 * (y - sq x) * (-2) * x
  , 100 * 2 * (y - sq x)
  ]

sq :: Floating a => a -> a
sq x = x ** 2
Problem specification
Problems are specified by types that are instances of the IsProblem type class.
In the simplest case, Vector Double -> Double is an instance of the IsProblem class. That is enough if your problem does not have constraints and the selected algorithm does not need further information (e.g. gradients and Hessians).
You can equip a problem with other information using wrapper types:
If you need further flexibility or a more efficient implementation, you can define an instance of IsProblem yourself.
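For example, here is a sketch of layering wrapper types over a plain function, reusing the Rosenbrock function and its gradient from the example above; the bounds of [-10, 10] on both variables are purely illustrative:

```haskell
{-# LANGUAGE OverloadedLists #-}
import Data.Vector.Storable (Vector)
import Numeric.Optimization

-- https://en.wikipedia.org/wiki/Rosenbrock_function
rosenbrock :: Vector Double -> Double
rosenbrock [x, y] = (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

rosenbrock' :: Vector Double -> Vector Double
rosenbrock' [x, y] =
  [ 2 * (1 - x) * (-1) + 100 * 2 * (y - x ** 2) * (-2) * x
  , 100 * 2 * (y - x ** 2)
  ]

-- A plain function is already an IsProblem instance; the wrappers
-- successively equip it with a gradient and with (illustrative) bounds.
problem :: WithBounds (WithGrad (Vector Double -> Double))
problem = WithBounds (WithGrad rosenbrock rosenbrock') [(-10, 10), (-10, 10)]

main :: IO ()
main = do
  result <- minimize LBFGS def problem [-3, -4]
  print (resultSolution result)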
class IsProblem prob where Source #
Optimization problems
func :: prob -> Vector Double -> Double Source #
Objective function
It is called fun in scipy.optimize.minimize.
bounds :: prob -> Maybe (Vector (Double, Double)) Source #
Bounds
constraints :: prob -> [Constraint] Source #
Constraints
Instances
IsProblem prob => IsProblem (WithBounds prob) Source # | |
Defined in Numeric.Optimization func :: WithBounds prob -> Vector Double -> Double Source # bounds :: WithBounds prob -> Maybe (Vector (Double, Double)) Source # constraints :: WithBounds prob -> [Constraint] Source # | |
IsProblem prob => IsProblem (WithConstraints prob) Source # | |
Defined in Numeric.Optimization func :: WithConstraints prob -> Vector Double -> Double Source # bounds :: WithConstraints prob -> Maybe (Vector (Double, Double)) Source # constraints :: WithConstraints prob -> [Constraint] Source # | |
IsProblem prob => IsProblem (WithGrad prob) Source # | |
IsProblem prob => IsProblem (WithHessian prob) Source # | |
Defined in Numeric.Optimization func :: WithHessian prob -> Vector Double -> Double Source # bounds :: WithHessian prob -> Maybe (Vector (Double, Double)) Source # constraints :: WithHessian prob -> [Constraint] Source # | |
IsProblem (Vector Double -> Double) Source # | |
class IsProblem prob => HasGrad prob where Source #
Optimization problem equipped with gradient information
grad :: prob -> Vector Double -> Vector Double Source #
Gradient of a function computed by func
It is called jac in scipy.optimize.minimize.
grad' :: prob -> Vector Double -> (Double, Vector Double) Source #
grad'M :: PrimMonad m => prob -> Vector Double -> MVector (PrimState m) Double -> m Double Source #
Similar to grad', but in destination-passing style: the gradient is written into the supplied mutable vector and the function value is returned.
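To illustrate the destination-passing style, here is a sketch of a grad'M definition for a hypothetical problem type (MyProblem, myFunc, and myGrad are illustrative names, not part of the library):

```haskell
import Control.Monad.Primitive (PrimMonad, PrimState)
import Data.Vector.Storable (Vector)
import qualified Data.Vector.Storable as VS
import Data.Vector.Storable.Mutable (MVector)

-- A hypothetical problem carrying an objective and its gradient.
data MyProblem = MyProblem
  { myFunc :: Vector Double -> Double
  , myGrad :: Vector Double -> Vector Double
  }

-- Write the gradient into the caller-supplied mutable vector and
-- return the objective value, avoiding an extra result allocation.
myGrad'M :: PrimMonad m
         => MyProblem -> Vector Double -> MVector (PrimState m) Double -> m Double
myGrad'M prob x gvec = do
  VS.copy gvec (myGrad prob x)
  return (myFunc prob x)
```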
Instances
HasGrad prob => HasGrad (WithBounds prob) Source # | |
HasGrad prob => HasGrad (WithConstraints prob) Source # | |
Defined in Numeric.Optimization | |
IsProblem prob => HasGrad (WithGrad prob) Source # | |
HasGrad prob => HasGrad (WithHessian prob) Source # | |
Optionally (HasGrad prob) => Optionally (HasGrad (WithBounds prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasGrad (WithBounds prob))) Source # | |
Optionally (HasGrad prob) => Optionally (HasGrad (WithConstraints prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasGrad (WithConstraints prob))) Source # | |
IsProblem prob => Optionally (HasGrad (WithGrad prob)) Source # | |
Defined in Numeric.Optimization | |
Optionally (HasGrad prob) => Optionally (HasGrad (WithHessian prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasGrad (WithHessian prob))) Source # | |
Optionally (HasGrad (Vector Double -> Double)) Source # | |
Defined in Numeric.Optimization |
class IsProblem prob => HasHessian prob where Source #
Optimization problem equipped with hessian information
hessian :: prob -> Vector Double -> Matrix Double Source #
Hessian of a function computed by func
It is called hess in scipy.optimize.minimize.
hessianProduct :: prob -> Vector Double -> Vector Double -> Vector Double Source #
The product of the Hessian H of a function f at x with a vector v.
It is called hessp in scipy.optimize.minimize.
See also https://hackage.haskell.org/package/ad-4.5.4/docs/Numeric-AD.html#v:hessianProduct.
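When only hessian is available, one straightforward (though not allocation-free) way to get a Hessian-vector product is to materialize the Hessian and multiply. A sketch, assuming the Matrix type here is hmatrix's and using its #> matrix-vector product:

```haskell
import Numeric.LinearAlgebra (Matrix, Vector, (#>))

-- Fallback Hessian-vector product: build H at x, then compute H v.
-- A specialized implementation can often compute H v without forming H.
hessianProductViaMatrix
  :: (Vector Double -> Matrix Double)  -- Hessian at a point
  -> Vector Double                     -- point x
  -> Vector Double                     -- vector v
  -> Vector Double
hessianProductViaMatrix hess x v = hess x #> v
```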
Instances
data Constraint Source #
Type of constraint
Currently, no constraints are supported.
boundsUnconstrained :: Int -> Vector (Double, Double) Source #
Bounds for unconstrained problems, i.e. (-∞,+∞).
isUnconstainedBounds :: Vector (Double, Double) -> Bool Source #
Whether all lower bounds are -∞ and all upper bounds are +∞.
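As a small sketch of how these two utilities combine (the bound of [0, 1] on the first variable is illustrative):

```haskell
import qualified Data.Vector as V
import Numeric.Optimization

main :: IO ()
main = do
  -- All variables unbounded: every pair is (-inf, +inf).
  let free = boundsUnconstrained 3
  print (isUnconstainedBounds free)  -- True
  -- Restrict the first variable to [0, 1]; the bounds are no longer trivial.
  let bs = free V.// [(0, (0, 1))]
  print (isUnconstainedBounds bs)    -- False
```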
Wrapper types
data WithGrad prob Source #
Wrapper type for adding a gradient function to a problem
WithGrad prob (Vector Double -> Vector Double) |
Instances
IsProblem prob => HasGrad (WithGrad prob) Source # | |
HasHessian prob => HasHessian (WithGrad prob) Source # | |
IsProblem prob => IsProblem (WithGrad prob) Source # | |
IsProblem prob => Optionally (HasGrad (WithGrad prob)) Source # | |
Defined in Numeric.Optimization | |
Optionally (HasHessian prob) => Optionally (HasHessian (WithGrad prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasHessian (WithGrad prob))) Source # |
data WithHessian prob Source #
Wrapper type for adding a Hessian to a problem
WithHessian prob (Vector Double -> Matrix Double) |
Instances
HasGrad prob => HasGrad (WithHessian prob) Source # | |
IsProblem prob => HasHessian (WithHessian prob) Source # | |
Defined in Numeric.Optimization | |
IsProblem prob => IsProblem (WithHessian prob) Source # | |
Defined in Numeric.Optimization func :: WithHessian prob -> Vector Double -> Double Source # bounds :: WithHessian prob -> Maybe (Vector (Double, Double)) Source # constraints :: WithHessian prob -> [Constraint] Source # | |
Optionally (HasGrad prob) => Optionally (HasGrad (WithHessian prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasGrad (WithHessian prob))) Source # | |
IsProblem prob => Optionally (HasHessian (WithHessian prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasHessian (WithHessian prob))) Source # |
data WithBounds prob Source #
Wrapper type for adding bounds to a problem
WithBounds prob (Vector (Double, Double)) |
Instances
HasGrad prob => HasGrad (WithBounds prob) Source # | |
HasHessian prob => HasHessian (WithBounds prob) Source # | |
Defined in Numeric.Optimization | |
IsProblem prob => IsProblem (WithBounds prob) Source # | |
Defined in Numeric.Optimization func :: WithBounds prob -> Vector Double -> Double Source # bounds :: WithBounds prob -> Maybe (Vector (Double, Double)) Source # constraints :: WithBounds prob -> [Constraint] Source # | |
Optionally (HasGrad prob) => Optionally (HasGrad (WithBounds prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasGrad (WithBounds prob))) Source # | |
Optionally (HasHessian prob) => Optionally (HasHessian (WithBounds prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasHessian (WithBounds prob))) Source # |
data WithConstraints prob Source #
Wrapper type for adding constraints to a problem
WithConstraints prob [Constraint] |
Instances
HasGrad prob => HasGrad (WithConstraints prob) Source # | |
Defined in Numeric.Optimization | |
HasHessian prob => HasHessian (WithConstraints prob) Source # | |
Defined in Numeric.Optimization | |
IsProblem prob => IsProblem (WithConstraints prob) Source # | |
Defined in Numeric.Optimization func :: WithConstraints prob -> Vector Double -> Double Source # bounds :: WithConstraints prob -> Maybe (Vector (Double, Double)) Source # constraints :: WithConstraints prob -> [Constraint] Source # | |
Optionally (HasGrad prob) => Optionally (HasGrad (WithConstraints prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasGrad (WithConstraints prob))) Source # | |
Optionally (HasHessian prob) => Optionally (HasHessian (WithConstraints prob)) Source # | |
Defined in Numeric.Optimization optionalDict :: Maybe (Dict (HasHessian (WithConstraints prob))) Source # |
Algorithm selection
data Method Source #
Selection of numerical optimization algorithms
CGDescent | Conjugate gradient method based on Hager and Zhang [1]. The implementation is provided by the nonlinear-optimization package [3], which is a binding library for [2]. This method requires the gradient but does not require the Hessian. |
LBFGS | Limited-memory BFGS (L-BFGS) algorithm [1]. The implementation is provided by the lbfgs package [2], which is a binding for liblbfgs [3]. This method requires the gradient but does not require the Hessian. |
Newton | Native implementation of the Newton method. This method requires both the gradient and the Hessian. |
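For instance, the Newton method can be used by supplying a hand-derived Hessian via the wrapper types. A sketch (the Rosenbrock Hessian below is derived by hand, and the Matrix type is assumed to be hmatrix's):

```haskell
{-# LANGUAGE OverloadedLists #-}
import Data.Vector.Storable (Vector)
import Numeric.LinearAlgebra (Matrix, (><))
import Numeric.Optimization

rosenbrock :: Vector Double -> Double
rosenbrock [x, y] = (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

rosenbrock' :: Vector Double -> Vector Double
rosenbrock' [x, y] =
  [ -2 * (1 - x) - 400 * x * (y - x ** 2)
  , 200 * (y - x ** 2)
  ]

-- Hand-derived Hessian of the Rosenbrock function.
rosenbrock'' :: Vector Double -> Matrix Double
rosenbrock'' [x, y] =
  (2 >< 2)
    [ 2 - 400 * y + 1200 * x ** 2, -400 * x
    , -400 * x                   , 200
    ]

main :: IO ()
main = do
  result <- minimize Newton def
              (WithHessian (WithGrad rosenbrock rosenbrock') rosenbrock'')
              [-3, -4]
  print (resultSuccess result)
  print (resultSolution result)
```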
isSupportedMethod :: Method -> Bool Source #
Whether a Method
is supported under the current environment.
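Support for a given Method can depend on how the package was built (for example, on optional bindings), so a program can pick the first available method at runtime. A sketch, assuming Method has a Show instance:

```haskell
import Data.Maybe (listToMaybe)
import Numeric.Optimization

-- Pick the first method supported in the current environment.
chooseMethod :: [Method] -> Maybe Method
chooseMethod = listToMaybe . filter isSupportedMethod

main :: IO ()
main =
  case chooseMethod [CGDescent, LBFGS, Newton] of
    Just m  -> putStrLn ("using " ++ show m)
    Nothing -> putStrLn "no supported method in this build"
```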
data Params a Source #
Parameters for optimization algorithms
TODO:
- How to pass algorithm-specific parameters?
- Separate callback from other, more concrete, serializable parameters?
Result
data Result a Source #
Optimization result
data OptimizationException Source #
The bad things that can happen when you use the library.
Instances
Exception OptimizationException Source # | |
Show OptimizationException Source # | |
Defined in Numeric.Optimization showsPrec :: Int -> OptimizationException -> ShowS # show :: OptimizationException -> String # showList :: [OptimizationException] -> ShowS # |
Utilities and Re-export
class Default a where Source #
A class for types with a default value.
Nothing
Instances
class Optionally c where Source #
Optional constraint
optionalDict :: Maybe (Dict c) Source #
Instances
hasOptionalDict :: c => Maybe (Dict c) Source #
Utility function to define Optionally instances