Copyright | (c) Masahiro Sakai 2023 |
---|---|
License | BSD-style |
Maintainer | masahiro.sakai@gmail.com |
Stability | provisional |
Portability | non-portable |
Safe Haskell | Safe-Inferred |
Language | Haskell2010 |
This module is a wrapper around Numeric.Optimization that uses the ad package for automatic differentiation.
Synopsis
- minimize :: forall f. Traversable f => Method -> Params (f Double) -> (forall s. Reifies s Tape => f (Reverse s Double) -> Reverse s Double) -> Maybe (f (Double, Double)) -> [Constraint] -> f Double -> IO (Result (f Double))
- minimizeReverse :: forall f. Traversable f => Method -> Params (f Double) -> (forall s. Reifies s Tape => f (Reverse s Double) -> Reverse s Double) -> Maybe (f (Double, Double)) -> [Constraint] -> f Double -> IO (Result (f Double))
- minimizeSparse :: forall f. Traversable f => Method -> Params (f Double) -> (forall s. f (AD s (Sparse Double)) -> AD s (Sparse Double)) -> Maybe (f (Double, Double)) -> [Constraint] -> f Double -> IO (Result (f Double))
- data Constraint
- data Method
- isSupportedMethod :: Method -> Bool
- data Params a = Params {}
- data Result a = Result {
- resultSuccess :: Bool
- resultMessage :: String
- resultSolution :: a
- resultValue :: Double
- resultGrad :: Maybe a
- resultHessian :: Maybe (Matrix Double)
- resultHessianInv :: Maybe (Matrix Double)
- resultStatistics :: Statistics
- data Statistics = Statistics {}
- data OptimizationException
- class Default a where
- def :: a
- data AD s a
- auto :: Mode t => Scalar t -> t
- data Reverse s a
- class Reifies (s :: k) a | s -> a
- data Tape
- data Sparse a
Main function
minimize
:: forall f. Traversable f | |
=> Method | Numerical optimization algorithm to use |
-> Params (f Double) | Parameters for optimization algorithms. Use def as a default. |
-> (forall s. Reifies s Tape => f (Reverse s Double) -> Reverse s Double) | Function to be minimized. |
-> Maybe (f (Double, Double)) | Bounds |
-> [Constraint] | Constraints |
-> f Double | Initial value |
-> IO (Result (f Double)) |
Synonym of minimizeReverse
minimizeReverse
:: forall f. Traversable f | |
=> Method | Numerical optimization algorithm to use |
-> Params (f Double) | Parameters for optimization algorithms. Use def as a default. |
-> (forall s. Reifies s Tape => f (Reverse s Double) -> Reverse s Double) | Function to be minimized. |
-> Maybe (f (Double, Double)) | Bounds |
-> [Constraint] | Constraints |
-> f Double | Initial value |
-> IO (Result (f Double)) |
Minimization of scalar function of one or more variables.
This is a wrapper of minimize that uses Numeric.AD.Mode.Reverse to compute the gradient.
It cannot be used with methods that require the Hessian (e.g. Newton).
Example:
{-# LANGUAGE FlexibleContexts #-}
import Numeric.Optimization.AD

main :: IO ()
main = do
  result <- minimizeReverse LBFGS def rosenbrock Nothing [] [-3,-4]
  print (resultSuccess result)  -- True
  print (resultSolution result)  -- [0.999999999009131,0.9999999981094296]
  print (resultValue result)  -- 1.8129771632403013e-18

-- https://en.wikipedia.org/wiki/Rosenbrock_function
rosenbrock :: Floating a => [a] -> a
-- rosenbrock :: Reifies s Tape => [Reverse s Double] -> Reverse s Double
rosenbrock [x,y] = sq (1 - x) + 100 * sq (y - sq x)

sq :: Floating a => a -> a
sq x = x ** 2
minimizeSparse
:: forall f. Traversable f | |
=> Method | Numerical optimization algorithm to use |
-> Params (f Double) | Parameters for optimization algorithms. Use def as a default. |
-> (forall s. f (AD s (Sparse Double)) -> AD s (Sparse Double)) | Function to be minimized. |
-> Maybe (f (Double, Double)) | Bounds |
-> [Constraint] | Constraints |
-> f Double | Initial value |
-> IO (Result (f Double)) |
Minimization of scalar function of one or more variables.
This is a wrapper of minimize that uses Numeric.AD.Mode.Sparse to compute the gradient and Hessian.
Unlike minimizeReverse, it can be used with methods that require the Hessian (e.g. Newton).
Example:
{-# LANGUAGE FlexibleContexts #-}
import Numeric.Optimization.AD

main :: IO ()
main = do
  result <- minimizeSparse Newton def rosenbrock Nothing [] [-3,-4]
  print (resultSuccess result)  -- True
  print (resultSolution result)  -- [0.9999999999999999,0.9999999999999998]
  print (resultValue result)  -- 1.232595164407831e-32

-- https://en.wikipedia.org/wiki/Rosenbrock_function
rosenbrock :: Floating a => [a] -> a
-- rosenbrock :: [AD s (Sparse Double)] -> AD s (Sparse Double)
rosenbrock [x,y] = sq (1 - x) + 100 * sq (y - sq x)

sq :: Floating a => a -> a
sq x = x ** 2
Problem specification
data Constraint #
Type of constraint
Currently, no constraints are supported.
Algorithm selection
Selection of numerical optimization algorithms
CGDescent | Conjugate gradient method based on Hager and Zhang [1]. The implementation is provided by the nonlinear-optimization package [3], which is a binding to [2]. This method requires the gradient but does not require the Hessian. |
LBFGS | Limited-memory BFGS (L-BFGS) algorithm [1]. The implementation is provided by the lbfgs package [2], which is a binding to liblbfgs [3]. This method requires the gradient but does not require the Hessian. |
Newton | Native implementation of Newton's method. This method requires both the gradient and the Hessian. |
isSupportedMethod :: Method -> Bool #
Whether a Method
is supported under the current environment.
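Availability of a Method can depend on how the underlying binding packages were built, so a caller may want to pick the first supported method from a preference list at runtime. A minimal sketch, assuming Method has an Eq instance; chooseMethod is an illustrative helper, not part of the library:

```haskell
import Numeric.Optimization.AD

-- Pick the first method from a preference list that is supported
-- in the current environment, falling back to a given default.
chooseMethod :: Method -> [Method] -> Method
chooseMethod fallback prefs =
  case filter isSupportedMethod prefs of
    (m : _) -> m
    []      -> fallback
```

For example, `chooseMethod LBFGS [Newton, CGDescent]` prefers Newton, but degrades gracefully when neither preferred method is available.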
Parameters for optimization algorithms
TODO:
- How to pass algorithm specific parameters?
- Separate callback from other, more concrete, serializable parameters?
Result
Optimization result
Result | |
data OptimizationException #
The bad things that can happen when you use the library.
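Because OptimizationException has an Exception instance, it can be caught with the usual Control.Exception machinery when a caller prefers an Either value over a thrown exception. A minimal sketch; tryMinimize is an illustrative helper, not part of the library:

```haskell
import Control.Exception (try)
import Numeric.Optimization.AD

-- Run an optimization action, converting a thrown
-- OptimizationException into a Left value.
tryMinimize :: IO (Result a) -> IO (Either OptimizationException (Result a))
tryMinimize = try
```

Note that a run that merely fails to converge is reported through resultSuccess and resultMessage, not as an exception.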
Instances
Exception OptimizationException | |
Show OptimizationException | |
Defined in Numeric.Optimization
showsPrec :: Int -> OptimizationException -> ShowS #
show :: OptimizationException -> String #
showList :: [OptimizationException] -> ShowS # |
Utilities and Re-exports
A class for types with a default value.
Minimal complete definition: Nothing
Instances
data AD s a #
Instances
Mode a => Mode (AD s a) | |
Defined in Numeric.AD.Internal.Type | |
Bounded a => Bounded (AD s a) | |
Enum a => Enum (AD s a) | |
Defined in Numeric.AD.Internal.Type | |
Floating a => Floating (AD s a) | |
RealFloat a => RealFloat (AD s a) | |
Defined in Numeric.AD.Internal.Type
floatRadix :: AD s a -> Integer #
floatDigits :: AD s a -> Int #
floatRange :: AD s a -> (Int, Int) #
decodeFloat :: AD s a -> (Integer, Int) #
encodeFloat :: Integer -> Int -> AD s a #
significand :: AD s a -> AD s a #
scaleFloat :: Int -> AD s a -> AD s a #
isInfinite :: AD s a -> Bool #
isDenormalized :: AD s a -> Bool #
isNegativeZero :: AD s a -> Bool # | |
Num a => Num (AD s a) | |
Read a => Read (AD s a) | |
Fractional a => Fractional (AD s a) | |
Real a => Real (AD s a) | |
Defined in Numeric.AD.Internal.Type toRational :: AD s a -> Rational # | |
RealFrac a => RealFrac (AD s a) | |
Show a => Show (AD s a) | |
Erf a => Erf (AD s a) | |
InvErf a => InvErf (AD s a) | |
Eq a => Eq (AD s a) | |
Ord a => Ord (AD s a) | |
type Scalar (AD s a) | |
Defined in Numeric.AD.Internal.Type |
data Reverse s a #
Instances
class Reifies (s :: k) a | s -> a #
Instances
KnownNat n => Reifies (n :: Nat) Integer | |
Defined in Data.Reflection | |
KnownSymbol n => Reifies (n :: Symbol) String | |
Defined in Data.Reflection | |
Reifies Z Int | |
Defined in Data.Reflection | |
Reifies n Int => Reifies (D n :: Type) Int | |
Defined in Data.Reflection | |
Reifies n Int => Reifies (PD n :: Type) Int | |
Defined in Data.Reflection | |
Reifies n Int => Reifies (SD n :: Type) Int | |
Defined in Data.Reflection | |
(B b0, B b1, B b2, B b3, B b4, B b5, B b6, B b7, w0 ~ W b0 b1 b2 b3, w1 ~ W b4 b5 b6 b7) => Reifies (Stable w0 w1 a :: Type) a | |
Defined in Data.Reflection |
data Sparse a #
We only store partials in sorted order, so the map contained in a partial will only contain partials with equal or greater keys to that of the map in which it was found. This should be key for efficiently computing sparse Hessians.
There are only (n + k - 1) choose (k - 1) distinct nth partial derivatives of a function with k inputs.
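The count of distinct nth-order partials is the number of multisets of size n drawn from the k inputs. A small sketch of that count; countPartials is an illustrative name, not part of the library:

```haskell
-- Distinct nth-order partial derivatives of a function of k
-- variables: one per multiset of size n over the k variables,
-- i.e. (n + k - 1) `choose` n.
countPartials :: Integer -> Integer -> Integer
countPartials n k = choose (n + k - 1) n
  where
    choose m r = product [m - r + 1 .. m] `div` product [1 .. r]

main :: IO ()
main = print (map (\n -> countPartials n 2) [0 .. 3])  -- [1,2,3,4]
```

For k = 2, this gives n + 1 distinct nth partials, matching the familiar mixed-partial symmetry for two variables.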