Safe Haskell: Safe-Inferred
- conjGrad :: (Num a, RealFloat a, Additive f, Metric f) => LineSearch f a -> Beta f a -> (f a -> f a) -> f a -> [f a]
- module Optimization.LineSearch
- type Beta f a = f a -> f a -> f a -> a
- fletcherReeves :: (Num a, RealFloat a, Metric f) => Beta f a
- polakRibiere :: (Num a, RealFloat a, Metric f) => Beta f a
- hestenesStiefel :: (Num a, RealFloat a, Metric f) => Beta f a
Conjugate gradient methods
conjGrad :: (Num a, RealFloat a, Additive f, Metric f) => LineSearch f a -> Beta f a -> (f a -> f a) -> f a -> [f a]
Conjugate gradient method with given beta and line search method
The conjugate gradient method avoids the trouble encountered by the
steepest descent method on poorly conditioned problems (e.g. those whose
Hessian has a wide range of eigenvalues). It does this by choosing search
directions that satisfy a condition of A-orthogonality (conjugacy),
ensuring that steps in the unstretched search space are mutually
orthogonal.
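The iteration behind conjGrad can be sketched self-contained on plain pairs rather than the library's Additive/Metric vectors. A minimal sketch, assuming a 2-dimensional quadratic objective f(x) = 1/2 x·Ax - b·x, an exact line search (available in closed form for quadratics), and the Fletcher–Reeves beta; the names cgStep, cgIterates, mulA and bVec are illustrative only, not part of the library:

```haskell
-- Sketch of a conjugate-gradient iteration on a 2D quadratic,
-- using plain pairs in place of the library's Additive/Metric
-- vectors.  All names here are illustrative.
type V = (Double, Double)

dot :: V -> V -> Double
dot (a, b) (c, d) = a * c + b * d

add :: V -> V -> V
add (a, b) (c, d) = (a + c, b + d)

scale :: Double -> V -> V
scale s (a, b) = (s * a, s * b)

-- Assumed example problem: A = [[4,1],[1,3]], b = (1,2);
-- the minimizer solves A x = b, i.e. x = (1/11, 7/11).
mulA :: V -> V
mulA (x, y) = (4 * x + y, x + 3 * y)

bVec :: V
bVec = (1, 2)

-- One step: advance the iterate, residual (negative gradient)
-- and search direction.
cgStep :: (V, V, V) -> (V, V, V)
cgStep (x, r, p) = (x', r', p')
  where
    ap    = mulA p
    alpha = dot r r / dot p ap        -- exact line search on a quadratic
    x'    = add x (scale alpha p)
    r'    = add r (scale (-alpha) ap)
    beta  = dot r' r' / dot r r       -- Fletcher–Reeves beta
    p'    = add r' (scale beta p)

-- Lazy list of iterates, mirroring conjGrad's [f a] result.
cgIterates :: V -> [V]
cgIterates x0 = map (\(x, _, _) -> x) (iterate cgStep (x0, r0, r0))
  where r0 = add bVec (scale (-1) (mulA x0))
```

On an n-dimensional quadratic with exact line search, conjugate gradient reaches the minimizer in at most n steps; here the third element of cgIterates (0, 0) is already (1/11, 7/11).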
General line search
module Optimization.LineSearch
Beta expressions
type Beta f a = f a -> f a -> f a -> a
A beta expression beta df0 df1 p gives the conjugate direction
contribution, given the derivative df0 and direction p for iteration k,
and the derivative df1 for iteration k+1.
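The exported betas can be sketched concretely by specializing f to plain lists. The formulas below are the textbook Fletcher–Reeves, Polak–Ribière and Hestenes–Stiefel rules, matching the argument order of the Beta type; this is an assumption about the library's definitions (sign conventions can vary between presentations), and the primed names avoid clashing with the real exports:

```haskell
-- Sketch of the three Beta variants specialized to lists,
-- mirroring the shape Beta f a = f a -> f a -> f a -> a.
-- Formulas are the standard ones from the literature.
dot :: [Double] -> [Double] -> Double
dot xs ys = sum (zipWith (*) xs ys)

sub :: [Double] -> [Double] -> [Double]
sub = zipWith (-)

fletcherReeves', polakRibiere', hestenesStiefel'
  :: [Double] -> [Double] -> [Double] -> Double
-- beta df0 df1 p, as in the Beta type
fletcherReeves'  df0 df1 _ = dot df1 df1 / dot df0 df0
polakRibiere'    df0 df1 _ = dot df1 (df1 `sub` df0) / dot df0 df0
hestenesStiefel' df0 df1 p = dot df1 (df1 `sub` df0)
                           / dot p (df1 `sub` df0)
```

All three agree on quadratic objectives with exact line search (where successive gradients are orthogonal); they differ in how they behave on general nonlinear problems.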