Safe Haskell: None
Stochastic gradient descent implementation using mutable vectors for efficient, in-place updates of the parameter vector. The user is given an immutable vector of parameters, so the gradient can be computed outside of the IO monad. Currently only Gaussian priors are implemented.
This is a preliminary version of the SGD library and the API may change in future versions.
Documentation
SGD parameters controlling the learning process.
sgdArgsDefault :: SgdArgs
Default SGD parameter values.
sgd
:: SgdArgs | SGD parameter values |
-> (Para -> Int -> IO ()) | Notification run every update |
-> (Para -> x -> Grad) | Gradient for dataset element |
-> Dataset x | Dataset |
-> Para | Starting point |
-> IO Para | SGD result |
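Given these argument types, a call to `sgd` might be sketched as follows; `notify`, `mkGrad`, `dataset`, and `p0` are hypothetical placeholders supplied by the caller, not library exports.

```
-- Hypothetical invocation sketch: run SGD from starting point p0,
-- reporting progress via notify after every update.
result <- sgd sgdArgsDefault notify mkGrad dataset p0
```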
A stochastic gradient descent method. The notification function can be used to provide the user with information about the progress of the learning.
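The core update performed by such a method can be illustrated with a minimal, self-contained sketch. Note the simplifications relative to this library: the real implementation uses mutable vectors, picks dataset elements stochastically, and decays the gain over time, while the sketch below uses plain lists, the full gradient, and a fixed gain.

```haskell
type Para = [Double]
type Grad = [Double]

-- One gradient descent step: move each parameter against its gradient.
step :: Double -> Para -> Grad -> Para
step gain = zipWith (\p g -> p - gain * g)

-- Run n steps with a fixed gain, recomputing the gradient each time.
run :: Int -> Double -> (Para -> Grad) -> Para -> Para
run n gain gradOf p0 = iterate (\p -> step gain p (gradOf p)) p0 !! n

main :: IO ()
main = do
  -- Minimise f(x, y) = x^2 + y^2, whose gradient is (2x, 2y);
  -- the parameters shrink geometrically towards (0, 0).
  let result = run 100 0.1 (map (2 *)) [1.0, 1.0]
  print result
```

Each step multiplies every parameter by (1 - 0.1 * 2) = 0.8, so after 100 iterations both components are vanishingly small.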
module Numeric.SGD.Grad
module Numeric.SGD.Dataset