HaskellNN-0.1.3: High Performance Neural Network in Haskell

Maintainer: Kiet Lam <ktklam9@gmail.com>

AI.Calculation.Gradients

Description

This module provides ways to calculate the gradient vector of the weights of the neural network.

Backpropagation should always be preferred over the numerical gradient method.


Documentation

backpropagation :: GradientFunction

Calculate the analytical gradients of the weights of the network using backpropagation.
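To illustrate the idea behind the analytical gradient, here is a minimal sketch of backpropagation for a single sigmoid neuron with squared error. The names (`sigmoid`, `neuronGrad`) are illustrative only and not part of the HaskellNN API; the library's `GradientFunction` generalizes this to full networks.

```haskell
-- Illustrative sketch only; not the HaskellNN implementation.
sigmoid :: Double -> Double
sigmoid z = 1 / (1 + exp (-z))

-- Analytical gradient of E = 0.5 * (y - t)^2 with respect to each
-- weight of a single sigmoid neuron, via the chain rule:
--   dE/dw_j = (y - t) * y * (1 - y) * x_j
neuronGrad :: [Double]  -- ^ weights
           -> [Double]  -- ^ inputs
           -> Double    -- ^ target
           -> [Double]  -- ^ gradient, one component per weight
neuronGrad ws xs t =
  let y     = sigmoid (sum (zipWith (*) ws xs))
      delta = (y - t) * y * (1 - y)  -- error signal from one backward pass
  in map (delta *) xs
```

A single forward pass computes `y`, and a single backward pass computes `delta` and hence every component of the gradient at once, which is why backpropagation is cheap compared to the numerical method below.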

numericalGradients

Arguments

:: Double

The epsilon

-> GradientFunction

Returns a gradient function that calculates the numerical gradients of the weights

NOTE: This should only be used as a last resort if, for some reason (bugs?), the backpropagation algorithm does not give you good gradients.

The numerical algorithm requires two forward propagations for every weight, while backpropagation requires only a single forward and backward pass for the whole network, so this is far more costly.

Also, analytical gradients are almost always more accurate than numerical gradients.

The user must provide an epsilon value; use a very small epsilon for more accurate gradients.
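The role of epsilon can be sketched with the standard central-difference formula. The function below is an illustration only, assuming a cost function over a flat list of weights; `numericalGrad`, `cost`, and `bump` are hypothetical names, not the HaskellNN API.

```haskell
-- Illustrative central-difference numerical gradient; not the
-- HaskellNN implementation. For each weight w_i, the i-th gradient
-- component is approximated as
--   (cost(w + eps * e_i) - cost(w - eps * e_i)) / (2 * eps)
-- which needs two cost evaluations (forward propagations) per weight.
numericalGrad :: Double               -- ^ epsilon
              -> ([Double] -> Double) -- ^ cost as a function of the weights
              -> [Double]             -- ^ current weights
              -> [Double]             -- ^ approximate gradient
numericalGrad eps cost ws =
  [ (cost (bump i eps) - cost (bump i (-eps))) / (2 * eps)
  | i <- [0 .. length ws - 1] ]
  where
    -- Perturb only the i-th weight by d.
    bump i d = [ if j == i then w + d else w | (j, w) <- zip [0 ..] ws ]
```

For example, with `cost = sum . map (^2)` the true gradient at `[1, 2, 3]` is `[2, 4, 6]`, and `numericalGrad 1e-6 (sum . map (^2)) [1, 2, 3]` recovers it up to floating-point error. A smaller epsilon reduces the truncation error of the approximation, but an epsilon that is too small amplifies floating-point round-off, which is why the choice matters.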