LambdaNet-0.1.0.1: A configurable and extensible neural network library

Safe Haskell: None
Language: Haskell98

Network.Trainer

Documentation

data BackpropTrainer a Source

Trainer is a typeclass for all trainer types: a trainer takes an instance of itself, a network, and a list of training data, and returns a new network trained on the data.

class Trainer a where
  fit :: (Floating b) => a -> Network b -> [TrainingData b] -> Network b

A BackpropTrainer performs simple backpropagation on a neural network. It can be used as the basis for more complex trainers.

Constructors

BackpropTrainer 

Fields

eta :: a

The learning rate used when updating the network's weights.

cost :: CostFunction a

The cost function used to evaluate the network's performance.

cost' :: CostFunction' a

The derivative of the cost function, used during backpropagation.

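As a rough sketch of how these fields fit together, a trainer can be built directly from the record constructor. The learning rate of 0.3 below is an arbitrary illustrative value, and the cost functions are the quadraticCost/quadraticCost' pair documented further down this page.

import Network.Trainer

-- Sketch: a BackpropTrainer over Double vectors.
-- eta is the learning rate; 0.3 is an arbitrary choice for illustration.
trainer :: BackpropTrainer Double
trainer = BackpropTrainer
  { eta   = 0.3
  , cost  = quadraticCost
  , cost' = quadraticCost'
  }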
type CostFunction a = Vector a -> Vector a -> a Source

A CostFunction is used for evaluating a network's performance on a given input.

type CostFunction' a = Vector a -> Vector a -> Vector a Source

A CostFunction' (the derivative of a CostFunction) is used in backpropagation.
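To illustrate the two types, here is a hypothetical cost function (not part of the library) and its derivative, written against CostFunction and CostFunction'. It assumes hmatrix's Num instance for Vector and its sumElements function; absCost and absCost' are invented names.

import Network.Trainer (CostFunction, CostFunction')
import Numeric.LinearAlgebra (sumElements)

-- Hypothetical absolute (L1) cost between the expected output y and
-- the network's activation a, and its derivative with respect to a.
absCost :: CostFunction Double
absCost y a = sumElements (abs (y - a))

absCost' :: CostFunction' Double
absCost' y a = signum (a - y)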

type Selection a = [TrainingData a] -> [[TrainingData a]] Source

A Selection splits the training data into the batches used for gradient descent; see the sketch below.
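For example, a hypothetical Selection (not provided by the library) that treats the whole training set as a single batch would be:

import Network.Trainer (Selection)

-- Full-batch gradient descent: every training example ends up in one batch.
fullBatch :: Selection Double
fullBatch trainingData = [trainingData]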

quadraticCost :: (Floating (Vector a), Container Vector a) => Vector a -> Vector a -> a Source

The quadratic cost function: (1/2) * sum (y - a) ^ 2.
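A small worked example of the formula above, with arbitrarily chosen values: for a target of [1, 0] and an activation of [0.8, 0.2], the cost is (1/2) * ((1 - 0.8)^2 + (0 - 0.2)^2) = 0.04. The sketch assumes hmatrix's fromList for building the vectors.

import Network.Trainer (quadraticCost)
import Numeric.LinearAlgebra (fromList)

-- (1/2) * ((1 - 0.8)^2 + (0 - 0.2)^2) = 0.04
exampleCost :: Double
exampleCost = quadraticCost (fromList [1, 0]) (fromList [0.8, 0.2])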

quadraticCost' :: Floating (Vector a) => Vector a -> Vector a -> Vector a Source

The derivative of the quadratic cost function: sum (y - a).

minibatch :: (Floating (Vector a), Container Vector a) => Int -> [TrainingData a] -> [[TrainingData a]] Source

The minibatch function becomes a Selection when partially applied with the minibatch size.
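For instance, partially applying minibatch with a batch size (32 here, an arbitrary choice; batches is an invented name) yields a Selection that can be handed to fit:

import Network.Trainer (Selection, minibatch)

-- Groups the training data into batches of (up to) 32 examples.
batches :: Selection Double
batches = minibatch 32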

online :: (Floating (Vector a), Container Vector a) => [TrainingData a] -> [[TrainingData a]] Source

The Selection used when we want to train the network online, i.e. updating after each individual training example.

backprop :: (Floating (Vector a), Container Vector a, Product a) => BackpropTrainer a -> Network a -> [TrainingData a] -> Network a Source

Perform backpropagation on a single training data instance.

inputs :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Network a -> [Vector a] Source

The inputs function performs a similar task to outputs, but returns a list of vectors of unactivated (pre-activation) inputs.

outputs :: (Floating (Vector a), Container Vector a, Product a) => Vector a -> Network a -> [Vector a] Source

The outputs function scans over each layer of the network and stores the activated results.
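A sketch of how the two forward-pass helpers are typically used side by side; forwardPass is a hypothetical helper, and the Network type is assumed to come from the package's Network.Network module.

import Network.Network (Network)
import Network.Trainer (inputs, outputs)
import Numeric.LinearAlgebra (Vector)

-- Returns the unactivated weighted sums and the activated values for
-- each layer, given a network net and an input vector x.
forwardPass :: Network Double -> Vector Double -> ([Vector Double], [Vector Double])
forwardPass net x = (inputs x net, outputs x net)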

deltas :: (Floating (Vector a), Container Vector a, Product a) => BackpropTrainer a -> Network a -> TrainingData a -> [Vector a] Source

The deltas function returns a list of layer deltas.

fit :: (Floating (Vector a), Container Vector a, Product a) => Selection a -> BackpropTrainer a -> Network a -> [TrainingData a] -> Network a Source

Declares BackpropTrainer to be an instance of Trainer:

instance (Floating a) => Trainer (BackpropTrainer a) where
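Putting the pieces together, a training run might look like the following sketch. The network and training data are assumed to be built elsewhere (with the package's network constructors and your own data loading), train is an invented name, the Network and TrainingData types are assumed to come from Network.Network / Network.Trainer, and the learning rate and batch size are arbitrary illustrative values.

import Network.Network
import Network.Trainer

-- Train a network with simple backpropagation over minibatches of 32.
train :: Network Double -> [TrainingData Double] -> Network Double
train net trainingData = fit (minibatch 32) trainer net trainingData
  where
    trainer = BackpropTrainer { eta = 0.3, cost = quadraticCost, cost' = quadraticCost' }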

evaluate :: (Floating (Vector a), Container Vector a, Product a) => BackpropTrainer a -> Network a -> TrainingData a -> a Source

Uses the trainer's cost function to determine the error of a network on a single piece of training data.
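As a usage sketch, evaluate can be mapped over a held-out test set to report an average error. This assumes each TrainingData value pairs an input with its expected output and that the types are importable as in the sketch above; meanError is an invented name.

import Network.Network
import Network.Trainer

-- Average error of a network over a test set, as judged by the
-- trainer's cost function.
meanError :: BackpropTrainer Double -> Network Double -> [TrainingData Double] -> Double
meanError t net testData =
  sum (map (evaluate t net) testData) / fromIntegral (length testData)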