deeplearning-hs-0.1.0.1: Deep Learning in Haskell

Portability: POSIX
Stability: experimental
Maintainer: andrew+cabal@tullo.ch
Safe Haskell: None

DeepLearning.ConvNet

Documentation

(>->) :: (Monad m, Shape sh, Shape sh', Shape sh'') => Forward m sh sh' -> Forward m sh' sh'' -> Forward m sh sh''

(>->) composes two forward activation functions.
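
A minimal sketch of chaining two passes with (>->); hiddenPass and outputPass are placeholder names, not part of this module, and DIM1/DIM2 come from Data.Array.Repa:

  import Data.Array.Repa (DIM1, DIM2)

  -- Chain a DIM2 -> DIM1 pass into a DIM1 -> DIM1 pass; the intermediate
  -- shapes must line up, and the activation log is threaded through both.
  stacked :: Monad m
          => Forward m DIM2 DIM1 -> Forward m DIM1 DIM1 -> Forward m DIM2 DIM1
  stacked hiddenPass outputPass = hiddenPass >-> outputPass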

type DVol sh = Array D sh Double

Delayed activation matrix.

type Forward m sh sh' = Vol sh -> WriterT [Vector Double] m (DVol sh')

The Forward function represents a single forward pass through a layer.
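
To illustrate the shape of a Forward function, here is a hedged sketch of a trivial identity pass; that the logged Vector is repa's unboxed Vector is an assumption:

  import Control.Monad.Writer (tell)
  import Data.Array.Repa (Shape, delay, toUnboxed)

  -- An identity "layer": record the flattened input in the activation log,
  -- then return the input unchanged as a delayed array.
  identityForward :: (Monad m, Shape sh) => Forward m sh sh
  identityForward v = do
    tell [toUnboxed v]  -- assumes [Vector Double] holds unboxed vectors
    return (delay v)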

class (Shape sh, Shape sh') => InnerLayer a sh sh' | a -> sh, a -> sh'

InnerLayer represents an inner layer of a neural network that can accept backpropagation input from higher layers.

data SoftMaxLayer

SoftMaxLayer computes the softmax activation function.

Constructors

SoftMaxLayer 
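
For reference, the softmax function this layer is named after; this standalone definition is illustrative and is not the package's internal code:

  -- softmax(x)_i = exp(x_i) / sum_j exp(x_j), shifted by the maximum
  -- for numerical stability.
  softmax :: [Double] -> [Double]
  softmax xs = map (/ total) exps
    where
      m     = maximum xs
      exps  = map (exp . subtract m) xs
      total = sum exps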

class TopLayer a

TopLayer is a top-level layer that can initialize a backpropagation pass.

type Vol sh = Array U sh Double

Activation matrix.

flowNetwork :: (Monad m, Shape sh) => sh -> Int -> Int -> Int -> Forward m sh DIM1

flowNetwork builds a network of the form:

  Input Layer              Output Softmax
     +--+
     |  |   Inner Layers    +--+   +--+
     |  |                   |  |   |  |
     |  |   +-+   +-+  +-+  |  |   |  |
     |  +---+ +---+ +--+ +--+  +--->  |
     |  |   +-+   +-+  +-+  |  |   |  |
     |  |                   |  |   |  |
     |  |                   +--+   +--+
     +--+
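
A hedged usage sketch; the meaning of the three Int arguments is not documented above, so the reading below (inner-layer width, inner-layer count, output size) is an assumption:

  import Data.Array.Repa (DIM1, DIM2, ix2)

  -- Build a pass from 28x28 inputs to a 10-way softmax output (assumed).
  digits :: Monad m => Forward m DIM2 DIM1
  digits = flowNetwork (ix2 28 28) 50 2 10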

net1 :: (Monad m, InnerLayer a sh DIM1, TopLayer a1) => a -> a1 -> Forward m sh DIM1

net1 constructs a single-layer fully connected perceptron with softmax output.
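
A sketch, assuming FullyConnectedLayer is the InnerLayer instance and SoftMaxLayer the TopLayer instance this module suggests:

  import Data.Array.Repa (DIM1, ix1)

  -- Four inputs into a (presumed) three-way softmax output.
  perceptron :: Monad m => Forward m DIM1 DIM1
  perceptron = net1 (newFC (ix1 4) 3) SoftMaxLayer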

net2 :: (Monad m, InnerLayer a sh sh', InnerLayer a1 sh' DIM1, TopLayer a2) => a -> a1 -> a2 -> Forward m sh DIM1

net2 constructs a two-layer fully connected MLP with softmax output.
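
A similar sketch for the two-layer case; that the first layer's Int sets the hidden width consumed by the second layer is an assumption:

  import Data.Array.Repa (DIM1, ix1)

  -- Four inputs, a hidden layer of width 8, and three softmax outputs.
  mlp :: Monad m => Forward m DIM1 DIM1
  mlp = net2 (newFC (ix1 4) 8) (newFC (ix1 8) 3) SoftMaxLayer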

newFC :: Shape sh => sh -> Int -> FullyConnectedLayer sh

newFC constructs a new fully connected layer.
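
A hedged example; that the Int argument is the number of output neurons is an assumption, since it is not documented above:

  import Data.Array.Repa (DIM2, ix2)

  -- A fully connected layer over 28x28 inputs with 10 outputs (assumed).
  fc :: FullyConnectedLayer DIM2
  fc = newFC (ix2 28 28) 10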

withActivations :: Forward m sh sh' -> Vol sh -> m (DVol sh', [Vector Double])

withActivations computes the final output activation, along with the intermediate activations logged by each layer.
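
A sketch of running a pass in the Identity monad and inspecting the activation log; the layer sizes are placeholders and the instance assumptions from the net1 example above apply:

  import Data.Array.Repa (DIM1, fromListUnboxed, ix1)
  import Data.Functor.Identity (runIdentity)
  import Data.Vector.Unboxed (Vector)

  -- Run a tiny single-layer net on one input volume, yielding the delayed
  -- output and the per-layer activations recorded by the Writer log.
  demo :: (DVol DIM1, [Vector Double])
  demo = runIdentity (withActivations net input)
    where
      net   = net1 (newFC (ix1 4) 3) SoftMaxLayer
      input = fromListUnboxed (ix1 4) [0.1, 0.2, 0.3, 0.4]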