| Copyright | (c) Lars Brünjes, 2016 |
|---|---|
| License | MIT |
| Maintainer | brunjlar@gmail.com |
| Stability | experimental |
| Portability | portable |
| Safe Haskell | None |
| Language | Haskell2010 |
| Extensions |
Numeric.Neural.Layer
Description
This module defines special "layer" components and convenience functions for the creation of such layers.
- type Layer i o = Component (Vector i) (Vector o)
- linearLayer :: forall i o. (KnownNat i, KnownNat o) => Layer i o
- layer :: (KnownNat i, KnownNat o) => Diff' -> Layer i o
- tanhLayer :: (KnownNat i, KnownNat o) => Layer i o
- logisticLayer :: (KnownNat i, KnownNat o) => Layer i o
- reLULayer :: (KnownNat i, KnownNat o) => Layer i o
- softmax :: (Floating a, Functor f, Foldable f) => f a -> f a
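The `softmax` signature above is the only one in this list that is not layer-valued; it is a plain function on any `Foldable` `Functor`. A minimal sketch matching that signature (the package's actual implementation may differ, e.g. by shifting inputs for numerical stability) exponentiates each component and normalizes by the sum:

```haskell
-- Sketch of softmax with the signature from the synopsis above.
-- Each output component is exp x divided by the sum of all exp's,
-- so the outputs are positive and sum to one.
softmax :: (Floating a, Functor f, Foldable f) => f a -> f a
softmax xs =
    let exps = fmap exp xs   -- exponentiate every component
        s    = sum exps      -- normalizing constant
    in  fmap (/ s) exps
```

For example, `softmax [1, 2, 3] :: [Double]` yields a list of three positive numbers summing to 1, with the largest weight on the last component.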
Documentation
linearLayer :: forall i o. (KnownNat i, KnownNat o) => Layer i o Source #
Creates a linear Layer, i.e. a layer that multiplies the input by a weight Matrix and adds a bias to get the output.
Random initialization follows the recommendation from chapter 3 of the online book Neural Networks and Deep Learning by Michael Nielsen.
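As an illustration of the computation such a layer performs (not the library's internal representation, which uses length-indexed `Vector`s; the name `linearApply` is hypothetical), the output is the matrix-vector product of the weights and the input, plus the bias. Nielsen's chapter 3 recommendation, roughly, is to draw each initial weight from a normal distribution scaled down by the square root of the number of inputs, which keeps early activations from saturating:

```haskell
-- Illustration only: the forward computation of a linear layer,
-- using plain lists in place of the library's sized Vector type.
-- Output component j is bias_j + sum_i (weights_j_i * input_i).
linearApply :: Num a => [[a]] -> [a] -> [a] -> [a]
linearApply weights bias input =
    zipWith (+) bias [sum (zipWith (*) row input) | row <- weights]
```

For example, with weights `[[1,2],[3,4]]`, bias `[10,20]`, and input `[1,1]`, the output is `[13, 27]`.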
layer :: (KnownNat i, KnownNat o) => Diff' -> Layer i o Source #
Creates a Layer as a combination of a linear layer and a non-linear activation function.
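Conceptually, the resulting layer applies the linear part first and then maps the activation function over every component of the output. A self-contained sketch with plain lists (the names `layerApply`, `weights`, and `bias` are hypothetical; the library's `Diff'` additionally carries the activation's derivative for backpropagation):

```haskell
-- Sketch: a layer = linear transformation followed by a
-- pointwise non-linear activation function.
layerApply :: Floating a
           => (a -> a)   -- activation, applied componentwise
           -> [[a]]      -- weight matrix, one row per output
           -> [a]        -- bias vector
           -> [a]        -- input vector
           -> [a]        -- output vector
layerApply activation weights bias input =
    map activation
        (zipWith (+) bias [sum (zipWith (*) row input) | row <- weights])
```

With `tanh` as the activation this mirrors `tanhLayer`; with the identity function it degenerates to the linear layer itself.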
logisticLayer :: (KnownNat i, KnownNat o) => Layer i o Source #
This is a simple Layer, specialized to the logistic function as activation. Output values are all in the interval [0,1].
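The logistic (sigmoid) function used as the activation here is the standard one, which maps every real input into (0,1):

```haskell
-- The logistic function: sigma(x) = 1 / (1 + e^(-x)).
-- It is 0.5 at x = 0, approaches 1 for large x, and 0 for very
-- negative x, so all layer outputs land in the interval (0,1).
logistic :: Floating a => a -> a
logistic x = 1 / (1 + exp (negate x))
```

For example, `logistic 0` is `0.5`, and `logistic 10` is very close to `1`.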