inf-backprop: Automatic differentiation and backpropagation.
We do not use a gradient tape.
Instead, the differentiation operator is defined directly as a map between differentiable function objects.
Such functions are combined in arrow style using (>>>), (***), first, etc.
The package was originally developed as an automatic backpropagation differentiation component for a functional, type-dependent deep machine learning library. See the tutorial for details.
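The idea can be illustrated with a minimal self-contained sketch. This is not the package's actual API (its own types and operator names may differ; see the module documentation and tutorial); it only shows how a differentiable function object, pairing a forward pass with a pullback, composes in arrow style with (>>>):

```haskell
module Main where

import Prelude hiding (id, (.))
import Control.Category (Category (..), (>>>))

-- A toy differentiable function object: the forward value together with
-- a pullback mapping an output cotangent back to an input cotangent.
newtype Diff a b = Diff { runDiff :: a -> (b, b -> a) }

instance Category Diff where
  id = Diff (\x -> (x, \dy -> dy))
  Diff g . Diff f = Diff $ \x ->
    let (y, pullbackF) = f x
        (z, pullbackG) = g y
    in  (z, pullbackF . pullbackG)  -- chain rule: compose the pullbacks

-- Two primitive differentiable functions.
dSin :: Diff Double Double
dSin = Diff $ \x -> (sin x, \dy -> cos x * dy)

dSquare :: Diff Double Double
dSquare = Diff $ \x -> (x * x, \dy -> 2 * x * dy)

-- Arrow-style composition: h(x) = (sin x)^2.
h :: Diff Double Double
h = dSin >>> dSquare

main :: IO ()
main = do
  let (y, pullback) = runDiff h 1.0
  print y               -- value: (sin 1)^2
  print (pullback 1.0)  -- derivative: 2 * sin 1 * cos 1
```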
Modules
Downloads
- inf-backprop-0.1.0.2.tar.gz (Cabal source package)
- Package description (as included in the package)
| Versions | 0.1.0.0, 0.1.0.1, 0.1.0.2 |
| --- | --- |
| Change log | CHANGELOG.md |
| Dependencies | base (>=4.7 && <5), comonad, isomorphism-class, monad-logger, numhask, simple-expr, text, transformers |
| License | BSD-3-Clause |
| Copyright | 2023 Alexey Tochin |
| Author | Alexey Tochin |
| Maintainer | Alexey.Tochin@gmail.com |
| Category | Mathematics |
| Uploaded | by AlexeyTochin at 2023-05-13T12:43:14Z |
| Distributions | LTSHaskell:0.1.0.2, NixOS:0.1.0.2, Stackage:0.1.0.2 |
| Downloads | 237 total (19 in the last 30 days) |
| Rating | 2.0 (votes: 1) [estimated by Bayesian average] |
| Status | Docs uploaded by user; build status unknown [no reports yet] |