Copyright | (c) Tom Harding 2020
---|---
License | MIT
Safe Haskell | None
Language | Haskell2010
This module includes almost everything you'd need to build a constraint-solving computation. The module uses the Holmes solver, but you may want to use the functions in the Control.Monad.Watson module to avoid executing your code in IO.
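As a rough quick-start sketch (not taken from the documentation below, and assuming the from method of the Input class together with the usual Intersect instances), a complete program looks something like this:

```haskell
import Data.Holmes

-- A minimal sketch: three variables, each drawn from the digits 1..9,
-- constrained only to be pairwise distinct.
threeDistinct :: IO (Maybe [Intersect Int])
threeDistinct = (3 `from` [1 .. 9]) `satisfying` distinct
```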
Synopsis
- data Holmes (x :: Type)
- class Monad m => MonadCell (m :: Type -> Type)
- forward :: (Typeable x, Merge x, Merge y) => (forall m. MonadCell m => Prop m x -> Prop m y) -> x -> Maybe y
- backward :: (Typeable x, Merge x, Merge y) => (forall m. MonadCell m => Prop m x -> Prop m y) -> y -> Maybe x
- satisfying :: (EqR x b, Typeable x) => Config Holmes x -> (forall m. MonadCell m => [Prop m x] -> Prop m b) -> IO (Maybe [x])
- shuffle :: Config Holmes x -> Config Holmes x
- whenever :: (EqR x b, Typeable x) => Config Holmes x -> (forall m. MonadCell m => [Prop m x] -> Prop m b) -> IO [[x]]
- data Config (m :: Type -> Type) (x :: Type) = Config {}
- class Input (x :: Type) where
- permute :: (Applicative m, Eq x, Hashable x) => Config m x -> m (HashSet [x])
- class Merge x => AbsR (x :: Type) where
- absR :: (x, x) -> (x, x)
- class Merge x => BooleanR (x :: Type) where
- class (BooleanR b, Merge x) => EqR (x :: Type) (b :: Type) | x -> b where
- eqR :: (x, x, b) -> (x, x, b)
- neR :: EqR x b => (x, x, b) -> (x, x, b)
- class Zipping f c => FlatMapping (f :: Type -> Type) (c :: Type -> Constraint) | f -> c where
- flatMapR :: (c x, c y) => ((x, f y) -> (x, f y)) -> (f x, f y) -> (f x, f y)
- class SumR x => FractionalR (x :: Type) where
- multiplyR :: (x, x, x) -> (x, x, x)
- class SumR x => IntegralR (x :: Type) where
- divModR :: (x, x, x, x) -> (x, x, x, x)
- class (forall x. c x => Merge (f x)) => Mapping (f :: Type -> Type) (c :: Type -> Constraint) | f -> c where
- mapR :: (c x, c y) => ((x, y) -> (x, y)) -> (f x, f y) -> (f x, f y)
- class EqR x b => OrdR (x :: Type) (b :: Type) | x -> b where
- lteR :: (x, x, b) -> (x, x, b)
- ltR :: OrdR x b => (x, x, b) -> (x, x, b)
- gtR :: OrdR x b => (x, x, b) -> (x, x, b)
- gteR :: OrdR x b => (x, x, b) -> (x, x, b)
- class Merge x => SumR (x :: Type) where
- addR :: (x, x, x) -> (x, x, x)
- negateR :: (Num x, SumR x) => (x, x) -> (x, x)
- subR :: SumR x => (x, x, x) -> (x, x, x)
- class Mapping f c => Zipping (f :: Type -> Type) (c :: Type -> Constraint) | f -> c where
- zipWithR :: (c x, c y, c z) => ((x, y, z) -> (x, y, z)) -> (f x, f y, f z) -> (f x, f y, f z)
- class Monoid x => Merge (x :: Type) where
- data Result (x :: Type)
- data Defined (x :: Type)
- newtype Intersect (x :: Type) = Intersect {}
- using :: (Applicative m, Intersectable x) => [Intersect x] -> Config m (Intersect x)
- data Prop (m :: Type -> Type) (content :: Type)
- (.$) :: (Mapping f c, c x, c y) => (x -> y) -> Prop m (f x) -> Prop m (f y)
- (.>>=) :: (FlatMapping f c, c x, c y) => Prop m (f x) -> (x -> f y) -> Prop m (f y)
- zipWith' :: (Zipping f c, c x, c y, c z) => ((x, y, z) -> (x, y, z)) -> Prop m (f x) -> Prop m (f y) -> Prop m (f z)
- (.&&) :: BooleanR b => Prop m b -> Prop m b -> Prop m b
- all' :: (BooleanR b, MonadCell m) => (x -> Prop m b) -> [x] -> Prop m b
- allWithIndex' :: (BooleanR b, MonadCell m) => (Int -> x -> Prop m b) -> [x] -> Prop m b
- and' :: (BooleanR b, MonadCell m) => [Prop m b] -> Prop m b
- (.||) :: BooleanR b => Prop m b -> Prop m b -> Prop m b
- any' :: (BooleanR b, MonadCell m) => (x -> Prop m b) -> [x] -> Prop m b
- anyWithIndex' :: (BooleanR b, MonadCell m) => (Int -> x -> Prop m b) -> [x] -> Prop m b
- or' :: (BooleanR b, MonadCell m) => [Prop m b] -> Prop m b
- not' :: (BooleanR b, MonadCell m) => Prop m b -> Prop m b
- false :: (BooleanR b, MonadCell m) => Prop m b
- true :: (BooleanR b, MonadCell m) => Prop m b
- (.*) :: (FractionalR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x
- (./) :: (FractionalR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x
- (.+) :: (SumR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x
- (.-) :: (SumR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x
- (.<) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b
- (.<=) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b
- (.>) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b
- (.>=) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b
- (.==) :: (EqR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b
- (./=) :: (EqR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b
- distinct :: (EqR x b, MonadCell m) => [Prop m x] -> Prop m b
- (.%.) :: (IntegralR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x
- (.*.) :: (Num x, IntegralR x) => Prop m x -> Prop m x -> Prop m x
- (./.) :: (IntegralR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x
- abs' :: (AbsR x, MonadCell m) => Prop m x -> Prop m x
- negate' :: (Num x, SumR x, MonadCell m) => Prop m x -> Prop m x
- recip' :: (Num x, FractionalR x, MonadCell m) => Prop m x -> Prop m x
Documentation
data Holmes (x :: Type) Source #
A monad capable of solving constraint problems using IO as the evaluation type. Cells are represented using IORef references, and provenance is tracked to optimise backtracking search across multiple branches.
class Monad m => MonadCell (m :: Type -> Type) Source #
The DSL for network construction primitives. The following interface provides the building blocks upon which the rest of the library is constructed.
If you are looking to implement the class yourself, you should note the lack of functionality for ambiguity/searching. This is deliberate: for backtracking search (as opposed to truth maintenance-based approaches), the ability to create computation branches dynamically makes it much harder to establish a reliable mechanism for tracking the effects of these choices.
For example: the approach used in the MoriarT implementation is to separate the introduction of ambiguity into one definite, explicit step, and all parameters must be declared ahead of time so that they can be assigned indices. Other implementations should feel free to take other approaches, but these will be implementation-specific.
Instances
MonadCell Holmes Source # | |
Defined in Control.Monad.Holmes | |
MonadCell (Watson h) Source # | |
Defined in Control.Monad.Watson |
forward :: (Typeable x, Merge x, Merge y) => (forall m. MonadCell m => Prop m x -> Prop m y) -> x -> Maybe y Source #
Run a function between propagators with a raw value, writing the given value to the "input" cell and reading the result from the "output" cell.
backward :: (Typeable x, Merge x, Merge y) => (forall m. MonadCell m => Prop m x -> Prop m y) -> y -> Maybe x Source #
Run a function between propagators "backwards", writing the given value as the output and then trying to push information backwards to the input cell.
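As a sketch of how these two fit together (assuming Prop's numeric-literal instances; otherwise the constants would need lifting into the network explicitly), a conversion written once as a propagator function runs in both directions:

```haskell
-- A sketch: one relationship between Celsius and Fahrenheit, two directions.
celsius2fahrenheit
  :: MonadCell m => Prop m (Defined Double) -> Prop m (Defined Double)
celsius2fahrenheit c = c .* (9 ./ 5) .+ 32

-- forward  celsius2fahrenheit 40   -- plausibly Just 104.0
-- backward celsius2fahrenheit 104  -- plausibly Just 40.0
```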
satisfying :: (EqR x b, Typeable x) => Config Holmes x -> (forall m. MonadCell m => [Prop m x] -> Prop m b) -> IO (Maybe [x]) Source #
Given an input configuration, and a predicate on those input variables, return the first configuration that satisfies the predicate.
shuffle :: Config Holmes x -> Config Holmes x Source #
Shuffle the refinements in a configuration. If we make a configuration like 100 from [1 .. 10], the first configuration will be one hundred 1 values. Sometimes, we might find we get to a first solution faster by randomising the order in which refinements are given. This is similar to the "random restart" strategy in hill-climbing problems.
Another nice use for this function is procedural generation: often, your results will look more "natural" if you introduce an element of randomness.
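Because shuffle just wraps a Config, randomising a search is a one-word change. A sketch (again assuming from and the usual Intersect instances):

```haskell
-- A sketch: ten values from 1..10, pairwise distinct -- i.e. a permutation
-- of [1 .. 10]. Shuffling the refinements means a different-looking
-- permutation tends to come out first.
randomPermutation :: IO (Maybe [Intersect Int])
randomPermutation = shuffle (10 `from` [1 .. 10]) `satisfying` distinct
```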
whenever :: (EqR x b, Typeable x) => Config Holmes x -> (forall m. MonadCell m => [Prop m x] -> Prop m b) -> IO [[x]] Source #
Given an input configuration, and a predicate on those input variables, return all configurations that satisfy the predicate. It should be noted that there's nothing lazy about this; if your problem has a lot of solutions, or your search space is very big, you'll be waiting a long time!
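A sketch with a deliberately tiny search space, since whenever enumerates every solution:

```haskell
-- A sketch: all ordered pairs of distinct values drawn from 1..3.
allDistinctPairs :: IO [[Intersect Int]]
allDistinctPairs = (2 `from` [1 .. 3]) `whenever` distinct
```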
data Config (m :: Type -> Type) (x :: Type) Source #
An input configuration.
This stores both an initial configuration of input parameters, as well as a function that can look for ways to refine an input. In other words, if the initial value is an Intersect of [1 .. 5], the refinements might be the singleton values of every remaining possibility.
class Input (x :: Type) where Source #
The simplest way of generating an input configuration is to say that a problem has m variables that will all be one of n possible values. For example, a sudoku board is 81 variables of 9 possible values. This class allows us to generate these simple input configurations like a game of countdown: "81 from [1 .. 9], please, Carol!"
Different parameter types will have different representations for their values. The Raw type means that I can say 81 from [1 .. 9], and have the parameter type determine how it will represent 1, for example. It's a little bit of syntactic sugar for the benefit of the user, so they don't need to know as much about how the parameter types work to use the library.
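A sketch of this "countdown" style for two common parameter types (assuming the Defined and Intersect instances of Input):

```haskell
-- Sketches of "81 variables from these raw digits". The same raw values can
-- back exact-knowledge cells (Defined) or candidate-set cells (Intersect).
definites :: Config Holmes (Defined Int)
definites = 81 `from` [1 .. 9]

candidates :: Config Holmes (Intersect Int)
candidates = 81 `from` [1 .. 9]
```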
class Merge x => AbsR (x :: Type) where Source #
Unlike the abs
we know, which is a function from a value to its
absolute value, absR
is a relationship between a value and its absolute.
For some types, while we can't truly reverse the abs
function, we can say
that there are two possible inputs to consider, and so we can push some
information in the reverse direction.
Nothing
absR :: (x, x) -> (x, x) Source #
Given a value and its absolute, try to learn something in either direction.
absR :: Num x => (x, x) -> (x, x) Source #
Given a value and its absolute, try to learn something in either direction.
class Merge x => BooleanR (x :: Type) where Source #
Rather than the not, and, and or functions we know and love, the BooleanR class presents relationships that are analogous to these. The main difference is that relationships are not one-way. For example, if I tell you that the output of x && y is True, you can tell me what the inputs are, even if your computer can't. The implementations of BooleanR should be such that all directions of inference are considered.
falseR :: x Source #
An overloaded False value.
trueR :: x Source #
An overloaded True value.
notR :: (x, x) -> (x, x) Source #
A relationship between a boolean value and its opposite.
andR :: (x, x, x) -> (x, x, x) Source #
A relationship between two boolean values and their conjunction.
orR :: (x, x, x) -> (x, x, x) Source #
A relationship between two boolean values and their disjunction.
Instances
BooleanR (Defined Bool) Source # | |
Defined in Data.JoinSemilattice.Class.Boolean falseR :: Defined Bool Source # trueR :: Defined Bool Source # notR :: (Defined Bool, Defined Bool) -> (Defined Bool, Defined Bool) Source # andR :: (Defined Bool, Defined Bool, Defined Bool) -> (Defined Bool, Defined Bool, Defined Bool) Source # orR :: (Defined Bool, Defined Bool, Defined Bool) -> (Defined Bool, Defined Bool, Defined Bool) Source # | |
BooleanR (Intersect Bool) Source # | |
Defined in Data.JoinSemilattice.Class.Boolean falseR :: Intersect Bool Source # trueR :: Intersect Bool Source # notR :: (Intersect Bool, Intersect Bool) -> (Intersect Bool, Intersect Bool) Source # andR :: (Intersect Bool, Intersect Bool, Intersect Bool) -> (Intersect Bool, Intersect Bool, Intersect Bool) Source # orR :: (Intersect Bool, Intersect Bool, Intersect Bool) -> (Intersect Bool, Intersect Bool, Intersect Bool) Source # |
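A sketch of that reverse flow, using backward from above together with the trueR value (the exact learned value depends on the instance, so treat the result as plausible rather than guaranteed):

```haskell
-- A sketch: if (x .&& true) must be true, the only consistent value for x
-- is true, so information flows back to the input cell.
conjunctionBackwards :: Maybe (Defined Bool)
conjunctionBackwards = backward (\x -> x .&& true) trueR
-- plausibly Just (Exactly True)
```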
class (BooleanR b, Merge x) => EqR (x :: Type) (b :: Type) | x -> b where Source #
Equality between two variables as a relationship between them and their
result. The hope here is that, if we learn the output before the inputs, we
can often "work backwards" to learn something about them. If we know the
result is exactly true, for example, we can effectively then
unify
the two input cells, as we know that their
values will always be the same.
neR :: EqR x b => (x, x, b) -> (x, x, b) Source #
A relationship between two variables and the result of a not-equals comparison between them.
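A sketch of the "work backwards" behaviour, assuming the Defined instance of EqR and Prop's numeric-literal instance for the constant:

```haskell
-- A sketch: if (x .== 5) must be true, then x must be 5, and the value can
-- be read back out of the input cell.
pinnedByEquality :: Maybe (Defined Int)
pinnedByEquality = backward (\x -> x .== 5) trueR
-- plausibly Just (Exactly 5)
```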
class Zipping f c => FlatMapping (f :: Type -> Type) (c :: Type -> Constraint) | f -> c where Source #
Some types, such as Intersect, contain multiple "candidate values". This function allows us to take each candidate, apply a function, and then union all the results. Perhaps fanOut would have been a better name for this function, but we use (>>=) to lend an intuition when we lift this into Prop via (.>>=).
There's not normally much reverse-flow information here, sadly, as it typically requires us to have a way to generate an "empty candidate" a la mempty. It's quite hard to articulate this in a succinct way, but try implementing the reverse flow for Defined or Intersect, and see what happens.
Instances
FlatMapping Defined Eq Source # | |
FlatMapping Intersect Intersectable Source # | |
Defined in Data.JoinSemilattice.Class.FlatMapping |
class SumR x => FractionalR (x :: Type) where Source #
Reversible (fractional or floating-point) multiplication as a three-value relationship between two values and their product.
Nothing
multiplyR :: (x, x, x) -> (x, x, x) Source #
multiplyR :: Fractional x => (x, x, x) -> (x, x, x) Source #
Instances
(Eq x, Fractional x) => FractionalR (Defined x) Source # | |
(Bounded x, Enum x, Eq x, Fractional x, Hashable x) => FractionalR (Intersect x) Source # | |
class SumR x => IntegralR (x :: Type) where Source #
A four-way divMod
relationship between two values, the result of
integral division, and the result of the first modulo the second.
class (forall x. c x => Merge (f x)) => Mapping (f :: Type -> Type) (c :: Type -> Constraint) | f -> c where Source #
Lift a relationship between two values over some type constructor. Typically, this type constructor will be the parameter type.
Nothing
mapR :: (c x, c y) => ((x, y) -> (x, y)) -> (f x, f y) -> (f x, f y) Source #
mapR :: Applicative f => ((x, y) -> (x, y)) -> (f x, f y) -> (f x, f y) Source #
Instances
Mapping Defined Eq Source # | |
Mapping Intersect Intersectable Source # | |
Defined in Data.JoinSemilattice.Class.Mapping mapR :: (Intersectable x, Intersectable y) => ((x, y) -> (x, y)) -> (Intersect x, Intersect y) -> (Intersect x, Intersect y) Source # |
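At the Prop level this lifting is exposed as (.$) (see the synopsis above). A sketch over Intersect:

```haskell
-- A sketch: apply succ to every candidate held in an Intersect cell.
shiftCandidates
  :: MonadCell m => Prop m (Intersect Int) -> Prop m (Intersect Int)
shiftCandidates xs = succ .$ xs
```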
class EqR x b => OrdR (x :: Type) (b :: Type) | x -> b where Source #
Comparison relationships between two values and their comparison result.
lteR :: (x, x, b) -> (x, x, b) Source #
A relationship between two values and whether the left is less than or equal to the right.
ltR :: OrdR x b => (x, x, b) -> (x, x, b) Source #
Comparison between two values and their '(<)' result.
gtR :: OrdR x b => (x, x, b) -> (x, x, b) Source #
Comparison between two values and their '(>)' result.
gteR :: OrdR x b => (x, x, b) -> (x, x, b) Source #
Comparison between two values and their '(>=)' result.
class Merge x => SumR (x :: Type) where Source #
A relationship between two values and their sum.
Nothing
negateR :: (Num x, SumR x) => (x, x) -> (x, x) Source #
A relationship between a value and its negation.
subR :: SumR x => (x, x, x) -> (x, x, x) Source #
A relationship between two values and their difference.
class Mapping f c => Zipping (f :: Type -> Type) (c :: Type -> Constraint) | f -> c where Source #
Lift a relationship between three values over some f
(usually a
parameter type).
Nothing
zipWithR :: (c x, c y, c z) => ((x, y, z) -> (x, y, z)) -> (f x, f y, f z) -> (f x, f y, f z) Source #
zipWithR :: Applicative f => ((x, y, z) -> (x, y, z)) -> (f x, f y, f z) -> (f x, f y, f z) Source #
Instances
Zipping Defined Eq Source # | |
Zipping Intersect Intersectable Source # | |
Defined in Data.JoinSemilattice.Class.Zipping zipWithR :: (Intersectable x, Intersectable y, Intersectable z) => ((x, y, z) -> (x, y, z)) -> (Intersect x, Intersect y, Intersect z) -> (Intersect x, Intersect y, Intersect z) Source # |
class Monoid x => Merge (x :: Type) where Source #
Join semilattice '(<>)' specialised for propagator network needs. Allows types to implement the notion of "knowledge combination".
(<<-) :: x -> x -> Result x Source #
Merge the news (right) into the current value (left), returning an instruction on how to update the network.
data Result (x :: Type) Source #
The result of merging some news into a cell's current knowledge.
Unchanged | We've learnt nothing; no updates elsewhere are needed. |
Changed x | We've learnt something; fire the propagators! |
Failure | We've hit a failure state; discard the computation. |
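A sketch of a hand-rolled knowledge type and its Merge instance, using the Result constructors above. The type is purely illustrative and not part of the library:

```haskell
-- A sketch: a "largest value seen so far" lattice. Merging news either
-- raises the maximum (Changed) or tells us nothing new (Unchanged). This
-- lattice has no contradictory states, so Failure is never needed.
newtype Largest = Largest Int
  deriving (Eq, Show)

instance Semigroup Largest where
  Largest x <> Largest y = Largest (max x y)

instance Monoid Largest where
  mempty = Largest minBound

instance Merge Largest where
  old <<- new
    | old <> new == old = Unchanged
    | otherwise         = Changed (old <> new)
```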
data Defined (x :: Type) Source #
Defines simple "levels of knowledge" about a value.
Unknown | Nothing has told me what this value is. |
Exactly x | Everyone who has told me this value agrees. |
Conflict | Two sources disagree on what this value should be. |
Instances
newtype Intersect (x :: Type) Source #
A set type with intersection as the '(<>)' operation.
Instances
using :: (Applicative m, Intersectable x) => [Intersect x] -> Config m (Intersect x) Source #
data Prop (m :: Type -> Type) (content :: Type) Source #
A propagator network with a "focus" on a particular cell. The focus is the cell that typically holds the result we're trying to compute.
(.>>=) :: (FlatMapping f c, c x, c y) => Prop m (f x) -> (x -> f y) -> Prop m (f y) Source #
Produce a network in which the raw values of a given network are used to produce new parameter types. See the "wave function collapse" demo for an example usage.
zipWith' :: (Zipping f c, c x, c y, c z) => ((x, y, z) -> (x, y, z)) -> Prop m (f x) -> Prop m (f y) -> Prop m (f z) Source #
Lift a three-way relationship over two propagator networks' foci to produce a third propagator network with a focus on the third value in the relationship.
... It's liftA2
for propagators.
(.&&) :: BooleanR b => Prop m b -> Prop m b -> Prop m b infixr 3 Source #
Different parameter types come with different representations for Bool. This function takes two propagator networks focusing on boolean values, and produces a new network in which the focus is the conjunction of the two values.
It's a lot of words, but the intuition is, "'(&&)' over propagators".
all' :: (BooleanR b, MonadCell m) => (x -> Prop m b) -> [x] -> Prop m b Source #
Run a predicate on all values in a list, producing a list of propagator networks focusing on boolean values. Then, produce a new network with a focus on the conjunction of all these values.
In other words, "all
over propagators".
allWithIndex' :: (BooleanR b, MonadCell m) => (Int -> x -> Prop m b) -> [x] -> Prop m b Source #
The same as the all'
function, but with access to the index of the
element within the array. Typically, this is useful when trying to relate
each element to other elements within the array.
For example, cells "surrounding" the current cell in a conceptual "board".
and' :: (BooleanR b, MonadCell m) => [Prop m b] -> Prop m b Source #
Given a list of propagator networks with a focus on boolean values, create a new network with a focus on the conjunction of all these values.
In other words, "and
over propagators".
(.||) :: BooleanR b => Prop m b -> Prop m b -> Prop m b infixr 2 Source #
Calculate the disjunction of two boolean propagator network values.
any' :: (BooleanR b, MonadCell m) => (x -> Prop m b) -> [x] -> Prop m b Source #
Run a predicate on all values in a list, producing a list of propagator networks focusing on boolean values. Then, produce a new network with a focus on the disjunction of all these values.
In other words, "any
over propagators".
anyWithIndex' :: (BooleanR b, MonadCell m) => (Int -> x -> Prop m b) -> [x] -> Prop m b Source #
The same as the any'
function, but with access to the index of the
element within the array. Typically, this is useful when trying to relate
each element to other elements within the array.
For example, cells "surrounding" the current cell in a conceptual "board".
or' :: (BooleanR b, MonadCell m) => [Prop m b] -> Prop m b Source #
Given a list of propagator networks with a focus on boolean values, create a new network with a focus on the disjunction of all these values.
In other words, "or
over propagators".
not' :: (BooleanR b, MonadCell m) => Prop m b -> Prop m b Source #
Given a propagator network with a focus on a boolean value, produce a network with a focus on its negation.
... It's "not
over propagators".
false :: (BooleanR b, MonadCell m) => Prop m b Source #
Different parameter types come with different representations for Bool. This value is a propagator network with a focus on a polymorphic "falsey" value.
true :: (BooleanR b, MonadCell m) => Prop m b Source #
Different parameter types come with different representations for Bool. This value is a propagator network with a focus on a polymorphic "truthy" value.
(.*) :: (FractionalR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x infixl 7 Source #
Given two propagator networks, produce a new network that focuses on the product of the two given networks' foci.
... It's '(*)' lifted over propagator networks. The reverse information flow is fractional division, '(/)'.
(./) :: (FractionalR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x infixl 7 Source #
Given two propagator networks, produce a new network that focuses on the division of the two given networks' foci.
... It's '(/)' lifted over propagator networks.
(.+) :: (SumR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x infixl 6 Source #
Given two propagator networks, produce a new network that focuses on the sum of the two given networks' foci.
... It's '(+)' lifted over propagator networks.
(.-) :: (SumR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x infixl 6 Source #
Given two propagator networks, produce a new network that focuses on the difference between the two given networks' foci.
... It's '(-)' lifted over propagator networks.
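Because these are relationships rather than functions, the same definition runs in reverse. A sketch using backward (assuming the Defined instance of SumR and Prop's numeric-literal instance):

```haskell
-- A sketch: solve x + 5 == 12 by pushing the result backwards through (.+).
solveForX :: Maybe (Defined Int)
solveForX = backward (.+ 5) 12
-- plausibly Just (Exactly 7)
```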
(.<) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b infix 4 Source #
Given two propagator networks, produce a new network that calculates whether the first network's focus be less than the second.
In other words, "it's '(<)' for propagators".
(.<=) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b infix 4 Source #
Given two propagator networks, produce a new network that calculates whether the first network's focus be less than or equal to the second.
In other words, "it's '(<=)' for propagators".
(.>) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b infix 4 Source #
Given two propagator networks, produce a new network that calculates whether the first network's focus be greater than the second.
In other words, "it's '(>)' for propagators".
(.>=) :: (OrdR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b infix 4 Source #
Given two propagator networks, produce a new network that calculates whether the first network's focus be greater than or equal to the second.
In other words, "it's '(>=)' for propagators".
(.==) :: (EqR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b infix 4 Source #
Given two propagator networks, produce a new propagator network with the result of testing the two for equality.
In other words, "it's '(==)' for propagators".
(./=) :: (EqR x b, MonadCell m) => Prop m x -> Prop m x -> Prop m b infix 4 Source #
Given two propagator networks, produce a new propagator network with the result of testing the two for inequality.
In other words, "it's '(/=)' for propagators".
distinct :: (EqR x b, MonadCell m) => [Prop m x] -> Prop m b Source #
Given a list of networks, produce the conjunction of '(./=)' applied to every possible pair. The resulting network's focus is the answer to whether every propagator network's focus is different to the others.
Are all the values in this list distinct?
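A sketch of the usual pattern, combining distinct with all' to constrain several groups at once (the shape of constraint used in sudoku-style problems):

```haskell
-- A sketch: every row of a conceptual board must hold pairwise-distinct
-- values.
rowsAllDifferent :: (EqR x b, MonadCell m) => [[Prop m x]] -> Prop m b
rowsAllDifferent = all' distinct
```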
(.%.) :: (IntegralR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x infixl 7 Source #
Given two propagator networks, produce a new network that focuses on the modulo of the two given networks' integral foci.
... It's mod
lifted over propagator networks.
(.*.) :: (Num x, IntegralR x) => Prop m x -> Prop m x -> Prop m x infixl 7 Source #
Given two propagator networks, produce a new network that focuses on the product between the two given networks' integral foci.
... It's '(*)' lifted over propagator networks. Crucially, the reverse information flow uses integral division, which should work the same way as div.
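A sketch of that reverse flow via backward (assuming the Defined instance of IntegralR and Prop's numeric-literal instance):

```haskell
-- A sketch: solve x * 3 == 12 over integers; the reverse direction of (.*.)
-- recovers x by integral division.
solveFactor :: Maybe (Defined Int)
solveFactor = backward (.*. 3) 12
-- plausibly Just (Exactly 4)
```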
(./.) :: (IntegralR x, MonadCell m) => Prop m x -> Prop m x -> Prop m x infixl 7 Source #
Given two propagator networks, produce a new network that focuses on the division of the two given networks' integral foci.
... It's div
lifted over propagator networks.
abs' :: (AbsR x, MonadCell m) => Prop m x -> Prop m x Source #
Produce a network that focuses on the absolute value of another network's focus.
... It's abs
lifted over propagator networks.