hasktorch-indef-0.0.1.0: Core Hasktorch abstractions wrapping FFI bindings

Copyright: (c) Sam Stites 2017
License: BSD3
Maintainer: sam@stites.io
Stability: experimental
Portability: non-portable
Safe Haskell: None
Language: Haskell2010

Torch.Indef.Dynamic.Tensor

Description

This module provides the core operations for handling numeric data in dynamic tensors.

A Dynamic is a multi-dimensional matrix without static type-level dimensions. The number of dimensions is unlimited (up to what can be created using LongStorage).

Documentation

_clearFlag :: Dynamic -> Int8 -> IO () Source #

Clears the internal flags on a tensor. Uses bitwise operators for flags.

tensordata :: Dynamic -> [HsReal] Source #

Get the underlying data of the tensor as a Haskell list.

NOTE: This _cannot_ use a Tensor's storage size because ATen's Storage allocates up to the next 64-byte line on the CPU (needs reference; this is the unofficial response from @soumith in Slack).

get1d :: Dynamic -> Word -> Maybe HsReal Source #

get a value at the given index from a 1-dimensional tensor

get2d :: Dynamic -> Word -> Word -> Maybe HsReal Source #

get a value at the given indices from a 2-dimensional tensor

get3d :: Dynamic -> Word -> Word -> Word -> Maybe HsReal Source #

get a value at the given indices from a 3-dimensional tensor

get4d :: Dynamic -> Word -> Word -> Word -> Word -> Maybe HsReal Source #

get a value at the given indices from a 4-dimensional tensor

getDim :: Dynamic -> Dims (i :+ ds :: [Nat]) -> Maybe HsReal Source #

get a value from a dynamic tensor at a given index, Dims d.
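A minimal usage sketch of the fixed-rank accessors, assuming this module is instantiated at a concrete element type (so HsReal is, e.g., Double) and that indices are 0-based as in the underlying TH calls; newWithSize2d and set2d_ are documented further below:

getDemo :: IO ()
getDemo = do
  let t = newWithSize2d 2 3   -- uninitialized 2x3 tensor
  set2d_ t 1 2 42             -- write 42 at row 1, column 2
  print (get2d t 1 2)         -- expected: Just 42.0
  print (get3d t 0 0 0)       -- rank mismatch for a 2d tensor, so presumably Nothing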

isContiguous :: Dynamic -> Bool Source #

whether or not the tensor is contiguous in memory.

isSameSizeAs :: Dynamic -> Dynamic -> Bool Source #

check to see if two tensors are the same size as each other.

isSetTo :: Dynamic -> Dynamic -> Bool Source #

Returns true iff the Tensor is set to the argument Tensor.

Note: this is only true if the tensors are the same size, have the same strides and share the same storage and offset.

isSize :: Dynamic -> LongStorage -> Bool Source #

check to see if the tensor is the same size as the LongStorage.

nDimension :: Dynamic -> Word Source #

Returns the number of dimensions in a Tensor.

nElement :: Dynamic -> Word64 Source #

Returns the number of elements in a Tensor.

_narrow Source #

Arguments

:: Dynamic

return tensor to mutate (C-style)

-> Dynamic

source tensor used for data

-> Word

dimension to operate on

-> Int64 
-> Size 
-> IO () 

Warning: hasktorch devs have not yet made this safe. You are warned.

returns a tensor which shares the same Storage as the original. Hence, any modification in the memory of the sub-tensor will have an impact on the primary tensor, and vice-versa. These methods are very fast, as they do not involve any memory copy.

empty :: Dynamic Source #

Returns an empty tensor.

_expand Source #

Arguments

:: Dynamic

return tensor to mutate inplace.

-> Dynamic

source tensor to expand

-> IndexStorage

how to expand the tensor.

-> IO () 

Expanding a tensor does not allocate new memory, but only creates a new view on the existing tensor where singleton dimensions can be expanded to multiple ones by setting the stride to 0. Any dimension that has size 1 can be expanded to arbitrary value without any new memory allocation. Attempting to expand along a dimension that does not have size 1 will result in an error which we do not currently handle in hasktorch.

_expandNd :: NonEmpty Dynamic -> NonEmpty Dynamic -> Int -> IO () Source #

FIXME: doublecheck what this does.

newClone :: Dynamic -> Dynamic Source #

purely clone a tensor

newContiguous :: Dynamic -> Dynamic Source #

purely clone a tensor to have a contiguous memory layout.

newNarrow Source #

Arguments

:: Dynamic

source tensor

-> Word

dimension to operate over

-> Int64 
-> Size 
-> IO Dynamic

return tensor, linked by storage to source tensor

Warning: hasktorch devs have not yet made this safe. You are warned.

returns a tensor which shares the same Storage as the original. Hence, any modification in the memory of the sub-tensor will have an impact on the primary tensor, and vice-versa. These methods are very fast, as they do not involve any memory copy.

newSelect Source #

Arguments

:: Dynamic

source tensor

-> Word

dimension to operate over

-> Int64 
-> IO Dynamic

return tensor, linked by storage to source tensor

Warning: hasktorch devs have not yet made this safe. You are warned.

returns a tensor which shares the same Storage as the original. Hence, any modification in the memory of the sub-tensor will have an impact on the primary tensor, and vice-versa. These methods are very fast, as they do not involve any memory copy.
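A sketch of the storage-sharing behaviour described above, under the same assumptions as the earlier example (concrete instantiation, 0-based indices): a write through the selected slice should be visible from the parent tensor.

selectShares :: IO ()
selectShares = do
  let t = newWithSize2d 2 2
  row0 <- newSelect t 0 0   -- 1d view of row 0 (dimension 0, index 0)
  set1d_ row0 1 7           -- write through the view
  print (get2d t 0 1)       -- expected Just 7.0: the parent tensor sees the write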

newSizeOf :: Dynamic -> IndexStorage Source #

get the sizes of each dimension

FIXME: doublecheck this

newStrideOf :: Dynamic -> IndexStorage Source #

get the strides of each dimension

FIXME: doublecheck this

newTranspose :: Dynamic -> Word -> Word -> Dynamic Source #

pure version of _transpose
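Because transposition only swaps sizes and strides over the same storage, the result is typically no longer contiguous; newContiguous (above) compacts it into a fresh layout. A small sketch under the same assumptions as the earlier examples:

transposeDemo :: ([Word], Bool, Bool)
transposeDemo =
  let t  = newWithSize2d 2 3
      tT = newTranspose t 0 1          -- swap dimensions 0 and 1
  in (shape tT, isContiguous tT, isContiguous (newContiguous tT))
  -- expected: ([3,2], False, True)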

newUnfold Source #

Arguments

:: Dynamic

source tensor

-> Word

dimension to operate on

-> Int64 
-> Int64 
-> Dynamic

return tensor

pure version of _unfold

newView :: Dynamic -> IndexStorage -> IO Dynamic Source #

Warning: hasktorch devs have not yet made this safe. You are warned.

Creates a view with different dimensions of the storage associated with tensor, returning a new tensor.

FIXME: I think resizeAs is the non-cloning version of this function. See https://github.com/torch/torch7/blob/master/doc/tensor.md#result-viewresult-tensor-sizes for more.

NOTE(stites): I think this API can only be kept pure via linear types.

newWithSize :: IndexStorage -> IndexStorage -> Dynamic Source #

create an uninitialized tensor with the given size and strides (?)

FIXME: doublecheck what the IndexStorages stand for

newWithSize1d :: Word -> Dynamic Source #

create an uninitialized 1d tensor

newWithSize2d :: Word -> Word -> Dynamic Source #

create an uninitialized 2d tensor

newWithSize3d :: Word -> Word -> Word -> Dynamic Source #

create an uninitialized 3d tensor

newWithSize4d :: Word -> Word -> Word -> Word -> Dynamic Source #

create an uninitialized 4d tensor

newWithStorage :: Storage -> StorageOffset -> IndexStorage -> IndexStorage -> Dynamic Source #

create a new tensor with the given size and strides, storage offset and storage.

newWithStorage1d :: Storage -> StorageOffset -> (Size, Stride) -> Dynamic Source #

create a new 1d tensor with the given storage's first dimension.

newWithStorage2d :: Storage -> StorageOffset -> (Size, Stride) -> (Size, Stride) -> Dynamic Source #

create a new 2d tensor with the given storage's first 2 dimensions.

newWithStorage3d :: Storage -> StorageOffset -> (Size, Stride) -> (Size, Stride) -> (Size, Stride) -> Dynamic Source #

create a new 3d tensor with the given storage's first 3 dimensions.

newWithStorage4d :: Storage -> StorageOffset -> (Size, Stride) -> (Size, Stride) -> (Size, Stride) -> (Size, Stride) -> Dynamic Source #

create a new 4d tensor with the given storage's first 4 dimensions.
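A sketch of how the (Size, Stride) pairs describe a layout over existing storage: here the storage behind a rank-1 tensor with at least six elements is reinterpreted as a row-major 2x3 matrix. This assumes Size and Stride are numeric type aliases (so the literals typecheck) and reuses storage and storageOffset, documented further below:

asMatrix2x3 :: Dynamic -> Dynamic
asMatrix2x3 t =
  -- dimension 0: size 2, stride 3 (jump a whole row); dimension 1: size 3, stride 1
  newWithStorage2d (storage t) (storageOffset t) (2, 3) (3, 1)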

newWithTensor :: Dynamic -> IO Dynamic Source #

Warning: this function causes the input tensor to be impure

create a new tensor with the given tensor's underlying storage.

_resize :: Dynamic -> IndexStorage -> IndexStorage -> IO () Source #

Resize the tensor according to the given LongStorage size (and strides?). FIXME: doublecheck what the IndexStorages stand for

resize1d_ :: Dynamic -> Word -> IO () Source #

resize a tensor so that it is 1-dimensional with the given size.

resize2d_ :: Dynamic -> Word -> Word -> IO () Source #

resize a tensor so that it is 2-dimensional with the given sizes.

resize3d_ :: Dynamic -> Word -> Word -> Word -> IO () Source #

resize a tensor so that it is 3-dimensional with the given sizes.

resize4d_ :: Dynamic -> Word -> Word -> Word -> Word -> IO () Source #

resize a tensor so that it is 4-dimensional with the given sizes.

resize5d_ :: Dynamic -> Word -> Word -> Word -> Word -> Word -> IO () Source #

resize a tensor so that it is 5-dimensional with the given sizes.
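A small sketch of the fixed-rank resizes, under the same assumptions as the earlier examples; note that growing a tensor may also reallocate its storage:

resizeDemo :: IO ()
resizeDemo = do
  let t = newWithSize1d 6
  print (shape t)    -- expected [6]
  resize2d_ t 2 3    -- reinterpret the six elements as a 2x3 tensor
  print (shape t)    -- expected [2,3]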

resizeAs_ Source #

Arguments

:: Dynamic

tensor to mutate inplace

-> Dynamic

tensor used for its shape

-> IO () 

Resize the tensor as the given tensor.

resizeNd_ Source #

Arguments

:: Dynamic

tensor to resize inplace.

-> Int32

unknown argument. FIXME: Someone needs to find this out.

-> [Size]

new sizes to update

-> [Stride]

new strides to update.

-> IO () 

resize a tensor with given strides, sizes and a magical parameter.

FIXME: Someone needs to find out what the magical parameter is.

retain :: Dynamic -> IO () Source #

Increment the reference counter of the tensor.

From: https://github.com/torch/torch7/blob/aed31711c6b8846b8337a263a7f9f998697994e7/doc/tensor.md#reference-counting

Tensors are reference-counted. It means that each time an object (C or the Lua state) needs to keep a reference over a tensor, the corresponding tensor reference counter will be increased. The reference counter is decreased when the object does not need the tensor anymore.

These methods should be used with extreme care. In general, they should never be called, except if you know what you are doing, as the handling of references is done automatically. They can be useful in threaded environments. Note that these methods are atomic operations.

_select Source #

Arguments

:: Dynamic

return tensor which is mutated inplace (C-Style)

-> Dynamic

source tensor

-> Word

dimension to operate over

-> Word 
-> IO () 

Warning: hasktorch devs have not yet made this safe. You are warned.

returns a tensor which shares the same Storage as the original. Hence, any modification in the memory of the sub-tensor will have an impact on the primary tensor, and vice-versa. These methods are very fast, as they do not involve any memory copy.

_set Source #

Arguments

:: Dynamic

the source tensor which is mutated inplace

-> Dynamic

the tensor whose storage is going to be referenced.

-> IO () 

Warning: hasktorch devs have not yet made this safe. You are warned.

set the source tensor's storage to another tensor.

set1d_ Source #

Arguments

:: Dynamic

source tensor

-> Word

rank-1 index

-> HsReal

value to put

-> IO () 

set a value at the given index of a 1-dimensional tensor, inplace.

set2d_ Source #

Arguments

:: Dynamic

source tensor

-> Word

rank-1 index

-> Word

rank-2 index

-> HsReal

value to put

-> IO () 

set a value at the given indices of a 2-dimensional tensor, inplace.

set3d_ Source #

Arguments

:: Dynamic

source tensor

-> Word

rank-1 index

-> Word

rank-2 index

-> Word

rank-3 index

-> HsReal

value to put

-> IO () 

set a value at the given indices of a 3-dimensional tensor, inplace.

set4d_ Source #

Arguments

:: Dynamic

source tensor

-> Word

rank-1 index

-> Word

rank-2 index

-> Word

rank-3 index

-> Word

rank-4 index

-> HsReal

value to put

-> IO () 

set a value at the given indices of a 4-dimensional tensor, inplace.

setFlag_ :: Dynamic -> Int8 -> IO () Source #

set the flags on a tensor inplace

setStorage_ :: Dynamic -> Storage -> StorageOffset -> IndexStorage -> IndexStorage -> IO () Source #

Warning: mutating a tensor's storage can make your program unsafe. You are warned.

Set the storage of a tensor.

FIXME: doublecheck what the IndexStorages stand for

setStorage1d_ :: Dynamic -> Storage -> StorageOffset -> (Size, Stride) -> IO () Source #

Warning: mutating a tensor's storage can make your program unsafe. You are warned.

Set the storage of a tensor, only referencing 1 dimension of storage

setStorage2d_ :: Dynamic -> Storage -> StorageOffset -> (Size, Stride) -> (Size, Stride) -> IO () Source #

Warning: mutating a tensor's storage can make your program unsafe. You are warned.

Set the storage of a tensor, only referencing 2 dimensions of storage

setStorage3d_ :: Dynamic -> Storage -> StorageOffset -> (Size, Stride) -> (Size, Stride) -> (Size, Stride) -> IO () Source #

Warning: mutating a tensor's storage can make your program unsafe. You are warned.

Set the storage of a tensor, only referencing 3 dimensions of storage

setStorage4d_ :: Dynamic -> Storage -> StorageOffset -> (Size, Stride) -> (Size, Stride) -> (Size, Stride) -> (Size, Stride) -> IO () Source #

Warning: mutating a tensor's storage can make your program unsafe. You are warned.

Set the storage of a tensor, only referencing 4 dimensions of storage

setStorageNd_ Source #

Arguments

:: Dynamic

tensor to mutate, inplace

-> Storage

storage to set

-> StorageOffset

offset of the storage to start from

-> Word

dimension... to operate over? to start from? (TODO: allow for "unset" dimension)

-> [Size]

sizes to use with the storage

-> [Stride]

strides to use with the storage

-> IO () 

Warning: mutating a tensor's storage can make your program unsafe. You are warned.

Set the storage of a tensor, referencing any number of dimensions of storage

size Source #

Arguments

:: Dynamic

tensor to inspect

-> Word

dimension to get

-> Word

size of the dimension

get the size of a tensor's specific dimension.

FIXME: this can throw an exception if the dimension is out of bounds.

sizeDesc :: Dynamic -> IO DescBuff Source #

primarily used for debugging. Get the size description from a C call.

_squeeze :: Dynamic -> Dynamic -> IO () Source #

Removes all singleton dimensions of the tensor.

squeeze1d_ Source #

Arguments

:: Dynamic

tensor to mutate

-> Word

dimension to squeeze

-> IO () 

Removes a singleton dimension of the tensor at the given dimension.

_squeeze1d Source #

Arguments

:: Dynamic

tensor to mutate as return (C-Style)

-> Dynamic

source tensor

-> Word

dimension to squeeze

-> IO () 

Removes a singleton dimension of the tensor at the given dimension.

storage :: Dynamic -> Storage Source #

Warning: extracting and using a tensor's storage can make your program unsafe. You are warned.

get the underlying storage of a tensor

storageOffset :: Dynamic -> StorageOffset Source #

get the storage offset of a tensor

stride Source #

Arguments

:: Dynamic

tensor to query

-> Word

dimension of tensor

-> IO Stride

stride of dimension

Returns the jump necessary to go from one element to the next one in the specified dimension dim.
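For a freshly allocated contiguous tensor the strides follow a row-major layout, so the innermost dimension has stride 1. A minimal sketch under the same assumptions as the earlier examples:

strideDemo :: IO (Word, Word, Stride, Stride)
strideDemo = do
  let t = newWithSize2d 2 3   -- contiguous 2x3 tensor
  s0 <- stride t 0            -- expected 3: skip a whole row to advance along dimension 0
  s1 <- stride t 1            -- expected 1: neighbouring elements along dimension 1
  pure (size t 0, size t 1, s0, s1)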

_transpose Source #

Arguments

:: Dynamic

tensor to mutate into the result.

-> Dynamic

source tensor to use for data.

-> Word

dim1

-> Word

dim2

-> IO () 

Returns a tensor where dimensions dim1 and dim2 have been swapped.

_unfold Source #

Arguments

:: Dynamic

tensor to mutate into the result.

-> Dynamic

source tensor to use for data.

-> Word

dimension to operate on

-> Size 
-> Step 
-> IO () 

Returns a tensor which contains all slices of size size in the dimension dim. Step between two slices is given by step.

If sizedim is the original size of dimension dim, the size of dimension dim in the returned tensor will be (sizedim - size) / step + 1

An additional dimension of size size is appended in the returned tensor.

FIXME: this still takes C-like arguments which mutates the first arg inplace.
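A worked instance of the formula above using the pure newUnfold: unfolding a length-7 vector along dimension 0 with size 2 and step 1 gives (7 - 2) / 1 + 1 = 6 slices of length 2. This sketch assumes the two undocumented Int64 arguments of newUnfold are size and then step, mirroring the Torch C API:

unfoldDemo :: [Word]
unfoldDemo =
  let v = newWithSize1d 7
  in shape (newUnfold v 0 2 1)   -- expected [6,2]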

unsqueeze1d_ Source #

Arguments

:: Dynamic

tensor to mutate

-> Word

dimension to unsqueeze

-> IO () 

unsqueeze a tensor inplace

_unsqueeze1d Source #

Arguments

:: Dynamic

tensor to mutate into the result.

-> Dynamic

source tensor to use for data.

-> Word

dimension to unsqueeze

-> IO () 

unsqueeze a tensor, adding a singleton dimension at the specified dimval.

shape :: Dynamic -> [Word] Source #

return a runtime shape representing the dimensions of a Dynamic
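A short sketch relating shape to nDimension and nElement, under the same assumptions as the earlier examples (Word64 comes from Data.Word):

shapeDemo :: ([Word], Word, Word64)
shapeDemo =
  let t = newWithSize3d 2 3 4
  in (shape t, nDimension t, nElement t)   -- expected ([2,3,4], 3, 24)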

setStorageDim_ :: Dynamic -> Storage -> StorageOffset -> [(Size, Stride)] -> IO () Source #

Warning: mutating a tensor's storage can make your program unsafe. You are warned.

set the storage dimensionality of a dynamic tensor, inplace, to any new size and stride pair.

setDim_ :: Dynamic -> Dims (d :: [Nat]) -> HsReal -> IO () Source #

set a value of a dynamic tensor, inplace, with any dimensionality.

resizeDim_ :: Dynamic -> Dims (d :: [Nat]) -> IO () Source #

resize a dynamic tensor, inplace, to any new dimensionality

vectorEIO :: [HsReal] -> ExceptT String IO Dynamic Source #

create a 1d Dynamic tensor from a list of elements.

FIXME construct this with TH, not by using setDim inplace (one-by-one) which might be doing a second linear pass. FIXME: CUDA doesn't like the storage allocation:

matrix :: [[HsReal]] -> ExceptT String IO Dynamic Source #

create a 2d Dynamic tensor from a list of list of elements.

cuboid :: [[[HsReal]]] -> ExceptT String IO Dynamic Source #

create a 3d Dynamic tensor (ie: rectangular cuboid) from a nested list of elements.

hyper :: [[[[HsReal]]]] -> ExceptT String IO Dynamic Source #

create a 4d Dynamic tensor (ie: hyperrectangle) from a nested list of elements.
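A sketch of building tensors from Haskell lists and reading the data back with tensordata, assuming a concrete instantiation with HsReal ~ Double; runExceptT comes from Control.Monad.Trans.Except (transformers):

import Control.Monad.Trans.Except (runExceptT)

fromLists :: IO ()
fromLists = do
  ev <- runExceptT (vectorEIO [1, 2, 3, 4])          -- rank-1 tensor with 4 elements
  em <- runExceptT (matrix [[1, 2, 3], [4, 5, 6]])   -- 2x3 tensor
  case (ev, em) of
    (Right v, Right m) -> do
      print (shape v, shape m)                       -- expected ([4],[2,3])
      print (tensordata v)                           -- expected [1.0,2.0,3.0,4.0]
    _ -> putStrLn "tensor construction failed"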

getDimsList :: Integral i => Dynamic -> [i] Source #

get the runtime dimension list of a dynamic tensor

getSomeDims :: Dynamic -> SomeDims Source #

alias to getDimsList which wraps the dimensions list in a SomeDims

new :: Dims (d :: [Nat]) -> Dynamic Source #

create a new dynamic tensor of size Dims d

setDim'_ :: Dynamic -> SomeDims -> HsReal -> IO () Source #

set a specific runtime SomeDims position of a dynamic tensor.

resizeDim'_ :: Dynamic -> SomeDims -> IO () Source #

resize a dynamic tensor inplace with runtime SomeDims representation of its new shape.

new' :: SomeDims -> Dynamic Source #

build a new tensor with a runtime SomeDims

resizeAs Source #

Arguments

:: Dynamic

src tensor to mutate

-> Dynamic

a tensor only used for its shape

-> IO Dynamic

a new copy of src with the shape tensor's shape

resize a tensor to take the shape of the second Dynamic argument. The result is a new copy of src (see above), so conceptually this behaves like a pure function even though it runs in IO.
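A sketch under the same assumptions as the earlier examples, using one tensor purely as a shape donor:

resizeAsDemo :: IO [Word]
resizeAsDemo = do
  let src    = newWithSize1d 6
      target = newWithSize2d 2 3   -- only its shape is used
  r <- resizeAs src target
  pure (shape r)                   -- expected [2,3]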

Helper functions

withInplace :: (Dynamic -> IO ()) -> Dims (d :: [Nat]) -> IO Dynamic Source #

run a function with a dynamic tensor and storage's underlying implementation details:

withDynamicStateAndStorage :: Sig.Dynamic -> Sig.Storage -> (Ptr Sig.CState -> Ptr Sig.CTensor -> Ptr Sig.CStorage -> IO x) -> IO x
withDynamicStateAndStorage t s fn = flip with pure $ do
  s' <- managedState
  t' <- managedTensor t
  liftIO $ withForeignPtr (Sig.cstorage s) (fn s' t')

exported helper function. Not actually "inplace": this is actually "with return and static dimensions".

withInplace' :: (Dynamic -> IO ()) -> SomeDims -> IO Dynamic Source #

exported helper function. Not actually "inplace": this is actually "with return and runtime dimensions".

twice :: Dynamic -> (Dynamic -> Dynamic -> IO ()) -> IO Dynamic Source #

exported helper function. This is actually inplace

withEmpty' :: (Dynamic -> IO ()) -> IO Dynamic Source #

exported helper function. Should be renamed to newFromSize.

withEmpty :: Dynamic -> (Dynamic -> IO ()) -> IO Dynamic
withEmpty t op = let r = new' (getSomeDimsList t) in op r >> pure r

exported helper function. We can get away with this some of the time, when Torch does the resizing in C, but you need to look at the C implementation.

Orphan instances

IsList Dynamic Source #

Associated Types

type Item Dynamic :: Type #

Show Dynamic Source #