zstd-0.1.1.2: Haskell bindings to the Zstandard compression algorithm

Copyright     (c) 2016-present Facebook Inc. All rights reserved.
License       BSD3
Maintainer    bryano@fb.com
Stability     experimental
Portability   GHC
Safe Haskell  None
Language      Haskell2010

Codec.Compression.Zstd.Efficient


Description

These functions allow for pre-allocation and reuse of relatively expensive data structures, such as compression and decompression contexts and dictionaries.

If your application mostly deals with small payloads and is particularly sensitive to latency or throughput, using these pre-allocated structures may make a noticeable difference to performance.


Basic entry points

data Decompress Source #

The result of a decompression operation.

Constructors

Skip

Either the compressed frame was empty, or it was compressed in streaming mode and so its size is not known.

Error String

An error occurred.

Decompress ByteString

The payload was successfully decompressed.

decompressedSize :: ByteString -> Maybe Int Source #

Return the decompressed size of a compressed payload, as stored in the payload's header.

The result is Nothing if the decompressed size is not known (most likely because the payload was compressed using a streaming API), if the payload is empty, or if the size is too large to fit in an Int.

Note: this value should not be trusted, as it can be controlled by an attacker.
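As a minimal sketch (assuming the zstd package is installed, and using compressCCtx from this module to produce a frame whose header records its size):

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Inspect the decompressed size recorded in a frame's header.
import Codec.Compression.Zstd.Efficient

main :: IO ()
main = withCCtx $ \cctx -> do
  payload <- compressCCtx cctx 3 "hello, zstd"
  -- The single-frame API records the original size in the header, so
  -- this should print Just 11 for the 11-byte input -- but, as noted
  -- above, the value is attacker-controllable and must not be trusted.
  print (decompressedSize payload)
```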

maxCLevel :: Int Source #

The maximum compression level supported by the library.

Cheaper operations using contexts

Compression

data CCtx Source #

Compression context.

withCCtx :: (CCtx -> IO a) -> IO a Source #

Allocate a compression context, run an action that may reuse the context as many times as it needs, then free the context.

compressCCtx Source #

Arguments

:: CCtx

Compression context.

-> Int

Compression level. Must be >= 1 and <= maxCLevel.

-> ByteString

Payload to compress.

-> IO ByteString 

Compress the given data as a single zstd compressed frame.
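A sketch of the intended usage pattern (the helper name compressMany is illustrative, not part of the API): allocate one context and reuse it across many payloads.

```haskell
-- Compress many small payloads with a single reused context.
import Codec.Compression.Zstd.Efficient
import Data.ByteString (ByteString)

compressMany :: Int -> [ByteString] -> IO [ByteString]
compressMany level payloads =
  withCCtx $ \cctx ->
    -- The same CCtx serves every payload, so the context's internal
    -- state is allocated once rather than once per call.
    mapM (compressCCtx cctx level) payloads
```

For workloads dominated by small payloads, this avoids paying the context setup cost on every call, which is the point of this module.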

Decompression

data DCtx Source #

Decompression context.

withDCtx :: (DCtx -> IO a) -> IO a Source #

Allocate a decompression context, run an action that may reuse the context as many times as it needs, then free the context.

decompressDCtx Source #

Arguments

:: DCtx

Decompression context.

-> ByteString

Compressed payload.

-> IO Decompress 

Decompress a single-frame payload of known size. Typically this will be a payload that was compressed with compress.

Note: This function is not capable of decompressing a payload generated by the streaming or lazy compression APIs.
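A round-trip sketch combining the two kinds of context (the helper name roundTrip is illustrative):

```haskell
-- Compress and decompress with reused contexts, handling all three
-- Decompress outcomes.
import Codec.Compression.Zstd.Efficient
import Data.ByteString (ByteString)

roundTrip :: ByteString -> IO Bool
roundTrip input =
  withCCtx $ \cctx ->
    withDCtx $ \dctx -> do
      compressed <- compressCCtx cctx 3 input
      result <- decompressDCtx dctx compressed
      case result of
        Decompress bs -> pure (bs == input)  -- expect the original bytes
        Skip          -> pure False          -- empty or unknown-size frame
        Error _       -> pure False
```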

Dictionary-based compression

data Dict Source #

Compression dictionary.

Instances
Eq Dict Source # 
Instance details

Defined in Codec.Compression.Zstd.Types

Methods

(==) :: Dict -> Dict -> Bool #

(/=) :: Dict -> Dict -> Bool #

Ord Dict Source # 
Instance details

Defined in Codec.Compression.Zstd.Types

Methods

compare :: Dict -> Dict -> Ordering #

(<) :: Dict -> Dict -> Bool #

(<=) :: Dict -> Dict -> Bool #

(>) :: Dict -> Dict -> Bool #

(>=) :: Dict -> Dict -> Bool #

max :: Dict -> Dict -> Dict #

min :: Dict -> Dict -> Dict #

Read Dict Source # 
Instance details

Defined in Codec.Compression.Zstd.Types

Show Dict Source # 
Instance details

Defined in Codec.Compression.Zstd.Types

Methods

showsPrec :: Int -> Dict -> ShowS #

show :: Dict -> String #

showList :: [Dict] -> ShowS #

NFData Dict Source # 
Instance details

Defined in Codec.Compression.Zstd.Types

Methods

rnf :: Dict -> () #

mkDict :: ByteString -> Dict Source #

Smart constructor. Wraps the raw contents of a dictionary in a Dict.

trainFromSamples Source #

Arguments

:: Int

Maximum size of the compression dictionary to create. The actual dictionary returned may be smaller.

-> [ByteString]

Samples to train with.

-> Either String Dict 

Create and train a compression dictionary from a collection of samples.

To create a well-trained dictionary, here are some useful guidelines to keep in mind:

  • A reasonable dictionary size is in the region of 100 KB. (Trying to specify a dictionary size of less than a few hundred bytes will probably fail.)
  • To train the dictionary well, it is best to supply a few thousand training samples.
  • The combined size of all training samples should be 100 or more times larger than the size of the dictionary.
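A training sketch following those guidelines (the sample source is left abstract; real training needs thousands of samples whose combined size is around 100x the dictionary size):

```haskell
-- Train a ~100 KB dictionary from sample payloads and report its ID.
import Codec.Compression.Zstd.Efficient
import Data.ByteString (ByteString)

trainDict :: [ByteString] -> IO ()
trainDict samples =
  case trainFromSamples (100 * 1024) samples of
    Left err   -> putStrLn ("training failed: " ++ err)
    Right dict -> print (getDictID dict)  -- Just <id> for a valid dictionary
```

Note that training fails with a Left when given too few or too small samples, so callers should handle that case rather than assume success.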

getDictID :: Dict -> Maybe Word Source #

Return the identifier for the given dictionary, or Nothing if not a valid dictionary.

Basic pure API

compressUsingDict Source #

Arguments

:: Dict

Compression dictionary.

-> Int

Compression level. Must be >= 1 and <= maxCLevel.

-> ByteString

Payload to compress.

-> ByteString 

Compress the given data as a single zstd compressed frame, using a prebuilt dictionary.

decompressUsingDict Source #

Arguments

:: Dict

Dictionary.

-> ByteString

Payload to decompress.

-> Decompress 

Decompress a single-frame payload of known size, using a prebuilt dictionary. Typically this will be a payload that was compressed with compressUsingDict.

Note: This function is not capable of decompressing a payload generated by the streaming or lazy compression APIs.
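A pure round trip with an ad-hoc dictionary (the dictionary bytes below are placeholder content, not a trained dictionary; zstd accepts raw content as a dictionary, though a trained one compresses far better):

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Pure dictionary-based round trip.
import Codec.Compression.Zstd.Efficient

main :: IO ()
main = do
  let dict       = mkDict "shared context bytes shared context bytes"
      compressed = compressUsingDict dict 3 "payload to compress"
  case decompressUsingDict dict compressed of
    Decompress bs -> print bs  -- the original payload
    Skip          -> putStrLn "empty or unknown-size frame"
    Error e       -> putStrLn ("error: " ++ e)
```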

Pre-digested dictionaries

data CDict Source #

A pre-digested compression dictionary.

createCDict Source #

Arguments

:: Int

Compression level.

-> Dict

Dictionary.

-> CDict 

Create a pre-digested compression dictionary.

compressUsingCDict Source #

Arguments

:: CCtx

Compression context.

-> CDict

Compression dictionary.

-> ByteString

Payload to compress.

-> IO ByteString 

Compress the given data as a single zstd compressed frame, using a pre-built, pre-digested dictionary.

data DDict Source #

A pre-digested decompression dictionary.

createDDict Source #

Arguments

:: Dict

Dictionary.

-> DDict 

Create a pre-digested decompression dictionary.

decompressUsingDDict Source #

Arguments

:: DCtx

Decompression context.

-> DDict

Decompression dictionary.

-> ByteString

Payload to decompress.

-> IO Decompress 

Decompress a single-frame payload of known size, using a pre-built, pre-digested dictionary. Typically this will be a payload that was compressed with compressUsingCDict.

Note: This function is not capable of decompressing a payload generated by the streaming or lazy compression APIs.
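Putting the pre-digested pieces together (again with placeholder dictionary bytes rather than a trained dictionary):

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Digest the dictionary once, then reuse both digests and contexts.
import Codec.Compression.Zstd.Efficient

main :: IO ()
main = do
  let dict  = mkDict "shared context bytes shared context bytes"
      cdict = createCDict 3 dict  -- digested for compression, at level 3
      ddict = createDDict dict    -- digested for decompression
  withCCtx $ \cctx ->
    withDCtx $ \dctx -> do
      compressed <- compressUsingCDict cctx cdict "hello"
      result     <- decompressUsingDDict dctx ddict compressed
      case result of
        Decompress bs -> print bs  -- the original payload
        Skip          -> putStrLn "empty or unknown-size frame"
        Error e       -> putStrLn ("error: " ++ e)
```

Digesting the dictionary once up front moves the per-call dictionary setup cost out of the hot path, complementing the context reuse shown earlier.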