cabal-cache: CI Assistant for Haskell projects

[ bsd3, development, library, program ]

CI Assistant for Haskell projects. Implements package caching.



Modules

  • HaskellWorks
    • CabalCache
      • AWS
        • HaskellWorks.CabalCache.AWS.Env
        • HaskellWorks.CabalCache.AWS.Error
        • HaskellWorks.CabalCache.AWS.S3
          • HaskellWorks.CabalCache.AWS.S3.URI
      • HaskellWorks.CabalCache.AppError
      • Concurrent
        • HaskellWorks.CabalCache.Concurrent.DownloadQueue
        • HaskellWorks.CabalCache.Concurrent.Fork
        • HaskellWorks.CabalCache.Concurrent.Type
      • HaskellWorks.CabalCache.Core
      • Data
        • HaskellWorks.CabalCache.Data.List
      • HaskellWorks.CabalCache.Error
      • HaskellWorks.CabalCache.GhcPkg
      • HaskellWorks.CabalCache.Hash
      • IO
        • HaskellWorks.CabalCache.IO.Console
        • HaskellWorks.CabalCache.IO.File
        • HaskellWorks.CabalCache.IO.Lazy
        • HaskellWorks.CabalCache.IO.Tar
      • HaskellWorks.CabalCache.Location
      • HaskellWorks.CabalCache.Metadata
      • HaskellWorks.CabalCache.Options
      • HaskellWorks.CabalCache.Show
      • HaskellWorks.CabalCache.Store
      • HaskellWorks.CabalCache.Text
      • HaskellWorks.CabalCache.Topology
      • HaskellWorks.CabalCache.Types
      • HaskellWorks.CabalCache.URI
      • HaskellWorks.CabalCache.Version




Versions [RSS] (info)
Dependencies aeson (>= && <2.2), amazonka (>=1.6.1 && <1.7), amazonka-core (>=1.6.1 && <1.7), amazonka-s3 (>=1.6.1 && <1.7), attoparsec (>=0.14 && <0.15), base (>=4.7 && <5), bytestring (>= && <0.12), cabal-cache, cabal-install-parsers (>=0.4 && <0.6), conduit-extra (>= && <1.4), containers (>= && <0.7), cryptonite (>=0.25 && <1), deepseq (>= && <1.5), directory (>= && <1.4), exceptions (>=0.10.1 && <0.11), filepath (>=1.3 && <1.5), generic-lens (>= && <2.3), http-client (>=0.5.14 && <0.8), http-client-tls (>=0.3 && <0.4), http-types (>=0.12.3 && <0.13), lens (>=4.17 && <6), mtl (>=2.2.2 && <2.4), network-uri (>= && <2.8), oops (>=0.2 && <0.3), optparse-applicative (>=0.14 && <0.18), process (>= && <1.7), relation (>=0.5 && <0.6), resourcet (>=1.2.2 && <1.4), stm (>= && <3), stringsearch (>= && <0.4), temporary (>=1.3 && <1.4), text (>= && <2.1), topograph (>=1 && <2), transformers (>= && <0.7), unliftio (>=0.2.10 && <0.3) [details]
License BSD-3-Clause
Copyright John Ky 2019-2023
Author John Ky
Category Development
Home page
Source repo head: git clone
Uploaded by haskellworks at 2023-02-05T07:18:38Z
Executables cabal-cache
Downloads 10835 total (109 in the last 30 days)
Rating 2.0 (votes: 1) [estimated by Bayesian average]
Status Docs not available [build log]
All reported builds failed as of 2023-02-05 [all 2 reports]

Readme for cabal-cache-




Tool for caching built cabal new-build packages.

The tool is useful in development when you want to share the built Haskell package dependencies of a particular project with another developer, and also in CI, where caching reduces build times.

cabal-cache supports syncing to an archive directory or to an S3 bucket.


Several installation methods are available.

From source

cabal new-install cabal-cache


Download binaries from

Using Homebrew on Mac OS X

brew tap haskell-works/homebrew-haskell-works
brew update
brew install cabal-cache

Example usage

Syncing built packages with S3 requires an S3 bucket with AWS credentials stored in the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables. You should also know the AWS region the bucket was created in.
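In a CI environment the credentials might be supplied like this before invoking cabal-cache (the values below are placeholders, not real keys):

```shell
# Placeholder credentials -- substitute your own, or inject them
# via your CI system's secret store.
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
```

Most CI systems also let you set these as masked secrets on the job rather than in the script itself.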

Sync to archive

Change into your project directory.

Build the project with cabal v2-build. This will ensure your dependencies are built and will produce a plan.json file that is required for the cabal-cache tool to know which built packages to sync up.

Run the following command to sync to S3.

cabal-cache sync-to-archive --threads 16 --archive-uri s3://my-cabal-cache-bucket/archive --region Sydney

Run the following command to sync to archive directory.

cabal-cache sync-to-archive --threads 16 --archive-uri archive --region Sydney

Sync from S3

Change into your project directory.

Configure the project with cabal v2-configure. This will produce a plan.json file that is required for the cabal-cache tool to know which built packages to sync down.

Run the following command to sync from S3.

cabal-cache sync-from-archive --threads 16 --archive-uri s3://my-cabal-cache-bucket/archive --region Sydney

Run the following command to sync from archive directory.

cabal-cache sync-from-archive --threads 16 --archive-uri archive --region Sydney


To run against a different service, use something like:

cabal-cache sync-to-archive --threads 16 --archive-uri s3://my-cabal-cache-bucket/archive --host-port-override=443 --host-ssl-override=True

The archive

Archive tarball format

Built packages are stored in tarballs which contain the following files:

x ${compiler_id}/${package_id}/_CC_METADATA/store-path
x ${compiler_id}/lib/libHS${package_id}-*.dylib
x ${compiler_id}/${package_id}
x ${compiler_id}/package.db/${package_id}.conf
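As an illustration, a throwaway tarball with the same layout can be built and listed like this (the compiler and package ids here are made up):

```shell
# Recreate the archive tarball layout with dummy files
# (ghc-9.2.5 and pkg-1.0 are invented ids for illustration).
mkdir -p /tmp/cc-tar/ghc-9.2.5/pkg-1.0/_CC_METADATA /tmp/cc-tar/ghc-9.2.5/package.db
echo "/old/store" > /tmp/cc-tar/ghc-9.2.5/pkg-1.0/_CC_METADATA/store-path
touch /tmp/cc-tar/ghc-9.2.5/package.db/pkg-1.0.conf

# Pack and then list the contents, mirroring the listing above.
tar -C /tmp/cc-tar -czf /tmp/pkg-1.0.tar.gz ghc-9.2.5
tar tzf /tmp/pkg-1.0.tar.gz
```

The `tar tzf` listing shows the same `${compiler_id}/...` paths that a real cached package tarball contains.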

Aside from the files in the _CC_METADATA directory, everything else is copied verbatim from the corresponding location in the cabal store. This includes the conf file, which may contain absolute paths that would make the built package non-relocatable.

As a work-around, the tarball also includes the _CC_METADATA/store-path file, which records the cabal store path from which the cached package was derived.

Upon unpacking, cabal-cache will rewrite the conf file to contain the new store path, using the information stored in the _CC_METADATA/store-path file. The _CC_METADATA directory and its contents are also unpacked, making it easy to recognise packages that have been restored using cabal-cache.
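The rewrite can be pictured with a small shell sketch. This is not cabal-cache's actual implementation; the paths and the package id are invented, and a plain `sed` substitution stands in for the real rewriting logic:

```shell
# Set up a fake restored package: a recorded old store path and a
# .conf file that still references it (all names are invented).
mkdir -p /tmp/cc-demo/_CC_METADATA
echo "/old/store" > /tmp/cc-demo/_CC_METADATA/store-path
echo "library-dirs: /old/store/ghc-9.2.5/pkg-1.0/lib" > /tmp/cc-demo/pkg-1.0.conf

# Replace every occurrence of the recorded store path with the
# store path we are restoring into.
old_store=$(cat /tmp/cc-demo/_CC_METADATA/store-path)
new_store="/new/store"
sed -i "s|$old_store|$new_store|g" /tmp/cc-demo/pkg-1.0.conf
cat /tmp/cc-demo/pkg-1.0.conf
```

After the substitution the conf file references the new store path only, which is the property the real rewrite provides.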

Archive directory structure

The archive contains files in the following locations:


Both tarballs are identical. If both exist, the first may be a symlink to the second when stored on the filesystem.

The direct subdirectories of the archive are the ${archive_version}, for example v2. This is the version of the archive format, and corresponds to the major version of the cabal-cache package.

The next directory may be the ${store_hash} or the ${compiler_id}. If it is the ${store_hash} then the ${compiler_id} will be a subdirectory of that.

The ${store_hash} is the hash of the store path from which the cached package originally came.

cabal-cache will preferentially restore using this version if it is available and the ${store_hash} matches the cabal store path that is being restored to.

If the package matching the ${store_hash} cannot be found, cabal-cache will fall back to the version without the ${store_hash}.

A version without a ${store_hash} may not exist. See Caveats for more information.
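Putting the pieces together, the layout might look like this on disk (the hash and ids below are made up for illustration):

```
archive/
└── v2/                          # ${archive_version}
    ├── 0123abcd.../             # ${store_hash} of the originating store path
    │   └── ghc-9.2.5/           # ${compiler_id}
    │       └── pkg-1.0.tar.gz
    └── ghc-9.2.5/               # hash-free fallback location
        └── pkg-1.0.tar.gz
```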


Packages that use absolute paths to the cabal store

Packages sometimes do things that cause their built artefacts to contain absolute paths to the cabal store. This unfortunately makes such built packages non-relocatable.

It is recommended that you use a fixed cabal store path rather than the default $HOME/.cabal/store to avoid any potential issues.
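A sketch of what that might look like, assuming cabal-install's global --store-dir flag and cabal-cache's --store-path option (check your installed versions for the exact flag names):

```shell
# Build into a fixed store path shared by every machine that uses the cache.
cabal --store-dir=/opt/cabal/store v2-build

# Tell cabal-cache about the same store path when syncing.
cabal-cache sync-to-archive \
  --store-path /opt/cabal/store \
  --archive-uri s3://my-cabal-cache-bucket/archive \
  --region Sydney
```

Because every machine uses the same store path, any absolute paths baked into built artefacts remain valid after restoring.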

See for more information.

Following are examples of how this might happen:


Paths_$pkgname modules have the absolute path to the package in the cabal store embedded within them, which means that packages using some features of this module are not relocatable, depending on what they do.

Packages may query this module to get access to the package's cabal store share directory, which contains data files that the package can read at runtime. Using cabal-cache for such packages could mean that the package will be unable to find those data files.

To protect against this, cabal-cache will by default not sync packages down from the archive if the package's cabal store share directory contains unusual files or directories, unless the ${store_hash} matches. Currently only the doc subdirectory is considered usual. More exceptions may be added later.