csv-conduit: A flexible, fast, conduit-based CSV parser library for Haskell.

[ bsd3, conduit, csv, data, library, text ]

CSV files are the de-facto standard in many situations involving data transfer, particularly when dealing with enterprise applications or disparate database systems.

While there are a number of CSV libraries in Haskell, at the time of this project's start in 2010, there wasn't one that provided all of the following:

  • Full flexibility in quote characters, separators, input/output

  • Constant space operation

  • Robust parsing, correctness and error resiliency

  • Convenient interface that supports a variety of use cases

  • Fast operation

This library is an attempt to close these gaps. Please note that this library started its life based on the enumerator package and was later ported to work with conduit instead. In the process, it was greatly simplified thanks to the modular nature of the conduit library.

Following the port to conduit, the library also gained the ability to parameterize on the stream type, so it works with both ByteString and Text.
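
For instance, the same intoCSV conduit can be specialized at either stream type. Here is a minimal sketch of the two specializations (MonadThrow comes from the exceptions package, and Row t is the library's list-of-fields row type):

import Control.Monad.Catch (MonadThrow)
import Data.ByteString (ByteString)
import Data.Conduit (ConduitT)
import Data.CSV.Conduit
import Data.Text (Text)

-- The same parser, specialized at a ByteString stream ...
parseBytes :: MonadThrow m => ConduitT ByteString (Row ByteString) m ()
parseBytes = intoCSV defCSVSettings

-- ... and at a Text stream; only the type annotation changes.
parseText :: MonadThrow m => ConduitT Text (Row Text) m ()
parseText = intoCSV defCSVSettings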

For more documentation and examples, check out the README at:

http://github.com/dmvianna/csv-conduit



Flags

Manual Flags

Name        Description    Default
lib-werror                 Disabled

Use -f <flag> to enable a flag, or -f -<flag> to disable that flag.

Versions: 0.1, 0.2, 0.2.1.1, 0.3, 0.3.0.1, 0.3.0.2, 0.3.0.3, 0.4.1, 0.5.0, 0.5.1, 0.6.2, 0.6.2.1, 0.6.3, 0.6.5, 0.6.6, 0.6.7, 0.6.8, 0.6.8.1, 0.7.0.0, 0.7.1.0, 0.7.2.0, 0.7.3.0, 1.0.0.1, 1.0.1.0
Change log: changelog.md
Dependencies: array, attoparsec (>=0.10), base (>=4 && <5), blaze-builder, bytestring, conduit (>=1.3.0 && <2.0), conduit-extra, containers (>=0.3), data-default, exceptions (>=0.3), ordered-containers, primitive, resourcet (>=1.1.2.1), text, transformers, vector
Tested with: ghc ==8.2.2 || ==8.4.4 || ==8.6.5 || ==8.8.3 || ==8.8.4 || ==8.10.4 || ==9.0.1 || ==9.6.4
License: BSD-3-Clause
Author: Ozgun Ataman
Maintainer: Daniel Vianna <dmvianna@gmail.com>
Category: Data, Conduit, CSV, Text
Home page: http://github.com/dmvianna/csv-conduit
Source repo (head): git clone git://github.com/ozataman/csv-conduit.git
Uploaded: by dmvianna at 2024-10-10T00:52:24Z
Distributions: Debian 0.7.1.0, Stackage 1.0.1.0
Reverse dependencies: 5 direct, 1 indirect
Downloads: 20919 total (121 in the last 30 days)
Rating: (no votes yet)
Status: Docs uploaded by user
Build status: unknown (no reports yet)

Readme for csv-conduit-1.0.1.0


README


CSV Files and Haskell

CSV files are the de-facto standard in many cases of data transfer, particularly when dealing with enterprise applications or disparate database systems.

While there are a number of CSV libraries in Haskell, at the time of this project's start, there wasn't one that provided all of the following:

  • Full flexibility in quote characters, separators, input/output
  • Constant space operation
  • Robust parsing and error resiliency
  • Battle-tested reliability in real-world datasets
  • Fast operation
  • Convenient interface that supports a variety of use cases

Over time, other capable CSV packages such as cassava have appeared. The major benefits of this library remain:

  • Direct participation in the conduit ecosystem, which is now quite large, and all the benefits that come with it.
  • Flexibility in CSV format definition (see the sketch after this list).
  • Resiliency to errors in the input data.
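
As a sketch of that format flexibility: a tab-separated file quoted with single quotes can be described by overriding fields on defCSVSettings. The field names csvSep and csvQuoteChar below reflect the current CSVSettings record; check the Data.CSV.Conduit haddocks if your version differs.

import Data.CSV.Conduit

-- Settings for a hypothetical tab-separated file that quotes
-- fields with single quotes instead of double quotes.
tsvSettings :: CSVSettings
tsvSettings = defCSVSettings
  { csvSep       = '\t'
  , csvQuoteChar = Just '\''
  }

Any function in the library that takes a CSVSettings, such as intoCSV, fromCSV, or transformCSV, will then parse or render using those characters.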

This package

csv-conduit is a conduit-based CSV parsing library that is easy to use, flexible and fast. It leverages the conduit infrastructure to provide constant-space operation, which is critical in many real-world use cases.

For example, you can use http-conduit to download a CSV file from the internet and plug the response body Source into intoCSV to stream-convert the download into the Row data type, processing the rows as the data arrives rather than downloading the entire file to disk first.
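
Here is a minimal sketch of that pattern. It assumes the separate http-conduit package (its Network.HTTP.Simple module) and uses a placeholder URL; everything else is the csv-conduit API shown elsewhere in this README.

import Control.Monad.IO.Class (liftIO)
import Control.Monad.Trans.Resource (ResourceT)
import Data.ByteString (ByteString)
import Data.Conduit
import qualified Data.Conduit.List as CL
import Data.CSV.Conduit
import Network.HTTP.Simple (getResponseBody, httpSource, parseRequest)

-- Parse the incoming ByteString chunks into positional rows.
remoteRows :: ConduitT ByteString (Row ByteString) (ResourceT IO) ()
remoteRows = intoCSV defCSVSettings

-- Stream a remote CSV file and print each parsed row as it arrives,
-- without ever holding the whole file in memory.
printRemoteCSV :: IO ()
printRemoteCSV = do
  req <- parseRequest "https://example.com/data.csv"  -- placeholder URL
  runConduitRes $
    httpSource req getResponseBody
      .| remoteRows
      .| CL.mapM_ (liftIO . print)

Because httpSource and intoCSV are both ordinary conduits, they compose with .| and the whole pipeline runs in constant space.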

Author & Contributors

  • Ozgun Ataman (@ozataman)
  • Daniel Bergey (@bergey)
  • BJTerry (@BJTerry)
  • Mike Craig (@mkscrg)
  • Daniel Corson (@dancor)
  • Dmitry Dzhus (@dzhus)
  • Niklas Hambüchen (@nh2)
  • Facundo Domínguez (@facundominguez)
  • Daniel Vianna (@dmvianna)

Introduction

  • The CSV typeclass implements the key operations.
  • CSV is parameterized on both a stream type and a target CSV row type.
  • There are two basic row types and they implement exactly the same operations, so you can choose the right one for the job at hand (see the sketch after this list):
    • type MapRow t = Map t t
    • type Row t = [t]
  • You use the conduits defined in this library to parse from a CSV stream and to render back into a CSV stream.
  • Use the full flexibility and modularity of conduits for sources and sinks.
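
Here is a minimal sketch of the two row types side by side. The file name is a placeholder; both pipelines use the same intoCSV conduit, and only the requested row type changes.

import Control.Monad.Trans.Resource (runResourceT)
import Data.ByteString (ByteString)
import Data.Conduit
import Data.Conduit.Binary (sourceFile)
import qualified Data.Conduit.List as CL
import Data.CSV.Conduit

-- Positional rows: each record is a plain list of fields.
positional :: IO [Row ByteString]
positional = runResourceT . runConduit $
  sourceFile "input.csv" .| intoCSV defCSVSettings .| CL.consume

-- Header-keyed rows: the first record is taken as the header and
-- every following record becomes a Map from column name to value.
keyed :: IO [MapRow ByteString]
keyed = runResourceT . runConduit $
  sourceFile "input.csv" .| intoCSV defCSVSettings .| CL.consume

Both run in constant space up to the final CL.consume, which accumulates the rows into a list; in a real pipeline you would keep streaming instead.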

Speed

While fast operation is of concern, I have so far cared more about correct operation and a flexible API. Please let me know if you notice any performance regressions or optimization opportunities.

Usage Examples

Example #1: Basics Using Convenience API

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.Trans.Resource (runResourceT)
import Data.Conduit
import Data.Conduit.Binary (sinkFile, sourceFile)
import qualified Data.Conduit.List as CL
import Data.CSV.Conduit
import Data.Text (Text)

-- Just reverse the order of the columns in every row.
myProcessor :: Monad m => ConduitT (Row Text) (Row Text) m ()
myProcessor = CL.map reverse

test :: IO ()
test = runResourceT $
  transformCSV defCSVSettings
               (sourceFile "input.csv")
               myProcessor
               (sinkFile "output.csv")

Example #2: Basics Using Conduit API

{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.Trans.Resource (runResourceT)
import Data.Conduit
import Data.Conduit.Binary (sinkFile, sourceFile)
import Data.CSV.Conduit
import Data.Text (Text)

-- A pass-through processor: re-emit every row unchanged.
myProcessor :: Monad m => ConduitT (Row Text) (Row Text) m ()
myProcessor = awaitForever yield

-- Let's simply stream from a file, parse the CSV, reserialize it
-- and push back into another file.
test :: IO ()
test = runResourceT $ runConduit $
  sourceFile "test/BigFile.csv" .|
  intoCSV defCSVSettings .|
  myProcessor .|
  fromCSV defCSVSettings .|
  sinkFile "test/BigFileOut.csv"