llm-with-context: Typified interactions with LLMs

[ ai, library, mit ]

Using Proxy and StateT, we manage a typed conversation with conversation history. We can also perform more customized queries of the history to include in the next response.
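As a rough illustration of the idea (a minimal sketch, not the package's actual API: the `GPT4` tag, `Message` type, and `ask` function are hypothetical), a `Proxy` can carry a type-level model tag while `StateT` threads the message history through the conversation:

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
-- Sketch only: a Proxy tags the conversation with a model type,
-- and StateT accumulates the history between turns.
import Control.Monad.Trans.State (StateT, evalStateT, gets, modify)
import Data.Proxy (Proxy (..))

-- Hypothetical type-level model tag.
data GPT4

data Message = Message { role :: String, content :: String }
  deriving Show

type Conversation model = StateT [Message] IO

-- Record the user prompt, "call" the model (stubbed here with a
-- placeholder instead of a real HTTP request), and keep the reply
-- in history so later turns can query it.
ask :: Proxy model -> String -> Conversation model String
ask _ prompt = do
  modify (++ [Message "user" prompt])
  n <- gets length  -- a customized query over the history
  let reply = "reply (history so far: " ++ show n ++ " messages)"
  modify (++ [Message "assistant" reply])
  pure reply

main :: IO ()
main = do
  r <- evalStateT (ask (Proxy :: Proxy GPT4) "Hello") []
  putStrLn r
```

The `gets length` call stands in for any custom fold over the history; a real client would serialize the accumulated messages into the request body for the next turn.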

Downloads

Note: This package has metadata revisions in the cabal description newer than included in the tarball. To unpack the package including the revisions, use 'cabal get'.

Versions 0.1.0.0
Change log CHANGELOG.md
Dependencies aeson (>=2.2.3 && <2.3), base (>=4.19.0 && <4.21), bytestring (>=0.12.2 && <0.13), containers (>=0.7 && <0.8), data-default (>=0.8.0 && <0.9), directory (>=1.3.8 && <1.4), http-client (>=0.7.19 && <0.8), http-client-tls (>=0.3.6 && <0.4), http-types (>=0.12.4 && <0.13), parsec (>=3.1.18 && <3.2), scrappy-core (>=0.1.0 && <0.2), text (>=2.1.3 && <2.2), transformers (>=0.6.1 && <0.7)
License MIT
Author lazyLambda
Maintainer galen.sprout@gmail.com
Uploaded by lazyLambda at 2026-02-08T21:08:39Z
Revised Revision 2 made by lazyLambda at 2026-02-09T17:07:46Z
Category AI
Source repo head: git clone https://github.com/augyg/llm-with-context
Downloads 2 total (2 in the last 30 days)
Rating (no votes yet) [estimated by Bayesian average]
Status Docs available [build log]
Last success reported on 2026-02-09 [all 4 reports]