lda: Online Latent Dirichlet Allocation

[ bsd3, library, natural-language-processing ]

Online Gibbs sampler for Latent Dirichlet Allocation (LDA). LDA is a generative admixture model frequently used for topic modeling, among other applications. This implementation is aimed primarily at probabilistic soft word class induction, and the sampler can be run in either online or batch mode.
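At the heart of such a sampler is the collapsed Gibbs update for a single word token: a topic is redrawn with probability proportional to (n_kw + beta) / (n_k + V*beta) * (n_dk + alpha), where the counts are the current topic/word, topic, and document/topic totals with the token itself excluded. The sketch below illustrates that step in plain Haskell; it is not this package's API, and all names (topicWeight, sampleTopic, the toy counts) are hypothetical.

    -- Minimal, self-contained sketch of the collapsed Gibbs update for LDA.
    -- Illustrative only: it does not use this package's API, and it uses the
    -- plain `random` package rather than random-fu for brevity.
    import qualified Data.Map.Strict as M
    import System.Random (randomRIO)

    -- Unnormalised probability of assigning topic k to word w in document d,
    -- given the current counts (with the token itself excluded):
    --   p(z = k | rest) ∝ (n_kw + beta) / (n_k + V*beta) * (n_dk + alpha)
    topicWeight
      :: Double -> Double -> Int      -- alpha, beta, vocabulary size V
      -> M.Map (Int, Int) Double      -- n_kw: topic/word counts
      -> M.Map Int Double             -- n_k : topic totals
      -> M.Map (Int, Int) Double      -- n_dk: document/topic counts
      -> Int -> Int -> Int            -- document d, word w, topic k
      -> Double
    topicWeight alpha beta vocab nkw nk ndk d w k =
      (M.findWithDefault 0 (k, w) nkw + beta)
        / (M.findWithDefault 0 k nk + fromIntegral vocab * beta)
        * (M.findWithDefault 0 (d, k) ndk + alpha)

    -- Draw a topic index with probability proportional to the given weights.
    sampleTopic :: [Double] -> IO Int
    sampleTopic ws = do
      r <- randomRIO (0, sum ws)
      let go i acc (x : xs)
            | acc + x >= r = i
            | otherwise    = go (i + 1) (acc + x) xs
          go i _ [] = i - 1           -- only reached through rounding error
      pure (go 0 0 ws)

    main :: IO ()
    main = do
      -- toy counts for two topics over a three-word vocabulary (hypothetical)
      let nkw = M.fromList [((0,0),3),((0,1),1),((1,1),2),((1,2),4)]
          nk  = M.fromList [(0,4),(1,6)]
          ndk = M.fromList [((0,0),2),((0,1),3)]
          ws  = [ topicWeight 0.1 0.01 3 nkw nk ndk 0 1 k | k <- [0, 1] ]
      z <- sampleTopic ws
      putStrLn ("sampled topic: " ++ show z)

In batch mode this update is swept repeatedly over the whole corpus; in online mode new documents are folded into the counts as they arrive and only their tokens are resampled.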



Versions: 0.0.1, 0.0.2
Dependencies: base (>=3 && <5), containers (>=0.4), ghc-prim (>=0.2), mtl (>=2.0), random-fu (>=0.2.1.1), random-source (>=0.3.0.2), rvar (>=0.2), vector (>=0.9)
License: BSD-3-Clause
Author: Grzegorz Chrupała
Maintainer: pitekus@gmail.com
Category: Natural Language Processing
Home page: https://bitbucket.org/gchrupala/colada
Uploaded by: GrzegorzChrupala at 2012-02-29T18:09:09Z
Reverse Dependencies: 1 direct, 0 indirect
Downloads: 1801 total (7 in the last 30 days)

Readme for lda-0.0.2
