servant-seo: Generate robots.txt and sitemap.xml specifications for your Servant API.

[ bsd3, library, servant, web ]




Versions [RSS] 0.1.0, 0.1.1, 0.1.2
Change log
Dependencies aeson, base (>=4.7 && <5), binary, blaze-markup, bytestring, containers, http-media, lens (>=4.18.1), servant (>=0.16), servant-blaze, servant-server, text, warp, xml-conduit [details]
License BSD-3-Clause
Copyright Andrey Prokopenko (c) 2020
Author Andrey Prokopenko
Category Web, Servant
Source repo head: git clone
Uploaded by swamp_agr at 2020-07-15T21:14:57Z
Distributions NixOS:0.1.2
Downloads 452 total (9 in the last 30 days)
Rating (no votes yet) [estimated by Bayesian average]
Status Docs uploaded by user
Build status unknown [no reports yet]

Readme for servant-seo-0.1.2





  1. Add servant-seo to your project dependencies.

  2. Restrict parts of the API with the Disallow combinator to prevent robots from making requests to them:

```haskell
type ProtectedAPI = Disallow "admin" :> Get '[HTML] AdminPage

type StaticAPI = "cdn" :> Disallow "static" :> Raw
```

  3. Add Priority and Frequency combinators (optional step):

```haskell
type BlogAPI = "blog" :> Frequency 'Always :> Get '[HTML] BlogPage
```

  4. Extend your API with something like:

```haskell
main :: IO ()
main = Warp.runSettings Warp.defaultSettings $ serveWithSeo website api server
  where
    website = ""
```
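Putting the combinators together, the annotated sub-APIs from the steps above could be composed into a single API type along these lines (a sketch, not taken from the package docs; `AdminPage` and `BlogPage` stand for whatever page types your handlers return):

```haskell
import Data.Proxy (Proxy (..))
import Servant ((:<|>))
-- Disallow, Frequency, etc. are exported by Servant.Seo.

-- Sketch: compose the annotated sub-APIs into one API.
type API = ProtectedAPI   -- /admin, hidden from robots
      :<|> BlogAPI        -- /blog, crawled with frequency 'Always
      :<|> StaticAPI      -- /cdn/static, hidden from robots

api :: Proxy API
api = Proxy
```

This `api` proxy is what gets passed to `serveWithSeo` in the final step.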

You will have /robots.txt and /sitemap.xml automatically generated and ready to be served with your API.
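For illustration, with the `Disallow` routes above, the generated `/robots.txt` would plausibly look like the following (a sketch: the exact output depends on the library version, and the `Sitemap` line assumes `website` is set to `https://example.com` rather than the empty string):

```
User-agent: *
Disallow: /admin
Disallow: /cdn/static
Sitemap: https://example.com/sitemap.xml
```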

See the Servant.Seo module for a detailed description and example/Example.hs for a showcase.


This library would not have happened without these people. Thank you!

  • @fizruk and Servant contributors: as a source of inspiration.
  • @isovector: for the Thinking with Types book.