robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
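A minimal usage sketch (assuming the package is installed; the domain and paths are purely illustrative):

    library(robotstxt)

    # download and parse the robots.txt of a domain
    rt <- robotstxt(domain = "wikipedia.org")

    # check whether a given bot may access specific paths
    rt$check(paths = c("/", "/wiki/"), bot = "*")

    # convenience wrapper: download, parse, and permission check in one call
    paths_allowed(paths = "/wiki/", domain = "wikipedia.org", bot = "*")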

Version: 0.6.0
Depends: R (≥ 3.0.0)
Imports: stringr (≥ 1.0.0), httr (≥ 1.0.0), spiderbar (≥ 0.2.0), future (≥ 1.6.2), magrittr, utils
Suggests: knitr, rmarkdown, dplyr, testthat, covr
Published: 2018-02-11
Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Maintainer: Peter Meissner <retep.meissner at>
License: MIT + file LICENSE
NeedsCompilation: no
Materials: README NEWS
CRAN checks: robotstxt results


Reference manual: robotstxt.pdf
Vignettes: using_robotstxt
Package source: robotstxt_0.6.0.tar.gz
Windows binaries: r-devel:, r-release:, r-oldrel:
OS X El Capitan binaries: r-release: robotstxt_0.6.0.tgz
OS X Mavericks binaries: r-oldrel: robotstxt_0.5.2.tgz
Old sources: robotstxt archive

Reverse dependencies:

Reverse imports: seoR
Reverse suggests: spiderbar


Please use the canonical form https://CRAN.R-project.org/package=robotstxt to link to this page.