libwww-robotrules-perl

database of robots.txt-derived permissions

Description

WWW::RobotRules parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site.

The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
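A minimal usage sketch follows; the robot name and the example.com/example.org hosts are placeholders, and LWP::Simple is used here only to fetch the files:

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Identify the robot by the User-Agent name it sends.
    my $rules = WWW::RobotRules->new('MyRobot/1.0');

    # Fetch and parse the robots.txt of each host we plan to visit.
    for my $host ('http://example.com', 'http://example.org') {
        my $url = "$host/robots.txt";
        my $robots_txt = get($url);
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    # Before fetching any page, ask the object whether access is allowed.
    my $page = 'http://example.com/private/index.html';
    if ($rules->allowed($page)) {
        my $content = get($page);
        # ... process $content ...
    }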

Homepage

https://metacpan.org/release/WWW-RobotRules


Statistics

The newest known version of this software is 6.02-1.