parsero

Audit tool for a site's robots.txt

Description

Parsero is a free script written in Python which reads the robots.txt file of a web server over the network and looks at its Disallow entries. Disallow entries tell search engines which directories or files hosted on the web server must not be indexed. For example, "Disallow: /portal/login" means that the content at www.example.com/portal/login is not to be indexed by crawlers such as Google, Bing, or Yahoo. This is how administrators keep sensitive or private information out of the search engines, and it is exactly why those entries are interesting to an auditor: the listed paths may point to resources that are still reachable over HTTP.
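To illustrate the idea, here is a minimal Python sketch of the same technique (not Parsero's actual code): fetch robots.txt, extract the Disallow paths, and check each one's HTTP status. The host www.example.com is a placeholder.

    # Minimal sketch of the idea behind Parsero (not the tool itself):
    # fetch robots.txt, collect Disallow paths, and report whether each
    # one actually responds over HTTP.
    from urllib.parse import urljoin
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    BASE = "http://www.example.com"  # placeholder target

    def disallowed_paths(base):
        """Yield the path of every Disallow entry in base/robots.txt."""
        with urlopen(urljoin(base, "/robots.txt"), timeout=10) as resp:
            for raw in resp.read().decode("utf-8", errors="replace").splitlines():
                line = raw.split("#", 1)[0].strip()   # drop comments
                if line.lower().startswith("disallow:"):
                    path = line.split(":", 1)[1].strip()
                    if path:                          # a bare "Disallow:" allows everything
                        yield path                    # note: wildcards are not expanded here

    for path in disallowed_paths(BASE):
        url = urljoin(BASE, path)
        try:
            with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
                status = resp.status
        except HTTPError as err:
            status = err.code                         # 4xx/5xx still tells us something
        except URLError as err:
            status = f"unreachable ({err.reason})"
        print(f"{url} -> {status}")

A 200 response for a disallowed path means the resource is hidden from search engines but still publicly accessible, which is the kind of finding Parsero is designed to surface.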

Parsero is useful for penetration testers, ethical hackers, and forensic experts, and it can also be used in security assessments.
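A typical invocation looks like the following (flag names are taken from the project's help output and may differ between versions; verify with parsero -h):

    $ parsero -u www.example.com -sb

Here -u sets the site to analyze and -sb additionally searches Bing for already-indexed copies of the disallowed entries.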



Homepage

https://github.com/behindthefirewalls/Parsero


Install this software package

If the package is available for the distribution you are currently using on your computer then install the software by clicking on…

Install parsero