parsero

Audit tool for robots.txt of a site

Parsero is a free script written in Python which reads the robots.txt file of a web server over the network and examines its Disallow entries. The Disallow entries tell search engines which directories or files hosted on the web server must not be indexed. For example, "Disallow: /portal/login" means that the content at www.example.com/portal/login is not allowed to be indexed by crawlers such as Google, Bing, or Yahoo. This is how an administrator keeps sensitive or private information out of the search engines.
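
As a rough illustration (a minimal sketch, not parsero's actual implementation; the fetch_disallow_entries helper and the example URL are hypothetical), a few lines of Python are enough to collect these entries:

    import urllib.request

    def fetch_disallow_entries(base_url):
        """Fetch a site's robots.txt and return its Disallow paths."""
        with urllib.request.urlopen(base_url.rstrip("/") + "/robots.txt") as resp:
            body = resp.read().decode("utf-8", errors="replace")
        entries = []
        for line in body.splitlines():
            line = line.strip()
            # Match "Disallow:" case-insensitively, as clients commonly do
            if line.lower().startswith("disallow:"):
                path = line.split(":", 1)[1].strip()
                if path:
                    entries.append(path)
        return entries

    # Example: print every Disallow path a site declares
    for path in fetch_disallow_entries("https://www.example.com"):
        print(path)

An auditor would then check each returned path by hand (or by request) to see whether the "hidden" resource is actually reachable.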

libclang1-10

C interface to the Clang library

liblldb-10

Next generation, high-performance debugger, library

libllvm10

Modular compiler and toolchain technologies, runtime library

libomp5-10

LLVM OpenMP runtime

mailutils-comsatd

GNU mailutils-based comsatd daemon

GNU Mailutils is a rich and powerful protocol-independent mail framework. It contains a series of useful mail libraries, clients, and servers.