Powerful website checker

CrawlForMe validates more links than any other tool.

It analyzes all links found in HTML, JavaScript, and CSS files. It also checks images, PDFs, and any other documents.

It follows redirects and checks every link it encounters. It also detects links that use plain HTTP instead of HTTPS on secure websites.
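That mixed-content check can be sketched in a few lines of Python. This is a minimal illustration of the idea, not CrawlForMe's actual implementation, and the URLs are placeholders:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects href/src attribute values from a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def insecure_links(page_url, html):
    """Return absolute links that use plain HTTP on a page served over HTTPS."""
    parser = LinkCollector()
    parser.feed(html)
    absolute = (urljoin(page_url, link) for link in parser.links)
    return [url for url in absolute if urlparse(url).scheme == "http"]

html = '<a href="http://example.com/doc.pdf">doc</a><img src="/logo.png">'
print(insecure_links("https://example.com/", html))  # ['http://example.com/doc.pdf']
```

Relative links such as `/logo.png` resolve against the page's HTTPS origin and pass the check; only explicit `http://` references are flagged.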

You can schedule tasks and e-mail alerts, and receive regular reports straight to your inbox.

Highly configurable crawler

CrawlForMe allows you to fine-tune your crawl.

You may:

  • respect or ignore robots.txt,
  • define ignore patterns to exclude any kind of link,
  • define cookie values,
  • access secure parts of your websites,
  • fill in and submit forms.

It is also possible to customize the speed and the depth of the crawl.
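To illustrate the robots.txt option, here is a minimal sketch using Python's standard `urllib.robotparser`. It is an illustration of the mechanism only, not CrawlForMe's code; the rules and URLs are made up:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt, parsed from memory so the sketch runs offline.
rules = """
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A crawler that respects robots.txt skips disallowed URLs;
# one configured to ignore robots.txt would fetch them anyway.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
```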

Link validator

Our helpful correction assistant creates detailed lists of broken, ignored or validated links.

It spots broken links within source code.

When comparing reports, it highlights changes for ease of maintenance.

At any time, you may add new seeds and new websites to analyze.

Statistics and shared reports

You can access statistics on:

  • status distribution (successful, error, redirect)
  • resource types and sizes
  • HTTP code distribution (301, 302, 303, 404)
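A status distribution like the one above amounts to bucketing HTTP codes into the three categories. This sketch uses made-up data and is not the product's internals:

```python
from collections import Counter

def status_class(code):
    """Bucket an HTTP status code into the report's three categories."""
    if 300 <= code < 400:
        return "redirect"
    if code >= 400:
        return "error"
    return "successful"

# Hypothetical status codes gathered during a crawl.
codes = [200, 200, 301, 404, 302, 200, 500]
print(Counter(status_class(c) for c in codes))
# Counter({'successful': 3, 'redirect': 2, 'error': 2})
```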

Reports are generated online, and can of course be exported to CSV (Excel).

You can add your own logo.

You can share your reports amongst your team, and comment on them online.

If needed, you may ask our team for support.
