Link Issues
Finding broken links on your website is a time-consuming task. Vigilant automates this by periodically crawling your website and following every link, then reporting back any links with issues. Issues are, for example, 404 pages and pages that return a server error; any page that responds with a status code of 400 or higher is considered an issue.
To help you quickly identify where a bad link is coming from, Vigilant also stores the page it was found on.
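The two rules above can be sketched as a small check. The helper names here are hypothetical; Vigilant's actual implementation is not public:

```python
def is_issue(status_code: int) -> bool:
    """Any response with a status code of 400 or higher counts as an issue."""
    return status_code >= 400

# Record where each bad link was found so it can be traced back later.
issues = []

def record_link(url: str, found_on: str, status_code: int) -> None:
    if is_issue(status_code):
        issues.append({"url": url, "found_on": found_on, "status": status_code})

record_link("/missing", "/blog/post-1", 404)  # reported: 404 page
record_link("/about", "/", 200)               # fine: not reported
```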
Vigilant's Link Issues will send you a notification when:
The crawler has finished and has found issues
Vigilant's crawler will only crawl internal links.
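"Internal" is commonly taken to mean "resolves to the same host as the start URL", which can be sketched as follows. Exactly how Vigilant defines internal (for example, whether subdomains count) is an assumption here:

```python
from urllib.parse import urljoin, urlparse

def is_internal(link: str, base_url: str) -> bool:
    """A link is internal when it resolves to the same host as the start URL."""
    resolved = urljoin(base_url, link)  # handles relative links like "/pricing"
    return urlparse(resolved).netloc == urlparse(base_url).netloc

base = "https://example.com/"
print(is_internal("/pricing", base))                # relative links are internal
print(is_internal("https://example.com/docs", base))
print(is_internal("https://other.org/page", base))  # external: skipped
```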
Setting up a crawler
There are two ways to set up a crawler: through a website's settings or individually.
When setting up via a website, you can enable the crawler on the Link Issues tab of the site's edit page.
You can also set up an individual crawler by clicking Link Issues in the menu and then clicking the Add button in the top right.
Crawler settings
Start URL
Every crawler must have a start URL. This is where Vigilant sends its first request in order to discover additional links.
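Starting from that URL, the crawl can be pictured as a breadth-first walk: fetch a page, extract its links, and queue any pages not yet seen. This is a generic sketch, not Vigilant's code, with a pluggable `fetch` function standing in for real HTTP requests:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def crawl(start_url: str, fetch) -> list:
    """Breadth-first crawl: visit each page once, queue any new links found."""
    seen = {start_url}
    queue = deque([start_url])
    visited = []
    while queue:
        url = queue.popleft()
        visited.append(url)
        for href in extract_links(fetch(url)):
            absolute = urljoin(url, href)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return visited

# Tiny fake site: two pages linking to each other.
pages = {
    "https://example.com/": '<a href="/a">A</a>',
    "https://example.com/a": '<a href="/">Home</a>',
}
print(crawl("https://example.com/", pages.get))
```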
Sitemaps
You may add sitemaps to speed up the crawling process.
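A sitemap hands the crawler a list of URLs up front instead of leaving it to discover them page by page. The standard XML sitemap format (defined at sitemaps.org) can be read with the Python standard library; this is an illustrative sketch, not Vigilant's implementation:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text: str) -> list:
    """Return every <loc> entry from a <urlset> sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
</urlset>"""

print(urls_from_sitemap(sitemap))
```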
Schedule
Define when the crawler should run.
A note on rate limiting
Many websites rate-limit traffic. When Vigilant receives a 429 (Too Many Requests) response, it will stop crawling. Please whitelist Vigilant so the crawl can complete.
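The stop-on-429 behaviour can be sketched as follows (function names are hypothetical; only the rule itself comes from the documentation above):

```python
def should_continue(status_code: int) -> bool:
    """A 429 (Too Many Requests) response aborts the crawl entirely."""
    return status_code != 429

def crawl_until_rate_limited(responses):
    """Process (url, status) pairs in order; stop as soon as a 429 is seen."""
    processed = []
    for url, status in responses:
        if not should_continue(status):
            break  # rate limited: remaining pages are never checked
        processed.append((url, status))
    return processed

# The 404 is still recorded as an issue, but /b and /c go unchecked.
print(crawl_until_rate_limited([("/", 200), ("/a", 404), ("/b", 429), ("/c", 200)]))
```

This is why whitelisting matters: once the rate limiter answers with a 429, the rest of the site is not crawled at all.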