CMS Rate Limiting on Bots and Crawlers

Some CMS providers set rate limits on crawlers that can affect the Monsido scan process


Certain CMS providers implement a rate limit on all website bots and crawlers. This affects Monsido users because it slows our crawler’s ability to scan customer websites; any other web governance or accessibility software that scans websites is affected in the same way.


Known CMS Providers that Limit Bots and Crawlers

CMS providers currently known to apply rate limits include:

  • Shopify.

Recommended Solutions

We recommend the following actions:

  • Make sure the scan speed is set to slow.

See the relevant User Guide article for instructions.

  • Change the scan frequency to bi-weekly or monthly.

See the relevant User Guide article for instructions.

• Apply path constraints and/or excludes to reduce the scope of the scans. For product websites, you can often exclude variations of the same product and scan only the root product page (see the sketch after this list).

See the relevant User Guide article for instructions.

  • Split up the domain into multiple scans. Monsido CX can help determine if this is a good option, and can assist with the domain split.

Contact the Monsido support team if you think this option is right for you.

