Crawl Budget
The number of pages Googlebot will crawl on your site within a specific timeframe.
Crawl budget is the number of pages search engine crawlers, such as Googlebot, will visit on your website within a specific timeframe. For most small to medium care home websites, crawl budget is rarely a constraint, as Google has sufficient resources to crawl every page. However, it becomes critical for larger sites or those frequently updated with new content.
Efficient management of crawl budget ensures that Google spends its time on your most valuable and relevant pages. This is achieved by preventing crawlers from wasting resources on low-value URLs, such as duplicate content, broken links, or admin sections. A well-configured robots.txt file and a clean XML sitemap are the primary tools for directing Google's attention to the pages that drive enquiries.
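As an illustration, a minimal robots.txt might block low-value sections while pointing crawlers at the sitemap. The paths and domain below are hypothetical examples, not specific recommendations for any particular site:

```txt
# Example robots.txt (hypothetical paths and domain)
User-agent: *
Disallow: /wp-admin/     # admin section with no search value
Disallow: /search/       # internal search results create duplicate URLs

Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow prevents crawling, not necessarily indexing; pages blocked here can still appear in results if other sites link to them, so robots.txt is a crawl-budget tool rather than an indexing control.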
If Google encounters too many technical errors or slow-loading pages, it may reduce your site's crawl budget, meaning new or updated content takes longer to appear in search results. Monitoring crawl activity through Google Search Console's Crawl Stats report helps identify any bottlenecks. Maintaining a technically sound website ensures new pages and updates are discovered and indexed promptly by search engines.