Change Googlebot crawl rate
The term crawl rate refers to how many requests per second Googlebot makes to your site while crawling it: for example, 5 requests per second.
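To see the rate Googlebot is actually hitting your site, you can count its requests per second in your server's access log. Below is a minimal sketch, assuming a combined-format log at the hypothetical path /var/log/nginx/access.log; matching on the user-agent string alone is an approximation, since that string can be spoofed (see the verification sketch later in this article).

```python
import re
from collections import Counter

# Hypothetical log path; adjust for your server (an assumption, not from the article).
LOG_PATH = "/var/log/nginx/access.log"

# Combined log format: capture the [timestamp] and the final quoted user-agent.
LINE_RE = re.compile(r'\[(?P<ts>[^\]]+)\].*"(?P<ua>[^"]*)"$')

per_second = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group("ua"):
            # Timestamp like 10/Oct/2024:13:55:36 +0000 -- bucket by the second.
            per_second[m.group("ts").split()[0]] += 1

if per_second:
    peak_second, peak = per_second.most_common(1)[0]
    print(f"Peak Googlebot rate: {peak} requests/second at {peak_second}")
else:
    print("No Googlebot requests found.")
```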
Changing the crawl rate does not change how often Google crawls your site; if you want Google to crawl new or updated content on your site, use Fetch as Google instead.
Google has sophisticated algorithms that determine the optimal crawl speed for a site. Our goal is to crawl as many pages from your site as we can on each visit without overwhelming your server's bandwidth.
If Google is making too many requests per second to your site and slowing down your server, you can limit the crawl rate for root-level sites (for example, www.example.com and http://subdomain.example.com). The crawl rate that you set is the maximum rate that Googlebot is allowed to use; it does not guarantee that Googlebot will reach that maximum.
We recommend against limiting the crawl rate unless you are seeing server load problems that are definitely caused by Googlebot hitting your server too hard.
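Before limiting the rate, it is worth confirming that the traffic really comes from Googlebot, since any client can claim a Googlebot user-agent. Google documents a two-step check: a reverse DNS lookup on the client IP should resolve to a googlebot.com or google.com host, and a forward lookup on that host should return the original IP. A minimal sketch of that check:

```python
import socket

def is_googlebot(ip: str) -> bool:
    """Verify an IP belongs to Googlebot via reverse + forward DNS."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)          # reverse lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward lookup on the host must return the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

# Example: check an IP taken from your access log.
print(is_googlebot("66.249.66.1"))
```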
You cannot change the crawl rate for sites that are not at the root level, for example www.example.com/folder.
To limit the crawl rate:
- On the Search Console Home page, click the site that you want.
- Click the gear icon, then click Site Settings.
- In the Crawl rate section, select the option you want and then limit the crawl rate as desired.
The new crawl rate will be valid for 90 days.
If your crawl rate is described as "calculated as optimal" on the site settings page, the only way to reduce the crawl rate is by filing a special request. You cannot increase the crawl rate.
* Source: Google Search Console