Why Does the Fetch as Google Tool Display “An Error Has Occurred”?

Twice in the last few weeks, I submitted multiple links using the Fetch as Google tool to make sure certain pages were either added or updated in the Google index, and while doing so, I began receiving the dreaded message “An error has occurred. Please try again later.”

The Problem

In the past, Google allotted a quota of a certain number of index requests per month. But evidently the system has changed, and there is no longer any quota listed when hovering over the two crawl options:

There is no longer a crawl quota in the Fetch as Google tool.

It seemed like I was only able to get through about 10 index request submissions before I started getting the error message, so I switched to a different account to start over and keep count.

I was able to submit 9 index requests before I got the “An error occurred” message on the 10th request. A user on a Webmaster Central Help Forum post indicated nearly the same limit:

There is a global issue in Google search console with submitting URLs as you have described. Will let you submit 10 URLs (per day) then error message appears.

Note that you can submit some additional requests using the “Crawl this URL and its direct links” option even after hitting the limit with the “Crawl only this URL” option, so there must be a separate limit for each option.

I did try switching the crawl device from “Desktop” to “Mobile: Smartphone”, but it did not allow me to submit any additional requests.

According to the forum post linked above, users pointed out that the Fetch as Google tool was never meant to get a whole bunch of pages indexed by Google. Google has all the automated means in the world to crawl sites and discover new or updated content, so I can see the point. All the same, it’s still nice to immediately let Google know about a change I made and see it updated quickly on SERPs.

I guess what that boils down to is that we should never have gotten used to submitting so many links through that tool, if that’s truly not what Google intended it to be.

Solutions

So where does that leave us?

  1. You can still submit 9-10 URLs per 24-hour period and per crawling option (“Crawl only this URL” or “Crawl this URL and its direct links”), for a total of 18-20 requests per day.
  2. You can give additional users access to your property, and when they access the Fetch as Google tool, they can submit additional URLs via their Search Console account.
  3. If you have new URLs you want to let Google know about, make sure they are in your sitemap.xml file and resubmit it to Google (see the first sketch after this list).
  4. If you have a bunch of pages on your site that have changed and you want to let Google know about the changes, you can no longer submit the URLs individually, since you will quickly hit the 9-10 limit. A way around this is to create a separate web page with links to the updated pages, then submit an index request for that page using the “Crawl this URL and its direct links” option (see the second sketch after this list).
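
For option 3, resubmitting the sitemap doesn’t have to happen through the Search Console UI: Google also accepts a simple HTTP “ping” with the sitemap’s location. Here’s a minimal Python 3 sketch of that ping, using only the standard library; the sitemap URL is a placeholder you would replace with your own:

```python
from urllib.parse import quote
from urllib.request import urlopen

# Hypothetical sitemap location -- replace with your site's actual sitemap URL.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint takes the sitemap location as a query parameter.
ping_url = "https://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe="")

with urlopen(ping_url) as response:
    # A 200 response only means the ping was received; it does not guarantee
    # that every URL in the sitemap will be crawled or indexed right away.
    print(response.status, response.reason)
```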

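For option 4, the “hub” page can be generated automatically from whatever list of updated URLs you keep. The sketch below, with made-up example URLs and an assumed output filename, writes a bare-bones HTML page whose only purpose is to link to the changed pages; upload it somewhere on your site, then fetch it once with “Crawl this URL and its direct links”:

```python
from pathlib import Path

# Hypothetical list of recently updated pages -- replace with your own URLs.
updated_urls = [
    "https://www.example.com/first-updated-post/",
    "https://www.example.com/second-updated-post/",
    "https://www.example.com/third-updated-post/",
]

# Build a bare-bones page whose only job is to link to the updated URLs, so a
# single "Crawl this URL and its direct links" request covers all of them.
items = "\n".join(f'    <li><a href="{url}">{url}</a></li>' for url in updated_urls)
page = f"""<!DOCTYPE html>
<html>
<head><title>Recently updated pages</title></head>
<body>
  <ul>
{items}
  </ul>
</body>
</html>
"""

# Assumed output filename; publish it at a URL you can paste into Fetch as Google.
Path("recently-updated.html").write_text(page, encoding="utf-8")
print(f"Wrote hub page linking to {len(updated_urls)} updated URLs")
```
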
Your Turn!

Did you land here because you’ve received this error too? What solution did you use to get past this issue? Let’s discuss below in the comments!
