How to Get Google to Find a New Web Page URL on Your Website

Google has updated its URL submission process. According to the Google Webmaster Central Blog, URLs submitted with “Fetch as Googlebot” are crawled within 24 hours.

What is Fetch as Googlebot?

It’s Google’s new URL submission tool. Webmasters can now directly request that Google send the Googlebot to a submitted URL.

What is the Googlebot?
Googlebot is the name of Google’s search engine spider, which crawls the web and indexes the URLs it discovers. The Googlebot, or Gbot for short, will usually find your new web pages quickly, especially compared with the 30-day wait that was common a few years ago. Sometimes, however, your pages may not get indexed. If that happens, you may want to submit them manually with the Fetch as Googlebot tool.
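One way to tell whether Googlebot has already found a page is to look for it in your server’s access log: Googlebot identifies itself in the User-Agent header (e.g. “Googlebot/2.1”). Below is a minimal sketch in Python; the log lines and file layout are assumptions, so adjust the parsing for your own server’s log format.

```python
# A minimal sketch: scan web server access log lines for Googlebot visits,
# matched on the "Googlebot" token in the User-Agent field.
GOOGLEBOT_TOKEN = "Googlebot"

def googlebot_hits(log_lines):
    """Return the log lines that contain the Googlebot User-Agent token."""
    return [line for line in log_lines if GOOGLEBOT_TOKEN in line]

# Hypothetical sample log lines in a common combined-log style:
sample_log = [
    '66.249.66.1 - - [10/Aug/2011] "GET /new-page HTTP/1.1" 200 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '192.0.2.7 - - [10/Aug/2011] "GET / HTTP/1.1" 200 "-" "Mozilla/5.0"',
]

for hit in googlebot_hits(sample_log):
    # The first quoted field in this format is the request line.
    print(hit.split('"')[1])  # → GET /new-page HTTP/1.1
```

If a new page never shows up in lines like these, that is a good hint it has not been crawled yet and is worth submitting manually.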

How to Submit Web Page URLs to Google
First, log in to Google Webmaster Tools. Then follow these instructions from the Google Webmaster Central Blog:

First, use Diagnostics > Fetch As Googlebot to fetch the URL you want to submit to Google. If the URL is successfully fetched, you’ll see a new “Submit to index” link appear next to the fetched URL.

Once you click “Submit to index” you’ll see a dialog box that allows you to choose whether you want to submit only the one URL, or that URL and all its linked pages.

When submitting individual URLs, we have a maximum limit of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month.
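If you are bumping up against those limits, another way Google documented at the time to notify it of new URLs was the sitemap “ping”: a plain GET request to `http://www.google.com/ping` with your sitemap URL as a query parameter. The sketch below only builds that request URL (actually issuing the GET is left to you); the example sitemap address is hypothetical.

```python
from urllib.parse import urlencode

# Google's documented sitemap ping endpoint (at the time of writing).
PING_ENDPOINT = "http://www.google.com/ping"

def sitemap_ping_url(sitemap_url):
    """Build the ping URL for a sitemap, percent-encoding the sitemap address."""
    return PING_ENDPOINT + "?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("http://www.example.com/sitemap.xml"))
# → http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
```

Unlike Fetch as Googlebot, the ping only tells Google your sitemap changed; it does not guarantee a crawl within any particular window.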

Here’s an excellent video by Google engineer Matt Cutts that discusses how the Gbot spider works.
