
SEO Class Advice


petecube


Hi, I am currently working on a new store for a client. The store sits within their main web site, in a sub-directory named /store/. I have SEO-friendly URLs enabled, and I update the sitemap as I work on the store. At the moment there are no links from any of the main pages of the site to the store, so visitors cannot browse the store while work is in progress and run into broken or malformed pages.

 

The other day I noticed a link to the store had appeared in the search results, despite the fact that I have set a rule denying access to the /store/ directory in the main site's robots file. I then realised that each time I rebuild the store's sitemap, it must be pinging Google, providing an open invitation to crawl the directory.
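For reference, the deny rule in the main site's robots file looks something like this (the path is just the /store/ sub-directory mentioned above):

  User-agent: *
  Disallow: /store/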

 

When I checked the seo.class.php class I saw there is an if construct containing the logic to ping Google after building or rebuilding the sitemap:

 

  if (file_put_contents($filename, $mapdata)) {
      // Ping Google to announce the new/updated sitemap
      $request = new Request('www.google.com', '/webmasters/sitemaps/ping');
      $request->setMethod('get');
      $request->setData(array('sitemap' => $store_url.'/'.basename($filename)));
      $request->send();
      return true;
  }

 

So I am wondering: is it OK to remove the whole if construct, or could this cause a problem?

 

Pinging Google after building the sitemap for the first time, or after rebuilding it, is redundant: Google's bots will find the XML sitemap if it exists.

 

In the first instance, Google is constantly querying DNS servers around the planet to find fresh (meat) domains to crawl and index. When I place a holding page on a new domain along with a robots file and an XML sitemap, and then a few days later add the domain to my Webmaster Tools account, Google has already crawled the domain and indexed and cached the holding page.

 

Which clearly demonstrates that pinging Webmaster Tools is redundant and a waste of time. If the robots file and XML sitemap exist, and the robots file contains the URL for the sitemap along with any other instructions, Google's bots will do the right thing and there is no need to bug them. The bots crawl domains on average every 6 seconds, so it does not take long for them to notice the XML sitemap has changed or been updated.
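For completeness, the sitemap reference is just a Sitemap directive in the same robots file; the domain below is a placeholder:

  Sitemap: https://www.example.com/store/sitemap.xml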

 

Any advice will be much appreciated.


I removed the ping logic from the if() construct in the sitemap() function and it works fine. After reviewing the code within the function, I realised it was safe to do so, simply because the ping is walled off from the rest of the logic in sitemap(): nothing else depends on it, so it can be removed without causing any issues, provided the file_put_contents() call and the return true stay in place so the sitemap file is still written. The sitemap is rebuilt as expected.
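For anyone wanting to do the same, a minimal sketch of the trimmed block, assuming nothing else in sitemap() relied on the Request object:

  if (file_put_contents($filename, $mapdata)) {
      // Ping-to-Google lines removed; the sitemap file is still
      // written and success is still reported to the caller
      return true;
  }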

 

Problem solved, control regained...

