masterunix (posted May 21, 2014): Hello all CubeCart members, I have a question. By default, CubeCart places a robots.txt in the root folder that disallows indexing of the images folder. Can someone explain why? Is it a good idea to remove the robots.txt from the root folder of CubeCart? And do I need an image sitemap generator, or will Google index the images fine on its own? Thanks for your help.
bsmither (posted May 21, 2014): "Can someone explain why there is a robots.txt in the root folder that disallows indexing of the images folder?" I have no solid reason, but perhaps some store owners do not want their images picked up by search engines, or cannot allow it for copyright reasons. As for why Devellion makes this the default, again, I have no solid reason.

"Is it good to remove the robots.txt file?" I would say it is better to keep the file and learn how to use it to maximize your store's visibility according to your business requirements.

"Do I need an image sitemap generator, or will search engines index the images fine?" If you are not disallowing access to those folders, search engines are quite capable of finding everything on your site, provided there is a link to it on a page they have already crawled.
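To illustrate the point about learning to use the file rather than deleting it, here is a minimal robots.txt sketch. The paths are examples only, not CubeCart's shipped defaults, and the sitemap URL is a placeholder; adjust both to your own store layout:

```
# Allow all crawlers by default
User-agent: *
# Keep private areas out of the index (example paths, not CubeCart defaults)
Disallow: /admin/
Disallow: /cache/
# Explicitly permit the images folder so product images can be indexed
Allow: /images/
# Point crawlers at your sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

The key idea is that robots.txt is a per-path allow/deny list, so you can block sensitive folders while still leaving your product images crawlable.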
havenswift-hosting (posted August 28, 2014): Hi. Unfortunately the robots.txt file is part of the distribution, so if you change it, it gets overwritten when you upgrade! Al is aware of this and has said he will remove it in a future release.

You should definitely have a robots.txt file, and it should be changed from the current default one! The images directory (at least images/source) is one that should be accessible, or you are preventing Google and other search engines from indexing your images!

I would strongly recommend having an image sitemap, along with a video sitemap and a news sitemap (if your site is authorised by Google for this). Each of these should then be submitted via your Google (and Bing) webmaster accounts. If this weren't important to Google, they wouldn't encourage you to submit sitemaps via their webmaster tools; in fact, Google have said that having a valid, up-to-date sitemap helps their crawler index the site more quickly and easily. Anything that makes it easier for Google to index your site has got to be good, and there have been experiments showing that more pages get indexed, and more quickly, when you have a valid sitemap. This doesn't mean that Google won't eventually find all the pages of your site, just that it may take a lot longer! We resell what is probably the best sitemap generation software, which does all of the above and much more! Ian
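For anyone who wants to see what an image sitemap actually contains, here is a minimal sketch that builds one with Python's standard library. It is not CubeCart code and not the resold software mentioned above; the function name and the URLs are illustrative placeholders, and the namespaces are the ones Google documents for image sitemaps:

```python
import xml.etree.ElementTree as ET

def build_image_sitemap(pages):
    """Build an image sitemap as an XML string.

    `pages` maps a page URL to the list of image URLs shown on that page.
    """
    ns_sitemap = "http://www.sitemaps.org/schemas/sitemap/0.9"
    ns_image = "http://www.google.com/schemas/sitemap-image/1.1"
    # Serialize sitemap elements without a prefix, image elements as image:*
    ET.register_namespace("", ns_sitemap)
    ET.register_namespace("image", ns_image)

    urlset = ET.Element("{%s}urlset" % ns_sitemap)
    for page_url, image_urls in pages.items():
        url_el = ET.SubElement(urlset, "{%s}url" % ns_sitemap)
        ET.SubElement(url_el, "{%s}loc" % ns_sitemap).text = page_url
        for img in image_urls:
            img_el = ET.SubElement(url_el, "{%s}image" % ns_image)
            ET.SubElement(img_el, "{%s}loc" % ns_image).text = img
    return ET.tostring(urlset, encoding="unicode")
```

You would write the returned string to a file such as image-sitemap.xml in the web root and submit that URL through the Google and Bing webmaster tools, as Ian describes.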