Hello, I am running version 5.1.1 and am currently working on my site's overall search engine discoverability. Looking at the robots.txt analysis in Google Webmaster Tools, I see some crawl errors that I don't understand. I am using the robots.txt file that CubeCart generates automatically, as I assumed this was the most appropriate configuration for the site. Analysis results:

Line 1: Sitemap: sitemap.xml.gz - Invalid sitemap URL detected; syntax not understood
Line 2: Sitemap: sitemap.xml - Invalid sitemap URL detected; syntax not understood
Line 4: Disallow: cache/ - No user-agent specified
Line 5: Disallow: images/ - No user-agent specified

Disallowing the cache makes sense, as those pages change frequently, but I am wondering what the reasoning is behind blocking images? I also can't figure out why the sitemap syntax is not understood; the path to the file seems fine, since I can access the sitemap via my browser. Can anyone throw some light on this? Thanks in advance.
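For reference, the two warning types in the report correspond to two rules in the robots.txt format: the Sitemap: directive must be an absolute URL (a relative path like sitemap.xml is not understood), and every Disallow: line must sit under a User-agent: line. A corrected file might look something like the sketch below, where https://www.example.com is a placeholder for the store's own domain (not taken from the original post):

```
User-agent: *
Disallow: /cache/
Disallow: /images/

Sitemap: https://www.example.com/sitemap.xml.gz
Sitemap: https://www.example.com/sitemap.xml
```

Note the leading slash on the Disallow paths, which anchors them to the site root; whether blocking /images/ is desirable is a separate question, since it prevents product images from appearing in image search.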