Hello, I am running version 5.1.1, and I am currently working on my site's overall search engine discoverability. Looking at the robots.txt analysis in Google Webmaster Tools, I see some crawl errors that I don't understand. I am using the robots.txt file that CubeCart generates automatically, as I assumed this was the most appropriate configuration for the site.

Analysis results:

    Line 1: Sitemap: sitemap.xml.gz
    Invalid sitemap URL detected; syntax not understood
    Line 2: Sitemap: sitemap.xml
    Invalid sitemap URL detected; syntax not understood
    Line 4: Disallow: cache/
    No user-agent specified
    Line 5: Disallow: images/
    No user-agent specified

Disallowing the cache makes sense, as those pages change frequently, but I am wondering what the reasoning is behind blocking images. I also can't figure out why the sitemap syntax is not understood; the path to the file must be correct, since I can access the sitemap via my browser.

Can anyone throw some light on this? Thanks in advance.
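For reference, the two warning types map to standard robots.txt rules: a Sitemap: directive must give the fully-qualified URL of the sitemap rather than a relative path, and Disallow: lines only take effect inside a User-agent: group, with paths written from the site root. A minimal corrected sketch, assuming the store is served from https://www.example.com/ (a placeholder domain; substitute your own):

    # Rules applying to all crawlers
    User-agent: *
    # Paths are relative to the site root and must begin with /
    Disallow: /cache/
    Disallow: /images/

    # Sitemap URLs must be absolute
    Sitemap: https://www.example.com/sitemap.xml.gz
    Sitemap: https://www.example.com/sitemap.xml

With the sitemap lines made absolute and the disallow rules placed under a User-agent record, the analyser should no longer flag these lines.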