
chiggs2

Member · 9 posts


  1. We can stop the attack, no problem, but by blocking the spoofed Googlebot we are also blocking the real one, so the website is now suffering a ranking drop; in fact it's more of an exclusion at the moment. If anyone knows how to block the spoofed Googlebots while allowing the real one through to spider the site again, please let me know.
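     A commonly suggested pattern for this is below, sketched as a hedged starting point rather than a drop-in fix. Note two things: RewriteCond lines AND together by default, so the negated IP checks must NOT be joined with [OR] (an [OR]-chain of negations matches every address, including Google's), and dots should be escaped so they match literally. The IP prefixes are illustrative only; Google's ranges change, and their recommended verification is reverse DNS, not an IP list.

     #Fake Googlebot: forbid anyone claiming the Googlebot user agent
     #unless the request comes from a known Google crawler prefix.
     #Conditions AND together (no [OR]), so all checks must pass to block.
     RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
     RewriteCond %{REMOTE_ADDR} !^66\.249\.
     RewriteCond %{REMOTE_ADDR} !^64\.233\.
     RewriteCond %{REMOTE_ADDR} !^216\.239\.
     RewriteRule .* - [F]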
  2. UPDATE: Modified the Googlebot code in the .htaccess file to the following, without any SEO rewrite code in:

     #Fake Googlebot
     RewriteCond %{HTTP_USER_AGENT} Googlebot
     RewriteCond %{REMOTE_ADDR} !^64.233. [OR]
     RewriteCond %{REMOTE_ADDR} !^66.102. [OR]
     RewriteCond %{REMOTE_ADDR} !^66.249. [OR]
     RewriteCond %{REMOTE_ADDR} !^72.14. [OR]
     RewriteCond %{REMOTE_ADDR} !^74.125. [OR]
     RewriteCond %{REMOTE_ADDR} !^209.85. [OR]
     RewriteCond %{REMOTE_ADDR} !^216.239.
     RewriteRule .* - [F]

     Tested this on a new install and Google spiders the site OK, but it doesn't spider the site with the original problem. The host thinks there is a problem in the CubeCart code: "If this is working on another site it would suggest something is not right with the script on account as this is the only other thing that could be causing a problem. The code is not blocking genuine google requests which means the backend script with the redirect issue is causing the fault." When I test the sitemap in Webmaster Tools I get:

     General HTTP error: HTTP 403 error (Forbidden)
     HTTP Error: 403
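     One thing worth double-checking in the rule set above is the [OR] flags: a chain of negated prefix checks joined with [OR] is true for every address, because any IP fails to start with at least one of the other prefixes — which would 403 the real Googlebot too. A small Python sketch of the boolean logic (the prefixes come from the rules above; the function names are mine, purely for illustration):

     ```python
     # Model the RewriteCond chain from the .htaccess rules above.
     # Each condition is "remote address does NOT start with this prefix".
     PREFIXES = ["64.233.", "66.102.", "66.249.", "72.14.",
                 "74.125.", "209.85.", "216.239."]

     def blocked_with_or(ip):
         """[OR]-chained conditions: block if ANY negated check passes."""
         return any(not ip.startswith(p) for p in PREFIXES)

     def blocked_with_and(ip):
         """AND-chained (Apache's default, no [OR]): block only if ALL
         checks pass, i.e. the address matches none of the prefixes."""
         return all(not ip.startswith(p) for p in PREFIXES)

     # A genuine Googlebot address is still caught by the [OR] version,
     # because it fails to start with at least one of the other prefixes.
     print(blocked_with_or("66.249.66.1"))    # True  -> real bot gets 403
     print(blocked_with_and("66.249.66.1"))   # False -> real bot allowed
     print(blocked_with_and("123.185.5.39"))  # True  -> spoofed bot blocked
     ```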
  3. Thanks for the replies. The fake Googlebots were identified by the following:

     123.185.5.39 - - [06/Dec/2013:12:15:02 +0000] "GET / HTTP/1.0" 500 - "http://www.ourwebsite.com" "Mozilla/5.0 (compatible; Googlebot/2.0; +http://www.google.com/bot.html)"
     115.58.71.181 - - [06/Dec/2013:12:15:02 +0000] "GET / HTTP/1.1" 500 - "http://www.ourwebsite.com" "Mozilla/5.0 (compatible; Googlebot/2.0; +http://www.google.com/bot.html)"
     42.184.5.150 - - [06/Dec/2013:12:15:02 +0000] "GET / HTTP/1.1" 500 - "http://www.ourwebsite.com" "Mozilla/5.0 (compatible; Googlebot/2.0; +http://www.google.com/bot.html)"
     222.142.191.1 - - [06/Dec/2013:12:15:02 +0000] "GET / HTTP/1.1" 500 - "http://www.ourwebsite.com" "Mozilla/5.0 (compatible; Googlebot/2.0; +http://www.google.com/bot.html)"
     221.130.29.184 - - [06/Dec/2013:12:15:02 +0000] "GET / HTTP/1.0" 500 - "http://www.ourwebsite.com" "Mozilla/5.0 (compatible; Googlebot/2.0; +http://www.google.com/bot.html)"

     Our current .htaccess file is as follows:

     RewriteEngine On
     RewriteBase /
     RewriteCond %{HTTP_HOST} ^ourwebsite.com$
     RewriteRule ^(.*)$ http://www.ourwebsite.com/$1 [R=301,L]
     #RewriteCond %{HTTP_HOST} ^ourwebsite.com$
     #RewriteRule ^/?$ "http://www.ourwebsite.com/" [R=301,L]
     #SEO Rules
     RewriteCond %{REQUEST_FILENAME} !-f
     RewriteCond %{REQUEST_FILENAME} !-d
     RewriteCond %{REQUEST_URI} !=/favicon.ico
     RewriteRule ^(.*).html?$ index.php?seo_path=$1 [L,QSA]
     #Fake Googlebot
     RewriteCond %{HTTP_USER_AGENT} Googlebot
     RewriteCond %{REMOTE_ADDR} !^66.249. [OR]
     RewriteCond %{REMOTE_ADDR} !^216.229.
     RewriteRule .* - [F]
     <Files 403.shtml>
     order allow,deny
     allow from all
     </Files>

     This is blocking the spoofed bots but also appears to be blocking the real Googlebot. We have a VPS server and the support we are getting is pretty quick, but the problem of our site not being spidered persists.
     The hosting company have also identified this problem:

     [Sun Dec 08 07:48:45.292558 2013] [core:error] [pid 22888] [client 121.231.234.151:55566] AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace., referer: www.ourwebsite.com

     They suggest it may not be helping the situation. I'm out of my depth with this and have no idea what that can be. The fake bots are blocked; I just need to get Google spidering the site again.
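     For allowing the real crawler back through, Google's documented verification is not an IP list at all but a two-step DNS check: reverse-resolve the client IP, confirm the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal Python sketch of that check (the function names are mine; the DNS lookups need network access, so only the pure suffix helper is deterministic):

     ```python
     import socket

     # Crawler domains per Google's "Verifying Googlebot" guidance.
     GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

     def is_google_hostname(hostname):
         """Pure check: does a PTR hostname belong to Google's
         crawler domains? Trailing dots are tolerated."""
         return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

     def verify_googlebot(ip):
         """Reverse-resolve the IP, check the domain, then
         forward-resolve the hostname and confirm it maps back
         to the same IP. Needs network access; best-effort."""
         try:
             hostname, _, _ = socket.gethostbyaddr(ip)
         except OSError:
             return False
         if not is_google_hostname(hostname):
             return False
         try:
             return ip in socket.gethostbyname_ex(hostname)[2]
         except OSError:
             return False

     print(is_google_hostname("crawl-66-249-66-1.googlebot.com"))  # True
     print(is_google_hostname("fake.example.com"))                 # False
     ```

     The spoofed requests in the logs above would fail this check immediately: their IPs reverse-resolve to non-Google hostnames (or not at all), regardless of the user agent they claim.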
  4. Our website has had a disruption-of-service attack, which is still ongoing, from a spoofed Googlebot. We managed to stop the attack bringing the server down by blocking it, but now Google won't spider the website and we've been dumped from the search results. What I am looking for is a way to stop the spoofed Googlebot but still allow the real Googlebot access to the site to spider it. The fake bot is only attacking the homepage. I'm guessing this can be done in the .htaccess file, but as yet our hosting company haven't been able to come up with a solution. Hoping someone has some info, as this is obviously disastrous for our business, especially over the Christmas period, as we rely on Google for our traffic.
  5. Yes, very strange. I will have another look at it. If it works then it should solve the problem. Thanks for the post.
  6. I have tried setting up the categories but it still adds the VAT to the order. A decent shipping mod and a VAT mod should be standard kit in CubeCart by now; there must have been many requests for these features in the past. Most B2B websites that trade with the EU will need to exempt customers from VAT, and Magento and OpenCart have this facility. The mod works great on V3, I've been using it for several years now, yet it is still not built into V5.
  7. Thank you. I can see how that could possibly be a solution. There are several thousand products in the store, so setting these up manually would be a lot of work. I will play about with a dummy installation and see if I can find where the data is stored in the database; then I may be able to run a query to update all the products.
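     The query pattern being described would look something like the sketch below, using an in-memory SQLite stand-in. CubeCart actually runs on MySQL and its real table and column names differ — every identifier here ("products", "tax_class") is a placeholder, so check the actual schema in the dummy installation and back up the live database before running anything like this.

     ```python
     import sqlite3

     # Stand-in for the store database. The table and column names
     # below are HYPOTHETICAL, not CubeCart's real schema.
     db = sqlite3.connect(":memory:")
     db.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, tax_class TEXT)")
     db.executemany("INSERT INTO products (tax_class) VALUES (?)",
                    [("standard",)] * 3)

     # The bulk update: one statement instead of editing thousands of
     # products by hand in the admin interface.
     db.execute("UPDATE products SET tax_class = 'exempt' "
                "WHERE tax_class = 'standard'")
     db.commit()

     rows = db.execute("SELECT COUNT(*) FROM products "
                       "WHERE tax_class = 'exempt'").fetchone()
     print(rows[0])  # 3
     ```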
  8. Thanks for the reply. It is the customer that I need to exempt. If they are registered for VAT in their own country I have to exempt them from paying VAT here in the UK on any orders that they make. I have had a look and it is possible to group customers but I can't see any settings that will allow me to exempt them from paying VAT. If you know how I could do that it would be very helpful.
  9. I need to exempt some orders from charging VAT. I can do this with V3 as I have a mod to handle it. I would like to upgrade this store to V5 but can't as the VAT mod is really needed. Has anyone else come across this problem with V5? Does anyone know a solution?