Everything posted by Noodleman

  1. Take a backup of your database and all files before doing any upgrade.
  2. Unfortunately not, I've been super busy recently, as well as moving house/offices. It is still on my R&D roadmap, but I have more urgent customer projects to complete before I can get to it. At this time I can't estimate when I'll be able to carve out enough time for this module, but it WILL get done...
  3. Change your PHP version (CC3 won't work correctly on the newer PHP), upload the latest CubeCart files and then run the installer. It will upgrade.
  4. The product add-ons module can do this, but it may require some custom skin changes.
  5. It's a little better, but there's still lots of room to improve. For some of my customers we've actually thrown out the standard search and totally rewritten it to suit their specifications.
  6. The token is also added to all submitted forms by CubeCart's JavaScript. Sounds like the module itself will need an update to somehow work around this.
  7. Send them an invoice via your payment system and manually create a new order?
  8. Just get free ones from Let's Encrypt, unless your host doesn't support it; in which case I'd suggest moving hosting, as it would save you £££ over the year.
  9. Check that the cookie domain is valid in store settings; this ALWAYS screws up my local installs. Also check your web server error log (not the CubeCart one), there may be some clues. If it worked in Firefox but no other browsers, it's almost certainly cookie-domain related. Try adding the site to your list of trusted sites, which might bypass the cookie checks.
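
     If it helps, here's a minimal sketch to see exactly which cookie domain the server is sending back; the store URL is a placeholder, swap in your own:

         <?php
         // Fetch the storefront's response headers and print any Set-Cookie
         // lines; the domain= attribute should match the address you browse on.
         $headers = get_headers('https://www.example-store.com/', 1);
         if ($headers === false) {
             exit("Could not reach the store URL\n");
         }
         $cookies = isset($headers['Set-Cookie']) ? (array)$headers['Set-Cookie'] : array();
         foreach ($cookies as $cookie) {
             echo $cookie, "\n";
         }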
  10. An API will be provided by your POS system/service, so check with them first. Which POS are you using?
  11. Sounds like session data issues. Try changing the session path and confirm whether that helps. Also check that the cookie domain is valid.
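
      If you want to see what PHP is currently using, a quick throwaway script like this (standard PHP settings, nothing CubeCart specific) will show both values:

          <?php
          // Dump the session and cookie settings the store depends on; run it
          // on the same host/PHP version as the store.
          echo 'session.save_path:     ', ini_get('session.save_path'), "\n";
          echo 'session.cookie_domain: ', ini_get('session.cookie_domain'), "\n";
          // An empty save_path usually means the system temp dir is in use.
          $path = session_save_path() ?: sys_get_temp_dir();
          echo 'sessions stored in:    ', $path,
               ' (writable: ', var_export(is_writable($path), true), ")\n";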
  12. Afternoon, that rules out my initial thought. The module I mentioned may, under some rare scenarios, do something similar to what you have described. The recall of the cart (if in CubeCart_saved_cart) is reasonably simple; however, it depends on the customer's session data still existing on the server. Perhaps your web server is clearing session files/content on a schedule; most servers do it on a reboot, some on a schedule, etc. I think that may explain the issue.
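
      You can check how aggressive the session garbage collection is with a quick script like the one below (standard PHP settings; note that some distros, e.g. Debian/Ubuntu, disable the built-in GC and clean the session directory from a cron job instead):

          <?php
          // PHP's garbage collector deletes session files older than
          // gc_maxlifetime seconds, taking the saved cart's session data with
          // them; a low value here would explain carts "vanishing".
          echo 'session.gc_maxlifetime: ', ini_get('session.gc_maxlifetime'), " seconds\n";
          echo 'session.gc_probability: ', ini_get('session.gc_probability'), ' / ',
               ini_get('session.gc_divisor'), "\n";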
  13. That's how it is meant to work.
  14. Rebuild the sitemap, then download the compressed sitemap file from your store and validate its content. It's possible you hit a memory limit or timeout error when building the sitemap, so if the map is empty, cross-reference the PHP error logs for related information.
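
      Something like this rough sketch will do the validation; the store URL and the sitemap.xml.gz filename are placeholders, so adjust them to match your store:

          <?php
          // Pull the compressed sitemap, unpack it, and count the URL entries.
          $gz  = file_get_contents('https://www.example-store.com/sitemap.xml.gz');
          $raw = ($gz !== false) ? gzdecode($gz) : false;
          $xml = is_string($raw) ? simplexml_load_string($raw) : false;
          if ($xml === false) {
              echo "Sitemap missing, empty or malformed - check the PHP error logs.\n";
          } else {
              $xml->registerXPathNamespace('sm', 'http://www.sitemaps.org/schemas/sitemap/0.9');
              echo 'URLs in sitemap: ', count($xml->xpath('//sm:url')), "\n";
          }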
  15. Try this: https://www.cubecart.com/extensions/plugins/price-list-plugin
  16. UPDATE: forgot to actually push the "submit" button on this post yesterday... DOH.

      Reporting back: something is still not right for sure. I've been writing data to the log all day; the cache has been cleared at least twice, but the number of actual writes doesn't add up. In 9 hours, 377,000 new file writes. I am also seeing a lot of duplicate hashes being written. As an example: even if caching WAS working correctly and the cache was cleared, we should NOT have written the same item to cache 8,442 times since this morning. Most things in the log appear to be duplicated many thousands of times. Assuming the overwriting of cache is working correctly, this is wrong and will add to the increased IO of the file cache.

      Here is my log amendment for reference, the modified _writeCache function:

          // Log every cache write as "timestamp ### md5-of-query" before
          // handing the data to the cache backend.
          protected function _writeCache($data, $query) {
              $query_hash = md5($query);
              if (isset($GLOBALS['cache']) && is_object($GLOBALS['cache'])) {
                  $fp = fopen('query_log.txt', 'a+');
                  fwrite($fp, time() . " ### " . $query_hash . "\r\n");
                  fclose($fp);
                  return $GLOBALS['cache']->write($data, 'sql.' . $query_hash);
              }
              return false;
          }

      Maybe I did something wrong... but initial results suggest cache is being written more than it should. I'll need to check the write function to see if it does a check first; I won't have time until this evening.
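
      For anyone wanting to tally the duplicates themselves rather than eyeballing the log, here's a rough sketch against the "timestamp ### hash" lines written above (nothing CubeCart specific, just plain PHP):

          <?php
          // Count how many times each query hash appears in query_log.txt.
          $counts = array();
          foreach (file('query_log.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
              $parts = explode(' ### ', $line); // "timestamp ### md5-of-query"
              if (count($parts) === 2) {
                  $hash = trim($parts[1]);
                  $counts[$hash] = isset($counts[$hash]) ? $counts[$hash] + 1 : 1;
              }
          }
          arsort($counts); // worst offenders first
          foreach (array_slice($counts, 0, 10, true) as $hash => $n) {
              echo $hash, ' written ', $n, " times\n";
          }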
  17. I'll make some changes and report back: I'll move the logging location and also capture an MD5 hash of the query string.
  18. I'm still not 100% convinced this is the only issue, or whether I have simply vastly underestimated the amount of content the cache will generate.

      From checking the cache directory this evening, I can see almost 60,000 files. I've sorted these by date/time, and the earliest timestamp is 09:04 AM today, so we can conclude that the cache was cleared around that time.

      I've searched for duplicate queries by picking random lines from the log file and searching for the same string (thank you Notepad++ for being amazing), and I'm finding duplicates in the log with timestamps after the cache clear time. It's possible this is legit, but it raises the question: shouldn't this only be cached once? It's being re-cached. I'm assuming this is because the cached object expired and was therefore re-cached; however, in this situation, do the OLD files associated with the old cached object get removed when the new cached item is created?
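
      A quick sketch to double-check the counts; the cache/ path and "sql.*" filename prefix are assumptions based on the write key in the code above, so adjust for your install:

          <?php
          // Count the cache files and find the earliest write time.
          $files = glob('cache/sql.*');
          if (!$files) {
              echo "No cache files found - check the path.\n";
          } else {
              $times = array_map('filemtime', $files);
              echo count($files), " cache files, earliest write: ",
                   date('Y-m-d H:i', min($times)), "\n";
          }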
  19. That's purely for debug on my test instance.
  20. The cache got cleared by the admin today before I had a chance to review the overall totals. Since yesterday it managed to write 34 MB of data to the log file, which came to (rounded) 200,000 items written to cache. From a crude "pick a random line and search for it" technique, I can see that cached content is being duplicated, BUT I can't rule out that this was due to the clear cache button being used at the time. I'm going to need more time to monitor and review.