
Everything posted by Noodleman

  1. It's a little better, but still lots of room to improve. For some of my customers we've actually thrown out the standard search and totally rewritten it to suit their specifications.
  2. The token is also added to all submitted forms by CubeCart's JavaScript. Sounds like the module itself will need an update to work around this.
  3. Send them an invoice via your payment system and manually create a new order?
  4. Just get free ones from Let's Encrypt unless your host doesn't support it, in which case I'd suggest moving hosting as it would save you £££ over the year.
  5. Check the cookie domain is valid in store settings; this ALWAYS screws up my local installs. Also check your web server error log (not the CubeCart one), there may be some clues. If it worked in FF but no other browsers, it's pretty likely cookie domain related. Try adding the site to your list of trusted sites, which might bypass the cookie checks.
  6. An API will be provided by your POS system/service; check with them first. Which POS are you using?
  7. Sounds like session data issues. Try changing the session path and confirm. Also check the cookie domain is valid.
  8. Afternoon, that rules out my initial thought. The module I mentioned may, under some rare scenarios, do something similar to what you have described. The recall of the cart (if in CubeCart_saved_cart) is reasonably simple; however, it has a dependency on the customer session data still existing on the server. Perhaps your web server is clearing session files/content on a schedule. Most servers do it on a reboot, some on a schedule, etc. I think that may explain the issue.
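     If you want to confirm that, a quick standalone check along these lines (just a sketch; values will differ per host) shows where PHP keeps session data and how aggressively it garbage-collects it:

     <?php
     // Quick look at PHP's session housekeeping on this server - values are host/config dependent.
     echo 'Session save path: ' . (session_save_path() ?: ini_get('session.save_path')) . PHP_EOL;
     echo 'GC max lifetime (seconds): ' . ini_get('session.gc_maxlifetime') . PHP_EOL;
     echo 'GC probability: ' . ini_get('session.gc_probability') . '/' . ini_get('session.gc_divisor') . PHP_EOL;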
  9. That's how it is meant to work.
  10. Rebuild the sitemap, then download the compressed sitemap file from your store and validate its content. It's possible you hit a memory limit or timeout error when building the sitemap, so if the map is empty, cross-reference the PHP error logs for related information.
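     A rough way to sanity-check the downloaded file (just a sketch; the sitemap.xml.gz filename is an assumption, adjust it to whatever your store produces):

     <?php
     // Quick local check of a downloaded, compressed sitemap (filename assumed).
     $xml = gzdecode(file_get_contents('sitemap.xml.gz'));
     if ($xml === false || simplexml_load_string($xml) === false) {
         echo "Sitemap did not decompress/parse - likely truncated by a memory or timeout error.\n";
     } else {
         // Crude but namespace-agnostic count of <url> entries.
         echo "Sitemap contains " . substr_count($xml, '<url>') . " URL entries.\n";
     }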
  11. Try this: https://www.cubecart.com/extensions/plugins/price-list-plugin
  12. UPDATE: forgot to actually push the "submit" button on this post yesterday... DOH. Reporting back: something is still not right for sure. I've been writing data to the log all day; the cache has been cleared at least twice, but the number of actual writes doesn't add up. In 9 hours, 377,000 new file writes. I am also seeing a lot of duplicate hashes being written. Even if caching WAS working correctly and the cache was cleared, we should NOT have written the same item to cache 8,442 times since this morning. Most things in the log appear to be duplicated many thousands of times. Assuming the overwriting of cache is working correctly, this is wrong and will add to the increased IO for the file cache.

     Here is my log amendment for reference, the modified _writeCache function:

     protected function _writeCache($data, $query)
     {
         $query_hash = md5($query);
         if (isset($GLOBALS['cache']) && is_object($GLOBALS['cache'])) {
             // Append a timestamped line for every cache write so writes can be tallied later
             $fp = fopen('query_log.txt', 'a+');
             fwrite($fp, time() . " ### " . $query_hash . "\r\n");
             fclose($fp);
             return $GLOBALS['cache']->write($data, 'sql.' . $query_hash);
         }
         return false;
     }

     Maybe I did something wrong... but initial results suggest the cache is being written to more than it should be. I'll need to check the write function to see if it does a check first; I won't have time until this evening.
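     Something like this would make the duplicate counting easier than searching by hand (a rough sketch, based on the same "time ### hash" log format written above):

     <?php
     // Tally how many times each query hash appears in query_log.txt
     // (lines are "time ### hash", as written by the modified _writeCache above).
     $counts = array();
     foreach (file('query_log.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
         $hash = trim(substr($line, strpos($line, '###') + 3));
         $counts[$hash] = isset($counts[$hash]) ? $counts[$hash] + 1 : 1;
     }
     arsort($counts);
     // Show the ten most frequently re-written hashes.
     print_r(array_slice($counts, 0, 10, true));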
  13. I'll make some changes and report back. I'll move the logging location and also capture an MD5 hash of the query string.
  14. I'm still not 100% convinced this is the only issue, or I have simply vastly underestimated the amount of content the cache will generate. From checking the cache directory this evening, I can see almost 60,000 files. I've sorted these based on date/time and the earliest timestamp is 09:04 AM today, so we can conclude that the cache was cleared around that time. I've randomly searched for duplicate queries, simply by picking some lines from the log file at random and searching for the same string (thank you Notepad++ for being amazing), and I'm finding duplicates in the log with timestamps after the cache clear time. It's possible this is legitimate, but it raises the question: shouldn't this only be cached once? It's being re-cached. I'm assuming this is because the cached object expired and was therefore re-cached; however, in this situation do the OLD files associated with the old cached object get removed when the new cached item is created?
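     A small script along these lines (a sketch; the cache path is an assumption, point it at the store's actual /cache/ directory) gives the file count and the oldest/newest timestamps without sorting 60,000 files by hand:

     <?php
     // Count cache files and find the oldest/newest modification times.
     $dir = '/path/to/store/cache'; // assumption - adjust to the real cache directory
     $count = 0; $oldest = PHP_INT_MAX; $newest = 0;
     foreach (new DirectoryIterator($dir) as $file) {
         if ($file->isFile()) {
             $count++;
             $oldest = min($oldest, $file->getMTime());
             $newest = max($newest, $file->getMTime());
         }
     }
     echo $count . " files, oldest " . date('Y-m-d H:i:s', $oldest) . ", newest " . date('Y-m-d H:i:s', $newest) . "\n";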
  15. That's purely for debugging on my test instance.
  16. The cache got cleared by the admin today before I had a chance to review the overall totals. Since yesterday it managed to write 34MB of data to the log file, which comes to (rounded) 200,000 items written to cache. From a crude "pick a random line and search for it" technique, I can see that cached content is being duplicated, BUT I can't rule out that this was due to the clear cache button being used at that time. I'm going to need more time to monitor and review.
  17. The cache seems to have levelled off at around 35,000 objects at the moment; I'll keep an eye on it and report back later.
  18. That's definitely helped. I've also wrapped the log write in the same validation, and I'm seeing a lot less cache content being created. I'll monitor for a while and update later. Thanks, Al.
  19. Just set this up and I immediately see a problem coming from some modules: SQL queries which have specifically been run NOT to cache are being written to cache anyway.

     $GLOBALS['db']->query($sql, false, 0, false);

     Based on around 5 minutes of collecting data, I'm already seeing 3,500 of these non-cached queries being written to cache.
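     Purely to illustrate the behaviour I'd expect (a conceptual sketch, not the actual core code path): the no-cache flag passed to query() ought to short-circuit the write before it ever reaches the cache, along the lines of:

     // Conceptual sketch only - not the real CubeCart internals. The point is that the
     // flag set by query($sql, false, 0, false) should be checked before any cache write.
     if ($cache_this_query === false) {  // hypothetical flag carried through from query()
         return false;                   // never write no-cache queries to the cache
     }
     return $GLOBALS['cache']->write($data, 'sql.' . md5($query));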
  20. That seems reasonable, I'll probably go with a cache log. Would you mind giving a couple of pointers on the correct location for it? It would save me some trial and error.
  21. I'm sorting out the same problem this morning for somebody else. The cache has grown so large that CubeCart can't handle the deletion; it's throwing an out-of-memory error. To fix it, I've manually cleared the /cache/ directory from the server (I have full access to the server via SSH).
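     For anyone without SSH access, a throwaway script along these lines (a sketch; the cache path and the .cache filename filter are assumptions, check what your cache files actually look like first) removes files one at a time instead of building the whole listing in memory:

     <?php
     // Streaming cleanup sketch - deleting one file per iteration avoids the
     // out-of-memory problem of loading the entire directory listing at once.
     $dir = '/path/to/store/cache'; // assumption - adjust to the real cache directory
     $removed = 0;
     foreach (new DirectoryIterator($dir) as $file) {
         // Leave directories and placeholder files (index.html, .htaccess, etc.) alone.
         if ($file->isFile() && substr($file->getFilename(), -6) === '.cache') {
             unlink($file->getPathname());
             $removed++;
         }
     }
     echo "Removed " . $removed . " cache files\n";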