Noodleman last won the day on August 9

  1. Noodleman

    No longer able to log into admin

    Just get free certificates from Let's Encrypt, unless your host doesn't support it. In that case, I'd suggest moving hosting, as it would save you £££ over the year.
  2. Noodleman

    No longer able to log into admin

    Check that the cookie domain is valid in store settings; this ALWAYS screws up my local installs. Also, check your web server error log (not the CubeCart one), as there may be some clues. If it worked in Firefox but no other browsers, it's almost certainly cookie-domain related. Try adding the site to your list of trusted sites; that might bypass the cookie checks.
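    A quick way to see what cookie domain the store is actually sending is to inspect the Set-Cookie response header. This is a sketch: the header value below is a made-up example, and in practice you would capture the real one with `curl -sI` against your own store URL.

```shell
# Hypothetical Set-Cookie header, as you might capture via:
#   curl -sI https://yourstore.example/
hdr='Set-Cookie: PHPSESSID=abc123; path=/; domain=.yourstore.example; HttpOnly'

# Extract the domain attribute. If it does not match the host in your
# browser's address bar, the session cookie is rejected and admin login fails.
echo "$hdr" | grep -o 'domain=[^;]*'
```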
  3. Noodleman

    Find an API to link with a POS

    An API will be provided by your POS system/service, so check with them first. Which POS are you using?
  4. Noodleman

    What am I missing ???

    Sounds like a session data issue. Try changing the session path and confirm. Also check that the cookie domain is valid.
  5. Afternoon, that rules out my initial thought. The module I mentioned may, under some rare scenarios, do something similar to what you have described. Recalling the cart (if it's in CubeCart_saved_cart) is reasonably simple; however, it depends on the customer session data still existing on the server. Perhaps your web server is clearing session files/content on a schedule; most servers do it on a reboot, some on a timer, etc. I think that may explain the issue.
  6. Are you using the abandoned carts module?
  7. Noodleman

    What am I missing ???

    Have you tried clearing the cache?
  8. Noodleman

    Clear Cache Every-time

    That's how it is meant to work.
  9. Noodleman

    Sitemap issues

    Rebuild the sitemap, then download the compressed sitemap file from your store and validate its content. It's possible you hit a memory limit or timeout error when building the sitemap, so if the map is empty, cross-reference the PHP error logs for related information.
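    The downloaded archive can be validated with standard tools. This is a sketch: `sitemap.xml.gz` is a stand-in filename, and a tiny sample file is generated inline so the commands can be tried anywhere.

```shell
# Build a tiny stand-in sitemap archive for demonstration.
printf '<?xml version="1.0"?>\n<urlset><url><loc>https://example.com/</loc></url></urlset>\n' > sitemap.xml
gzip -f sitemap.xml

# 1) Check the archive itself isn't truncated (a timeout mid-build can leave a corrupt file).
gzip -t sitemap.xml.gz && echo "archive OK"

# 2) Count <loc> entries; 0 means the map is empty, so check the PHP error log.
gunzip -c sitemap.xml.gz | grep -c '<loc>'
```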
  10. Noodleman

    List ALL products link - is this possible?

    Try this: https://www.cubecart.com/extensions/plugins/price-list-plugin
  11. Noodleman

    inodes limit exceded

    UPDATE: forgot to actually push the "submit" button on this post yesterday... DOH.

    Reporting back: something is still not right, for sure. I've been writing data to the log all day; the cache has been cleared at least twice, but the number of actual writes doesn't add up. In 9 hours, there were 377,000 new file writes, and I am seeing a lot of duplicate hashes being written. Even if caching WAS working correctly and the cache was cleared, we should NOT have written the same item to cache 8,442 times since this morning. Most things in the log appear to be duplicated many thousands of times. Assuming the overwriting of cache works correctly, this is incorrect and will add to the increased IO for the file cache.

    Here is my log amendment for reference, the modified _writeCache function:

        protected function _writeCache($data, $query)
        {
            $query_hash = md5($query);
            if (isset($GLOBALS['cache']) && is_object($GLOBALS['cache'])) {
                // Append each cache write as "<timestamp> ### <query hash>" for auditing
                $fp = fopen('query_log.txt', 'a+');
                fwrite($fp, time() . " ### " . $query_hash . "\r\n");
                fclose($fp);
                return $GLOBALS['cache']->write($data, 'sql.' . $query_hash);
            }
            return false;
        }

    Maybe I did something wrong, but the initial results suggest cache is being written more than it should be. I'll need to check the write function to see if it does a check first; I won't have time until this evening.
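    Given the "timestamp ### hash" format written by that function, the duplicate writes can be tallied with standard tools. This is a sketch: the sample log lines are made up and inlined so the pipeline is runnable as-is.

```shell
# Sample log lines in the "<timestamp> ### <hash>" format from _writeCache.
printf '1691500000 ### aaa111\n1691500005 ### bbb222\n1691500010 ### aaa111\n' > query_log.txt

# Tally how many times each query hash was written to cache; a count far
# above 1 between cache clears indicates redundant writes.
awk '{print $3}' query_log.txt | sort | uniq -c | sort -rn
```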
  12. Noodleman

    inodes limit exceded

    I'll make some changes and report back: I'll move the logging location and also capture an MD5 hash of the query string.
  13. Noodleman

    inodes limit exceded

    I'm still not 100% convinced this is the only issue, or I may have simply vastly underestimated the amount of content the cache will generate. Checking the cache directory this evening, I can see almost 60,000 files. I've sorted these by date/time, and the earliest timestamp is 09:04 AM today, so we can conclude that the cache was cleared around that time. I've searched for duplicate queries simply by picking random lines from the log file and searching for the same string (thank you, Notepad++, for being amazing), and I'm finding duplicates in the log with timestamps after the cache clear time. It's possible this is legitimate, but it raises the question: shouldn't this only be cached once? It's being re-cached; I'm assuming this is because the cached object expired and was therefore re-cached. However, in that situation, do the OLD files associated with the old cached object get removed when the new cached item is created?
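    The "re-cached after the clear" check can be automated by filtering the log on the clear time first. This is a sketch: the epoch value for the 09:04 clear and the sample log lines are hypothetical.

```shell
# Sample "<timestamp> ### <hash>" lines straddling a cache clear.
printf '1691560000 ### aaa111\n1691571900 ### aaa111\n1691572000 ### aaa111\n' > query_log.txt

# Assumed epoch timestamp of the 09:04 AM cache clear (hypothetical value).
clear_time=1691571840

# If expiry/overwrite behaved, each hash should appear at most once after
# the clear; uniq -d prints only the hashes that repeat.
awk -v t="$clear_time" '$1 >= t {print $3}' query_log.txt | sort | uniq -d
```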
  14. Noodleman

    inodes limit exceded

    That's purely for debug on my test instance.
  15. Noodleman

    inodes limit exceded

    The cache got cleared by the admin today before I had a chance to review the overall totals. Since yesterday it managed to write 34 MB of data to the log file, which came to (rounded) 200,000 items written to cache. Using a crude "pick a random line and search for it" technique, I can see that cached content is being duplicated, BUT I can't rule out that this was due to the clear cache button being used at the time. I'm going to need more time to monitor and review.