Guest James Foster
Posted August 14, 2007

Okay, so here's my issue: I'm linking to my downloads using the root path: /home/mydomain/downloads/mydownloadablemovie.zip

I have a total of five download files there. SOME of them work fine with the generated link that customers receive, and SOME of them result in a page-not-found error. I've checked, double-checked, and re-checked all five to make sure the root path was entered correctly and that all file names are absolutely correct. Because some of them work, we know the root path is fine.

What in the WORLD am I missing? This is driving me crazy!
Guest James Foster
Posted August 14, 2007

More information: the files that work are around 300 MB (these are movies people are downloading); the files that don't work are all over 450 MB. Could the issue somehow be related to file size?
bsmither
Posted August 14, 2007

Sorry, I don't run Unix-style servers myself, so if the issue isn't related to the absolute correctness of the filenames (including case sensitivity), then I'm guessing there must be a web server setting that throttles or denies downloadable files based on size. (If I recall, there is such a setting in IIS6 servers, but I might not be recalling correctly.)

If you are the sysadmin for the server where your store is located, make an inquiry in the server support forums. If not, then contact your host and ask them whether there's any throttling going on.
Guest James Foster
Posted August 14, 2007

That's a good guess, and I briefly considered that myself, but I've had downloads of up to 600 MB on this server before using a different shopping cart system (osCommerce). So unless this is something new that they're doing and not telling anyone about, I don't think that's it.
Guest James Foster
Posted August 14, 2007

Even more information: checking things first thing this morning, and checking with customers, it seems that the links generated by CubeCart tend to work fine early in the day, but towards the end of the day people get "page can't be found" errors for the *exact same link*. What gives?
bsmither
Posted August 14, 2007

Again, have you contacted your host (if it's not you)? There may be something new that they are not telling you. If it works in the morning but not later in the day, maybe you are up against your (sliding) monthly payload (served bytes) limit. Just a wild guess.
bsmither
Posted August 15, 2007

In the file download.php, near line 44, begins the function that actually starts pumping the file out of the repository and down to the browser. If you are putting your files in a non-public space (which you are), line 62 is where the HTTP headers are built. Then the file is read into a buffer in 1 MB chunks and "printed" (presumably to the output stream).

In your server's php.ini file, there is a setting called max_execution_time for scripts. *Maybe* in the afternoon the server is so busy with other tasks that it takes too long to process large files within the time limit. Just a SWAG. Click Server Info on your admin page to see what your setting is.

Now, to experiment further, place a large file in a publicly accessible place and create an ftp: or http: link to it in the digital download field of the product. CubeCart doesn't 'manage' the delivery of the file in this case. See what happens.
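The chunked loop described above can be sketched roughly like this. To be clear, this is a simplified approximation and NOT CubeCart's actual download.php; the variable names are illustrative, and a dummy temp file stands in for the real movie so the sketch is self-contained:

```php
<?php
// Simplified sketch of the chunked-read download loop described above.
// Illustrative only -- the real download.php differs.
$filePath = tempnam(sys_get_temp_dir(), 'dl');
file_put_contents($filePath, str_repeat('x', 3 * 1048576)); // 3 MB dummy file

// In the real script, headers like these would be sent first:
//   header('Content-Type: application/octet-stream');
//   header('Content-Length: ' . filesize($filePath));

$chunkSize = 1048576; // 1 MB chunks, as noted above
$chunks = 0;
$handle = fopen($filePath, 'rb');
while (!feof($handle)) {
    $data = fread($handle, $chunkSize);
    if ($data === false || $data === '') {
        break;
    }
    // The real script would do: print $data;
    // Every fread/print cycle counts against max_execution_time,
    // which is why a 450 MB file can time out on a busy afternoon
    // while a 300 MB file squeaks through.
    $chunks++;
}
fclose($handle);
unlink($filePath);

echo $chunks, "\n"; // 3 chunks for a 3 MB file
```

If the time limit is the culprit, serving the file from a public location (as suggested above) bypasses this loop entirely, since the web server rather than PHP streams the bytes.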
Guest James Foster
Posted August 15, 2007

I think you've found the issue! It looks like 30 seconds is the max execution time. I've moved the files into a publicly accessible spot, changed the link in the digital download field, AND crossed my fingers! I'll report back to let you know how it goes. Thanks!
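For anyone who hits the same wall later: the other route is raising the limit instead of bypassing PHP. A hypothetical php.ini fragment (300 is an example value, not something recommended in this thread, and shared hosts may not let you change it):

```ini
; php.ini -- allow long-running download scripts more time.
; 300 seconds is an example value only.
max_execution_time = 300
```

Alternatively, if the host permits it, calling set_time_limit(0) near the top of download.php removes the limit for that one script without touching php.ini.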