tucstore (Author) Posted July 13, 2021
Moving forward, how do we solve this?
bsmither Posted July 13, 2021
I see, as a minimum, a few things that could be done to accommodate the outlier situation of wanting to offer super-huge downloadable files:
- eliminate the use of the md5_file() function, as it relies on reading the entire file from disk
- change the CubeCart_filemanager 'filesize' column to BIGINT(12) for all installations
- add a strong, detailed advisory at setup that there are practical limits on the number of products, file sizes, etc., imposed by the real-world constraints of the server environment
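A minimal sketch of the second point as a SQL statement. The table name assumes the default CubeCart_ prefix, and the UNSIGNED/NOT NULL details are assumptions, not something stated in this thread:

```sql
-- Hypothetical sketch: widen the filesize column past the 32-bit / ~4 GiB
-- ceiling so multi-gigabyte digital products can be recorded correctly.
-- Adjust the table prefix to match your installation before running.
ALTER TABLE `CubeCart_filemanager`
  MODIFY `filesize` BIGINT(12) UNSIGNED NOT NULL DEFAULT 0;
```

Back up the database before altering the schema, since a store upgrade may later re-apply CubeCart's stock column definition.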
tucstore (Author) Posted July 13, 2021
Files are getting bigger and bigger each day; it seems very 1997 to have a max of 4 GB. How do I fix this? The only other option is to upload it into the public files section, but that leaves me wide open to people hacking my site and taking all my digital files, and it also runs the risk of the link being shared worldwide so everyone gets a download, instead of one per person.
bsmither Posted July 13, 2021
We managed to work around the immediate obstacles (basically, the web server timeout). With CubeCart's recent ability to "stream" a product's digital file, there is now a reasonable use case for accommodating large files. A "Feature Request" can be made in CubeCart's GitHub asking to re-code CubeCart to do this.
bsmither Posted July 13, 2021
Allow me to ask, how long did it take you to FTP the 13GiB file to your site?
tucstore (Author) Posted July 13, 2021
I uploaded it via FTP with FileZilla; it took me 4 hours.
bsmither Posted July 14, 2021
CubeCart's /files/ folder does have a /public/ subfolder, which has an .htaccess file that allows the web server to deliver all requested files. But the /files/ folder itself has an .htaccess file that denies the web server permission to deliver any requested file (except for a couple of situations specific to CubeCart's needs).

If you haven't read it yet, please review the GitHub issue: https://github.com/cubecart/v6/issues/2898

The point about not calculating the hash can be put into effect in the file /classes/filemanager.class.php, near line 938.

From:
'md5hash' => md5_file($product[0]['digital_path']),
To:
// 'md5hash' => md5_file($product[0]['digital_path']),

So, with the previously discussed edits and the web server configuration change, we should be getting to a point where offering super-large files is workable.

Also, when using the special link created by CubeCart (the one with "accesskey=randomchars"), CubeCart does make sure that a file downloaded under CubeCart's control passes the imposed restrictions: time limit and download limit.
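For context, here is a hedged sketch of what that edit looks like in place. Only the 'md5hash' line comes from this thread; the surrounding array and its other keys are assumptions for illustration, not CubeCart's actual code:

```php
<?php
// Sketch of the edit near line 938 of /classes/filemanager.class.php.
// $product[0]['digital_path'] is the path CubeCart holds for the digital file.
$file_record = array(
    // md5_file() must read the entire file from disk to compute the hash,
    // which stalls or times out on multi-gigabyte downloads, so skip it:
    // 'md5hash'  => md5_file($product[0]['digital_path']),
    'filesize' => filesize($product[0]['digital_path']),
);
```

The trade-off is that without a stored hash, CubeCart can no longer offer an integrity check on the delivered file; for files this large, that is usually an acceptable cost.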
tucstore (Author) Posted July 14, 2021
OK, so just to be clear, please give the entire fix in one thread so I can edit all the files needed. Thanks.
This topic is now archived and is closed to further replies.