Thanks for the heads up on this thread.

glbrent wrote: I don't know about you, but I'm still a little annoyed at this problem! I would like to get to the heart of it somehow. But I think this is much more of an s2Member programming / Internet Explorer permissions issue. It's a weird one for sure. But not surprising, considering how Microsoft likes to do "their own thing".
s2Member's default method of file delivery uses an advanced routine that delivers the file in chunks (i.e., Transfer-Encoding: chunked). This method avoids most issues related to high memory consumption on large file downloads, making it possible for large downloads to succeed. If you are getting a 0KB file download, I suspect there is a conflict between your server configuration and the way s2Member delivers the download. This is a server-side issue, and it can happen when/if your Apache and/or PHP configuration overrides the headers s2Member sends, or conflicts with output compression, as described below.
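For illustration only, here is a minimal sketch of chunked delivery in plain PHP. This is NOT s2Member's actual routine, and the file path is a hypothetical placeholder; the point is that omitting a Content-Length header and flushing as you read lets the response go out as Transfer-Encoding: chunked, so the whole file never sits in memory at once:

<?php
// Minimal sketch of chunked file delivery (illustrative only).
$file = '/path/to/protected-file.zip'; // hypothetical placeholder

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
// No Content-Length header; Apache/PHP fall back to Transfer-Encoding: chunked.

@ob_end_flush(); // make sure PHP output buffering isn't holding the chunks back

$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192); // read and send 8KB at a time
    flush();               // push each chunk to the client immediately
}
fclose($fp);
?>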
There are three ways around this issue:

1. Store your protected files inside an Amazon S3 Bucket, and configure s2Member to use your Amazon S3 account. This is the recommended method, because protected files are then stored at Amazon S3 for better security, better download speeds, and greater compatibility with VERY large files and/or media content. So if you CAN use Amazon S3, please do; that will be better all the way around, particularly if you plan to deliver VERY large files (i.e., 50MB+, and absolutely anything over 1GB).
Why does s2Member need Amazon S3 to deliver VERY large files?
Well, if you're on a dedicated server where your hosting company is NOT limiting your script timeout period, you may choose not to use Amazon S3, and that's fine in most cases. However, most hosting companies impose limits on the amount of time a script can run on your server. Since s2Member is a plugin for WordPress, and both are powered by PHP, both are subject to the limits set by your hosting company. If you attempt to deliver a VERY large file that might take the Customer more than 30-60 seconds to download, your Customer may have trouble receiving the file, because it's being delivered through a PHP script (i.e., subject to a script timeout limitation). The solution in this case is to configure s2Member to use your Amazon S3 Bucket instead. When s2Member is configured to work with Amazon S3, there are no script timeout limits imposed at all.
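To illustrate why that is: when s2Member works with Amazon S3, the Customer is sent to a time-limited (pre-signed) S3 URL, and Amazon serves the file directly, with no PHP script in the middle of the transfer. Here's a minimal sketch of building such a URL by hand (AWS Signature Version 2; the credentials, bucket, and key below are hypothetical placeholders, and this is not s2Member's actual code):

<?php
// Minimal sketch of a pre-signed (time-limited) Amazon S3 download URL.
$accessKey = 'YOUR-ACCESS-KEY'; // hypothetical placeholder
$secretKey = 'YOUR-SECRET-KEY'; // hypothetical placeholder
$bucket    = 'your-bucket';
$key       = 'protected-file.zip';
$expires   = time() + 300; // link valid for 5 minutes

// AWS Signature Version 2 string-to-sign for a simple GET request.
$stringToSign = "GET\n\n\n{$expires}\n/{$bucket}/{$key}";
$signature    = base64_encode(hash_hmac('sha1', $stringToSign, $secretKey, true));

$url = "https://{$bucket}.s3.amazonaws.com/{$key}" .
       '?AWSAccessKeyId=' . $accessKey .
       '&Expires=' . $expires .
       '&Signature=' . rawurlencode($signature);
echo $url; // Amazon serves the download directly; no PHP timeout applies.
?>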
2. Or keep the files on your server, and disable s2Member's chunked file delivery in cases where your server/hosting configuration conflicts with s2Member. You can do this by creating this directory and file:
/wp-content/mu-plugins/s2-hacks.php
<?php
// Tell s2Member NOT to stream file downloads in chunks.
// __return_false is a built-in WordPress helper that simply returns false.
add_filter("ws_plugin__s2member_stream_file_downloads", "__return_false");
?>
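Keep in mind that with chunked delivery disabled, you give up the low-memory-consumption behavior described above, so this workaround is best suited to small and medium-sized files.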
3. Resolve the conflict with your server configuration. If you're on a dedicated server, this may not be difficult for you. However, if you're on shared hosting, you might want to consider options #1 or #2 above. Make sure your Apache configuration does not attempt to override PHP's Transfer-Encoding header, and be sure that your PHP installation does NOT use implicit flush, ob_gzhandler, or any other type of PHP-based output compression. You CAN use output compression, but do it with Apache, NOT with PHP, as the latter causes problems for file downloads. Sometimes the problem only becomes apparent in a particular browser version that doesn't cope well with these conflicts.
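If you'd like a quick way to inspect the PHP side of things, here is a minimal diagnostic sketch (not part of s2Member) that prints the settings most likely to conflict with chunked delivery. Ideally, zlib.output_compression should be off and output_handler should be empty (i.e., no ob_gzhandler):

<?php
// Minimal diagnostic sketch: report PHP settings that commonly conflict
// with chunked file delivery. Run from a browser or the command line.
header('Content-Type: text/plain');
$settings = array('zlib.output_compression', 'output_handler', 'implicit_flush', 'output_buffering');
foreach ($settings as $setting) {
    echo $setting . ' = ' . var_export(ini_get($setting), true) . "\n";
}
?>

If any of those report compression or buffering turned on, adjust them in your php.ini (or ask your host to), and handle compression at the Apache level instead.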