I have a PHP script on a server to send files to recipients: they get a unique link and can then download large files. Sometimes there is a problem with the transfer and the file is corrupted or never finishes. I am wondering if there is a better way to send large files.
Code:
$f = fopen(DOWNLOAD_DIR . $database[$_REQUEST['fid']]['filePath'], 'r');
while (!feof($f)) {
    print fgets($f, 1024);
}
fclose($f);
I have seen functions such as
http_send_file
http_send_data
but I am not sure if they will work.
What is the best way to solve this problem?
Regards,
Erwin
Chunking files is the fastest / simplest method in PHP, if you can't or don't want to use something a bit more professional like cURL, mod_xsendfile on Apache, or some dedicated script.
$filename = $filePath . $filename;
$chunksize = 5 * (1024 * 1024); // 5 MB (= 5 242 880 bytes) per chunk of file.

if (file_exists($filename)) {
    set_time_limit(300);
    $size = intval(sprintf("%u", filesize($filename)));
    header('Content-Type: application/octet-stream');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: ' . $size);
    header('Content-Disposition: attachment;filename="' . basename($filename) . '"');
    if ($size > $chunksize) {
        $handle = fopen($filename, 'rb');
        while (!feof($handle)) {
            print(@fread($handle, $chunksize));
            ob_flush();
            flush();
        }
        fclose($handle);
    } else {
        readfile($filename); // was readfile($path) -- $path is undefined here
    }
    exit;
} else {
    echo 'File "' . $filename . '" does not exist!';
}
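To see why this keeps memory flat: each iteration of the loop holds at most one 5 MB chunk, so a 200 MB file is served in 40 `fread()` calls rather than being loaded whole. A small helper (hypothetical name) that computes the iteration count:

```php
<?php
// Hypothetical helper: number of fread() iterations for a given file size
// and chunk size; each iteration holds at most one chunk in memory.
function chunk_count(int $size, int $chunksize): int
{
    return intdiv($size + $chunksize - 1, $chunksize); // ceiling division
}

// e.g. a 200 MB file with 5 MB chunks:
echo chunk_count(200 * 1024 * 1024, 5 * 1024 * 1024); // 40
```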
Ported from richnetapps.com / NeedBee. Tested on 200 MB files, on which readfile() died, even with the maximum allowed memory limit set to 1G, five times the size of the downloaded file.
BTW: I also tested this on files >2GB, but PHP only managed to write the first 2GB of the file and then broke the connection. File-related functions (fopen, fread, fseek) use INT, so you ultimately hit the 2GB limit. In that case, a solution mentioned above (i.e. mod_xsendfile) seems to be the only option.
EDIT: Make 100% sure that your script file is saved in UTF-8. If you omit that, downloaded files will be corrupted. This is because this solution uses print to push chunks of the file to the browser.