How to download large files with PHP (low memory usage)

Date: 2023-08-20

This article explains how to download large files with PHP while keeping memory usage low. It should be a useful reference for anyone facing the same problem.

Problem description

I have to download a big file (1xx MB) using PHP.

How can I download it without wasting memory (RAM) on a temporary copy of the whole file?

When I use

$something = file_get_contents('http://somehost.example/file.zip');
file_put_contents('myfile.zip', $something);


I need at least as much memory as the size of that file.

Maybe it is possible to download it some other way?

For example in parts (e.g. 1024 bytes at a time): write each part to disk, then download the next, repeating until the file is fully downloaded?
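That chunked approach can be sketched directly with PHP's stream functions: `fopen` on an HTTP URL returns a stream (when `allow_url_fopen` is enabled), and `fread` pulls at most one buffer's worth at a time, so memory use stays near the chunk size rather than the file size. A minimal sketch (the URL and local path are placeholders, and the function name is just illustrative):

```php
<?php
/**
 * Download $url to $dest in fixed-size chunks.
 * Memory use stays around $chunksize, not the file size.
 * Returns the number of bytes written, or false on failure.
 */
function download_in_chunks($url, $dest, $chunksize = 1024 * 1024) {
    $in  = fopen($url, 'rb');   // works for HTTP URLs and local paths alike
    $out = fopen($dest, 'wb');
    if ($in === false || $out === false) {
        return false;
    }
    $written = 0;
    while (!feof($in)) {
        $buf = fread($in, $chunksize);   // read at most one chunk
        if ($buf === false || $buf === '') {
            break;
        }
        $written += fwrite($out, $buf);  // append it to the local file
    }
    fclose($in);
    fclose($out);
    return $written;
}
```

The built-in `stream_copy_to_stream($in, $out)` performs essentially this same loop internally and is equally memory-friendly.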

Recommended answer

Copy the file one small chunk at a time:

/**
 * Copy a remote file over HTTP one small chunk at a time.
 *
 * @param $infile The full URL to the remote file
 * @param $outfile The path where to save the file
 */
function copyfile_chunked($infile, $outfile) {
    $chunksize = 10 * (1024 * 1024); // 10 Megs

    /**
     * parse_url breaks a URL apart into its parts, i.e. host, path,
     * query string, etc.
     */
    $parts = parse_url($infile);
    $i_handle = fsockopen($parts['host'], 80, $errstr, $errcode, 5);
    $o_handle = fopen($outfile, 'wb');

    if ($i_handle === false || $o_handle === false) {
        return false;
    }

    if (empty($parts['path'])) {
        $parts['path'] = '/'; // a bare host URL has no path component
    }
    if (!empty($parts['query'])) {
        $parts['path'] .= '?' . $parts['query'];
    }

    /**
     * Send the request to the server for the file
     */
    $request  = "GET {$parts['path']} HTTP/1.1\r\n";
    $request .= "Host: {$parts['host']}\r\n";
    $request .= "User-Agent: Mozilla/5.0\r\n";
    $request .= "Keep-Alive: 115\r\n";
    $request .= "Connection: keep-alive\r\n\r\n";
    fwrite($i_handle, $request);

    /**
     * Now read the headers from the remote server. We'll need
     * to get the content length.
     */
    $headers = array();
    while (!feof($i_handle)) {
        $line = fgets($i_handle);
        if ($line == "\r\n") break; // a blank line ends the headers
        $headers[] = $line;
    }

    /**
     * Look for the Content-Length header, and get the size
     * of the remote file.
     */
    $length = 0;
    foreach ($headers as $header) {
        if (stripos($header, 'Content-Length:') === 0) {
            $length = (int)trim(substr($header, strlen('Content-Length:')));
            break;
        }
    }

    /**
     * Start reading in the remote file, and writing it to the
     * local file one chunk at a time.
     */
    $cnt = 0;
    while (!feof($i_handle)) {
        $buf = fread($i_handle, $chunksize);
        $bytes = fwrite($o_handle, $buf);
        if ($bytes === false) {
            fclose($i_handle);
            fclose($o_handle);
            return false;
        }
        $cnt += $bytes;

        /**
         * We're done reading when we've reached the content length
         */
        if ($cnt >= $length) break;
    }

    fclose($i_handle);
    fclose($o_handle);
    return $cnt;
}


Adjust the $chunksize variable to your needs. This has only been mildly tested; it could easily break for a number of reasons.

Usage:

                  copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg');
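The hand-rolled HTTP request above is fragile (no HTTPS, no redirects, no chunked transfer encoding). If the cURL extension is available, a common alternative, not part of the original answer, is to let cURL stream the response body straight into a file handle, which also keeps memory usage at a small internal buffer:

```php
<?php
/**
 * Download $url to $dest via cURL, streaming the body to disk.
 * Returns true on success, false on failure.
 */
function download_curl($url, $dest) {
    $fp = fopen($dest, 'wb');
    if ($fp === false) {
        return false;
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);           // write body straight to the file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_FAILONERROR, true);    // treat HTTP errors as failure
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $ok;
}
```

Because `CURLOPT_RETURNTRANSFER` is left off and `CURLOPT_FILE` is set, cURL never buffers the whole response in PHP memory.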
                  

That concludes this article on downloading large files with PHP with low memory usage. I hope the recommended answer above is helpful.

