
Download File to server from URL

Well, this one seems quite simple, and it is. All you have to do to download a file to your server is:

file_put_contents("Tmpfile.zip", file_get_contents("http://someurl/file.zip"));

There is only one problem: what if you have a large file, like 100 MB? Then you will run out of memory and fail to download the file.

What I want is a way to write the file to the disk as I am downloading it. That way, I can download bigger files, without running into memory problems.

That's set in your server configuration; PHP can't really get around it as far as I know (except for a direct .ini edit).

Community

Since PHP 5.1.0, file_put_contents() supports writing piece-by-piece by passing a stream-handle as the $data parameter:

file_put_contents("Tmpfile.zip", fopen("http://someurl/file.zip", 'r'));

From the manual:

If data [that is the second argument] is a stream resource, the remaining buffer of that stream will be copied to the specified file. This is similar with using stream_copy_to_stream().

(Thanks Hakre.)
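
For reference, a minimal sketch of the explicit stream_copy_to_stream() equivalent (the URL and filename are the same placeholders as above):

// copy the remote stream to a local file chunk by chunk
$src = fopen('http://someurl/file.zip', 'rb');
$dst = fopen('Tmpfile.zip', 'wb');
stream_copy_to_stream($src, $dst); // copies the remaining buffer of $src into $dst
fclose($src);
fclose($dst);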


That wouldn't be my first choice. If allow_url_fopen is set to Off in php.ini (a good idea for security), your script would be broken.
@idealmachine I think file_get_contents() would not work either if that were the case (see OP).
@geoff I was specific, I mentioned the function you wanted. What you may have wanted was someone to write the code for you - but I'm sure you learned something doing it yourself. Also, if we are going to comment on each other's SO interactions - please accept some more answers :)
@alex: Please see the edit; feel free to incorporate it. Let me know when I can remove this comment.
The 'b' flag should also be used in most cases with fopen; it prevents adverse effects on images and other non-plain-text files.
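
Applied to the answer above, that suggestion would read:

file_put_contents("Tmpfile.zip", fopen("http://someurl/file.zip", 'rb'));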
Ibrahim Azhar Armar
function downloadFile($url, $path)
{
    $file = fopen($url, 'rb');   // open the remote file for binary reading
    if (!$file) {
        return false;            // could not open the URL
    }
    $newf = fopen($path, 'wb');  // open the local file for binary writing
    if (!$newf) {
        fclose($file);
        return false;
    }
    // copy in 8 KB chunks so the whole file is never held in memory
    while (!feof($file)) {
        fwrite($newf, fread($file, 1024 * 8));
    }
    fclose($file);
    fclose($newf);
    return true;
}

Thanks for your snippet, but would you be able to explain your code @xaav? I'm not exactly brilliant at PHP. What is 1024*8 for? Thanks again.
@wMINOw The chunk length.
Specifically, it reads up to 8 KB at a time (1024 bytes per KB * 8), since the fread() length parameter is in bytes; each loop iteration copies one chunk until the end of the file is reached.
Why is this not the best answer?
How do you handle errors with this approach? What if a 404 is returned or the connection is interrupted or times out?
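
One possible sketch of basic error handling for the fopen() approach (the function name is illustrative; by default the HTTP wrapper makes fopen() return false on 4xx/5xx responses):

function downloadFileChecked($url, $path)
{
    $file = @fopen($url, 'rb'); // @ suppresses the warning; we check the result instead
    if ($file === false) {
        return false; // covers 404s, DNS failures, refused connections
    }
    $newf = fopen($path, 'wb');
    if ($newf === false) {
        fclose($file);
        return false;
    }
    $ok = stream_copy_to_stream($file, $newf) !== false;
    fclose($file);
    fclose($newf);
    return $ok;
}

A connection dropped mid-transfer is harder to detect with plain streams; comparing the copied byte count against the Content-Length response header would be one option.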
Community

Try using cURL

set_time_limit(0); // unlimited max execution time

// CURLOPT_FILE expects an open file handle, not a path (see the comments below)
$fp = fopen('/path/to/download/the/file/to.zip', 'w');

$options = array(
  CURLOPT_FILE    => $fp,
  CURLOPT_TIMEOUT => 28800, // set this to 8 hours so we don't time out on big files
  CURLOPT_URL     => 'http://remoteserver.com/path/to/big/file.zip',
);

$ch = curl_init();
curl_setopt_array($ch, $options);
curl_exec($ch);
curl_close($ch);
fclose($fp);

I'm not sure, but I believe that with the CURLOPT_FILE option it writes as it pulls the data, i.e. not buffered.
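
One way to observe that for yourself is a progress callback (a sketch assuming PHP 5.5+, where the cURL handle is passed as the callback's first argument):

curl_setopt($ch, CURLOPT_NOPROGRESS, false);
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, function ($ch, $dlSize, $dlNow, $ulSize, $ulNow) {
    echo "downloaded $dlNow of $dlSize bytes\n"; // fires repeatedly during the transfer
    return 0; // returning non-zero would abort the transfer
});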


Normally, this would be fine, but I have this code in a web app, so I can't be sure users will have cURL installed. However, I did give this a vote up.
@Geoff is it a distributed web app? Because if you control the hosting, then it doesn't matter about your users (cURL is a library on your server).
No. I do not control hosting. It is a distributed web app that anyone could have.
cURL might be missing, but almost all shared hosting companies have cURL installed by default. I mean, I haven't seen one that doesn't.
From my tests, you can't assign a file path directly to CURLOPT_FILE. It has to be a file handle. First, open the file with $fh = fopen('/path/to/download/the/file/to.zip', 'w');, close it with fclose($fh); after curl_close($ch);, and set CURLOPT_FILE => $fh.
Kamil Kiełczewski

prodigitalson's answer didn't work for me. I got a missing fopen in CURLOPT_FILE (more details).

This worked for me, including local URLs:

function downloadUrlToFile($url, $outFileName)
{
    if (is_file($url)) {
        copy($url, $outFileName);
        return 200; // treat a successful local copy like an HTTP 200 OK
    }

    $fp = fopen($outFileName, 'w'); // CURLOPT_FILE needs a file handle, not a path
    $options = array(
        CURLOPT_FILE    => $fp,
        CURLOPT_TIMEOUT => 28800, // set this to 8 hours so we don't time out on big files
        CURLOPT_URL     => $url,
    );

    $ch = curl_init();
    curl_setopt_array($ch, $options);
    curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    fclose($fp); // close the handle so buffered data is flushed to disk
    return $httpcode;
}
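
Usage might then look like this (URL and filename are placeholders):

$status = downloadUrlToFile('http://someurl/file.zip', 'Tmpfile.zip');
if ($status !== 200) {
    echo "Download failed with HTTP status $status";
}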
Community

Create a folder called "downloads" on the destination server, save [this code] into a .php file, and run it on the destination server.

Downloader:

<html>
<form method="post">
<input name="url" size="50" />
<input name="submit" type="submit" />
</form>
<?php
    // maximum execution time in seconds
    set_time_limit (24 * 60 * 60);

    if (!isset($_POST['submit'])) die();

    // folder to save downloaded files to. must end with slash
    $destination_folder = 'downloads/';

    $url = $_POST['url'];
    $newfname = $destination_folder . basename($url);

    $file = fopen($url, "rb");
    if ($file) {
      $newf = fopen($newfname, "wb");

      if ($newf) {
        // copy in 8 KB chunks so large files don't exhaust memory
        while (!feof($file)) {
          fwrite($newf, fread($file, 1024 * 8));
        }
        fclose($newf);
      }
      fclose($file);
    }
?>
</html> 
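
Note that this script passes user input straight to fopen(); at minimum, restricting the URL scheme would be prudent (a sketch; the whitelist is illustrative):

$url = $_POST['url'];
if (!preg_match('#^https?://#i', $url)) {
    die('Only HTTP(S) URLs are allowed.');
}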

This assumes the user wants a standalone script rather than a solution that will work within an existing PHP application, and I believe the latter is what the OP and most others are looking for. An explanation would also be helpful for people who want to understand the approach.
Whenever I try this, the transferred file size is always 50816 bytes, but my file is bigger than that: 120 MB. Any idea why?
set_time_limit (24 * 60 * 60); has to be put inside a loop. It has no effect at the beginning of the script.
Vijaysinh Parmar
set_time_limit(0);
// note: file_get_contents() loads the entire file into memory before writing it out
$file = file_get_contents('path of your file');
file_put_contents('file.ext', $file);

Your answer is very simple and works well. It helped me where cURL failed to get the file; this worked. Thanks :)
You might want to explain what this actually does.
This does not address the OP's problem of exceeding the PHP memory limit.
This is pretty simple and straightforward. Quite useful for simpler cases where the files are small or the environment is local development.
Any idea for .xlsx files? It stores an empty file of 0 bytes.
Pradeep Kumar

Use PHP's simple copy() method:

copy($source_url, $local_path_with_file_name);

Note: if the destination file already exists, it will be overwritten

PHP copy() Function

Note: You need to set permission 777 for the destination folder. Use this method when you are downloading to your local machine.

Special Note: 777 is a permission on Unix-based systems giving full read/write/execute permission to owner, group, and everyone. In general, we give this permission to assets that don't need to be hidden from the public on a web server. Example: an images folder.
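
Since copy() returns a boolean, failures can at least be detected (a minimal sketch; URL and path are placeholders):

if (!copy('http://someurl/file.zip', 'downloads/file.zip')) {
    echo 'Copy failed';
}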


I will never, never, never set 777 as perms on a webserver, and I will kick off any web developer who has the bad idea to do that. Every time, everywhere. Be careful! You cannot do that! Think about security. Following OWASP rules is not enough. Good thinking about simple things matters.
@ThierryB. Note: I've given a local path, and this can be used in internal applications. Good reading and understanding of the question and answer matters. Think of different scenarios. And this is not the accepted/best answer; every question has different answers, each with pros and cons. An example to help you understand: even Fibonacci has multiple solutions, of which only one will be best; the others are useful in different scenarios.
Ok, but taking the time to think about best practices and implement them in secured places will give you a better understanding of the concepts you must implement. Maybe if an intruder is inside your ($)home, setting some traps or building things the best way you can will give him some headaches ;)
Hoan Huynh

There are three ways:

1. file_get_contents and file_put_contents
2. cURL
3. fopen

You can find examples here.
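
In case the link goes stale, minimal sketches of each approach (URLs and filenames are placeholders):

// 1. file_get_contents / file_put_contents (loads the whole file into memory)
file_put_contents('local.zip', file_get_contents('http://someurl/file.zip'));

// 2. cURL (streams straight to disk via CURLOPT_FILE)
$fp = fopen('local.zip', 'w');
$ch = curl_init('http://someurl/file.zip');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);

// 3. fopen (file_put_contents copies the stream chunk by chunk)
file_put_contents('local.zip', fopen('http://someurl/file.zip', 'rb'));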


Hoàng Vũ Tgtt

I use this to download files:

function cURLcheckBasicFunctions()
{
  return function_exists("curl_init") &&
         function_exists("curl_setopt") &&
         function_exists("curl_exec") &&
         function_exists("curl_close");
}

/*
 * Returns string status information.
 * Can be changed to int or bool return types.
 */
function cURLdownload($url, $file, $redirects = 5)
{
  if( !cURLcheckBasicFunctions() ) return "UNAVAILABLE: cURL Basic Functions";

  $ch = curl_init();
  if( !$ch ) return "FAIL: curl_init()";

  $fp = fopen($file, "w");
  if( !$fp )
  {
    curl_close($ch); // to match curl_init()
    return "FAIL: fopen()";
  }

  if( !curl_setopt($ch, CURLOPT_URL, $url) )
  {
    fclose($fp);     // to match fopen()
    curl_close($ch); // to match curl_init()
    return "FAIL: curl_setopt(CURLOPT_URL)";
  }

  curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.11) Gecko/20071204 Ubuntu/7.10 (gutsy) Firefox/2.0.0.11');
  curl_setopt($ch, CURLOPT_HEADER, false); // don't write response headers into the file
  curl_setopt($ch, CURLOPT_FILE, $fp);     // stream the body straight to disk

  // CURLOPT_FOLLOWLOCATION cannot be enabled when open_basedir or safe_mode is in effect
  if ((!ini_get('open_basedir') && !ini_get('safe_mode')) && $redirects > 0) {
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, $redirects);
  }

  $ok = curl_exec($ch);
  curl_close($ch);
  fclose($fp);

  if( !$ok ) return "FAIL: curl_exec()";
  return "SUCCESS: $file [$url]";
}
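
Usage might then look like this (URL and filename are placeholders):

echo cURLdownload('http://someurl/file.zip', 'Tmpfile.zip');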

Eric Leroy

A PHP 4 & 5 Solution:

readfile() will not present any memory issues on its own, even when sending large files. A URL can be used as a filename with this function if the fopen wrappers have been enabled.

http://php.net/manual/en/function.readfile.php
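
A minimal sketch (the URL is a placeholder); note that readfile() streams to the output buffer, not to a file on disk:

readfile('http://someurl/file.zip');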


This does not answer the question, because the question is about writing on disk not to the output buffer.
webHasan

Simple solution:

<?php 
exec('wget http://someurl/file.zip');
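
If the URL comes from user input, it should be shell-escaped first (a sketch; wget's -P option sets the output directory):

exec('wget -P downloads/ ' . escapeshellarg($url));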

@Netwons make sure wget is available on your server.
wget is available, but I get the error ======> errorCode=1 SSL/TLS handshake failure: The TLS connection was non-properly terminated.
or the error ======> Connecting to www.you.com (www.you.com)|178.79.180.188|:443... connected.
Netwons

Best solution:

Install aria2c on the system, then:

echo exec("aria2c \"$url\"");