
JavaScript to download a file from an Amazon S3 bucket?

I was trying to download a file from a bucket on Amazon S3. I was wondering if I can write JavaScript to download such a file from a bucket. I googled it, but couldn't find any resources to help me do that.

The steps I have in mind are: authenticate with Amazon S3, then, by providing the bucket name and file (key), download or read the file so that I can display the data it contains.

Thanks,


G4bri3l

Maybe you can use the AWS SDK for Node.js:

var AWS = require('aws-sdk');
AWS.config.update(
  {
    accessKeyId: ".. your key ..",
    secretAccessKey: ".. your secret key ..",
  }
);
var s3 = new AWS.S3();
s3.getObject(
  { Bucket: "my-bucket", Key: "my-picture.jpg" },
  function (error, data) {
    if (error != null) {
      alert("Failed to retrieve an object: " + error);
    } else {
      alert("Loaded " + data.ContentLength + " bytes");
      // do something with data.Body
    }
  }
);

Here is the direct link to s3.getObject in the AWS Node.js API docs, for those interested.
For anyone who wants to know, data.Body is the raw file contents and can be used with other modules like graphicsmagick: gm(data.Body).identify(function (err, value) {}); Pretty cool!
isn't it { Bucket: "my-bucket", Key: "my-picture.jpg" } instead? (uppercase)
You can always save (download) the file like this in Node: fs.writeFile("mydata.tsv.gz", data.Body);
I was surprised to see alert in server-side code at first glance :)
Sahith Vibudhi

I came here looking for a way to download an S3 file on the client side. Here is how I solved it:

As I can't store my S3 auth keys on the client side, I used my server-side script to generate a pre-signed URL and send it back to the client:

const AWS = require('aws-sdk')

// Configure credentials before constructing the client so they are picked up.
AWS.config.update({ accessKeyId: 'your access key', secretAccessKey: 'your secret key' })
const s3 = new AWS.S3()

const myBucket = 'bucket-name'
const myKey = 'path/to/your/key/file.extension'
const signedUrlExpireSeconds = 60 * 5 // your expiry time in seconds

const url = s3.getSignedUrl('getObject', {
  Bucket: myBucket,
  Key: myKey,
  Expires: signedUrlExpireSeconds
})

// return the url to client

Use this URL on the front end to trigger the download:

function download(url){
    $('<iframe>', { id:'idown', src:url }).hide().appendTo('body').click();
}
$("#downloadButton").click(function(){
    $.ajax({
        url: 'example.com/your_end_point',
        success: function(url){
            download(url);
        }
    })
});

What exactly is a signed URL? Edit: TIL, advancedweb.hu/2018/10/30/s3_signed_urls
I found that if I opened the signed URL in a new tab: window.open(signedUrl, '_blank'); instead of using the iframe, the browser handled mixed MIME types in the downloaded files (binary, images, etc.) better. Otherwise a very helpful answer.
You can read more about it, with a Python example from AWS, here: boto3.amazonaws.com/v1/documentation/api/latest/guide/…
@slh777 How can I use axios to download the file using that signedUrl? Any ideas?
cssiamamess

Other answers here work, but I wanted to expand on what worked for me.

In my case, I was dealing with files too large for

function download(url){
    $('<iframe>', { id:'idown', src:url }).hide().appendTo('body').click();
}

to work (I was getting a "url is too long" error). My solution was to include a hidden anchor tag and trigger a click on that tag on ajax success. You can't use the anchor tag right off the bat unless you don't care about handling errors.

S3 responds with an XML error document if something goes wrong, and the browser will display that XML response directly. By first attempting to hit the URL with ajax, you can catch that error without showing the ugly XML. Success in that ajax call is when you know you're clear to try and download the file.
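A sketch of the approach described above, for the browser (the anchor id is made up, and it assumes a hidden `<a id="hiddenDownload" hidden></a>` exists in the page and that `url` is the pre-signed S3 URL; S3's CORS configuration must allow the probe request):

```javascript
function download(url) {
    // Probe the URL with ajax first; on failure S3 returns an XML error
    // document, which lands in the error handler instead of the screen.
    $.ajax({
        url: url,
        success: function () {
            var a = document.getElementById('hiddenDownload');
            a.href = url;
            a.click(); // an anchor click avoids the iframe's URL-length limit
        },
        error: function () {
            alert('Download failed.');
        }
    });
}
```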