
Amazon S3 Permission problem - How to set permissions for all files at once?

I have uploaded some files using the Amazon AWS management console.

I got an HTTP 403 Access denied error. I found out that I needed to set the permission to view.

How would I do that for all the files in the bucket?

I know that it is possible to set the permission on each file, but that's time-consuming when there are many files that need to be viewable by everyone.


GabLeRoux

I suggest that you apply a bucket policy1 to the bucket where you want to store public content. That way you don't have to set an ACL for every object. Here is an example of a policy that makes all the files in the bucket mybucket public.

{
    "Version": "2008-10-17",
    "Id": "http better policy",
    "Statement": [
        {
            "Sid": "readonly policy",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::mybucket/sub/dirs/are/supported/*"
        }
    ]
}

The * in "Resource": "arn:aws:s3:::mybucket/sub/dirs/are/supported/*" makes the policy apply recursively to every object under that prefix.

1 Note that a Bucket Policy is different from an IAM Policy. (For one thing, you will get an error if you try to include Principal in an IAM Policy.) The Bucket Policy can be edited by going to the root of the bucket in your AWS web console and expanding Properties > Permissions. Subdirectories of a bucket also have Properties > Permissions, but there is no option to edit the bucket policy there.
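If you'd rather do this from the command line, the same policy can be attached with the AWS CLI. This is a sketch; it assumes the CLI is installed and configured with credentials allowed to call s3:PutBucketPolicy, and that the JSON above has been saved locally as policy.json.

```shell
# Attach the policy saved as policy.json to the bucket "mybucket".
aws s3api put-bucket-policy \
    --bucket mybucket \
    --policy file://policy.json

# Verify which policy is now attached.
aws s3api get-bucket-policy --bucket mybucket
```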


Just in case anyone else gets caught out by this, the Version has to be exactly as specified.
Just in case anyone is wondering how to set a bucket policy, here is the official AWS documentation: docs.amazonwebservices.com/AmazonS3/latest/dev/…
They also have a useful tool for generating these: awspolicygen.s3.amazonaws.com/policygen.html
S3 Browser is a GUI tool that can do this. It has a freeware version that I think should do what you need: s3browser.com. We use their paid version (it's about $30), and I know it does all of this in a couple of clicks for quick tasks where you don't feel like coding.
I tried copy-pasting this into the S3 console. It complains that it's not valid JSON; you have to remove the trailing comma after the "Resource" line and the initial space before the first {.
GhostCat

You can select which directory you want to be public.

Press "More" and mark it as public; this makes the directory and all of its files publicly accessible.


Daniel García

You can only modify ACLs on one item at a time (a bucket or an object), so you will have to change them one by one.

Some S3 management applications allow you to apply the same ACL to all items in a bucket, but internally they still apply the ACL to each item one by one.

If you upload your files programmatically, it's important to specify the ACL as you upload each file, so you don't have to modify it later. The problem with using an S3 management application (like Cloudberry, Transmit, ...) is that most of them use the default ACL (private read-only) when uploading each file.
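With the AWS CLI, for example, the ACL can be specified at upload time. A sketch, assuming a configured CLI and the hypothetical bucket name mybucket:

```shell
# Upload a single file that is immediately publicly readable.
aws s3 cp ./index.html s3://mybucket/index.html --acl public-read

# Upload a whole directory recursively with the same ACL.
aws s3 cp ./media/ s3://mybucket/media/ --recursive --acl public-read
```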


Paul Siersma

Using S3 Browser you can update permissions through the GUI, including recursively. It's a useful tool and free for non-commercial use.


Finesse

To make a bulk of files public, do the following:

1. Go to the S3 web interface
2. Open the required bucket
3. Select the required files and folders by clicking the checkboxes at the left of the list
4. Click the «More» button at the top of the list and click «Make public»
5. Confirm by clicking «Make public»

The files won't have public write access, despite the warning saying «...read this object, read and write permissions».


moppag

You could set the ACL on each file using the AWS CLI:

BUCKET_NAME=example
BUCKET_DIR=media
NEW_ACL=public-read

# List the objects under the prefix, strip the date, time and size
# columns so only the key names remain, then set the ACL on each key.
aws s3 ls $BUCKET_NAME/$BUCKET_DIR/ | \
awk '{$1=$2=$3=""; print $0}' | \
xargs -t -I _ \
aws s3api put-object-acl --acl $NEW_ACL --bucket $BUCKET_NAME --key "$BUCKET_DIR/_"

This solution is good in case the files already have bad ACLs set on them directly :) One could also write something similar in Python using boto3's ObjectAcl, or more precisely S3.ObjectAcl.put, which would be easier to read.
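An alternative sketch, assuming a reasonably recent AWS CLI: copying the prefix onto itself while changing only the ACL does the same thing recursively in a single command. Each object is rewritten server-side, so be careful with very large prefixes (the bucket and prefix names below are the example ones from the script above).

```shell
# In-place copy that replaces the ACL on every object under the prefix.
aws s3 cp s3://example/media/ s3://example/media/ \
    --recursive --acl public-read --metadata-directive COPY
```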
GabLeRoux

I had the same problem while uploading files to an S3 bucket through a program (Java).

Error: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:9000' is therefore not allowed access. The response had HTTP status code 403

I added the origin identity, changed the bucket policy and the CORS configuration, and then everything worked fine.
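For reference, a minimal CORS configuration along those lines might look like the following. This is a sketch in the JSON format the current S3 console accepts; the localhost origin comes from the error message above, so replace it with your own origin and methods.

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET", "PUT", "POST"],
        "AllowedOrigins": ["http://localhost:9000"],
        "ExposeHeaders": []
    }
]
```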


Joshua Pinter

Transmit 5

I wanted to add this here for potential macOS users that already have the beautifully-crafted FTP app called Transmit by Panic.

I already had Transmit, and it supports S3 buckets (I'm not sure which version this came in, but I think the upgrade was free). It also supports recursively updating Read and Write permissions.

You simply right-click the directory you want to update and select the Read and Write permissions you want to set.

It doesn't seem terribly fast, but you can open the log by going to Window > Transcript, so you at least know that it's doing something.


GabLeRoux

Use the AWS Policy Generator to generate a policy that fits your needs. The principal in the policy generator should be the IAM user/role you'll be using to access the object(s). The Resource ARN should be arn:aws:s3:::mybucket/sub/dirs/are/supported/*

Next, click "Add statement" and follow through. You'll end up with a JSON document representing the policy. Paste it into your S3 bucket's policy management section, found at your S3 bucket page in AWS -> Permissions -> Bucket policy.