
I have a use-case to upload files to S3 using a v4 signature. These uploads should be done from a Python script.

The S3 v4 signature is passed to our clients so they can upload any number of files to our S3 bucket. Since we do not know the file names or count in advance, we cannot generate a signed URL for each file.

Consider this to be the generated v4 signature. (For simplicity, assume the signature is passed to clients as a hardcopy.)

key = "client/user_1/exec_1/"

s = {
  "url": "https://s3.amazonaws.com/package-bucket",
  "fields": {
    "acl": "public-read",
    "bucket": "package-bucket",
    "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
    "X-Amz-Credential": "AKI****LV5P/20200115/us-east-1/s3/aws4_request",
    "X-Amz-Date": "20200115T183407Z",
    "Policy": "eyJleHBpcmF*******MTgzNDA3WiJ9XX0=",
    "X-Amz-Signature": "128c4************0c8d9",
  }
}

Now, this signature must be used by a Python script to upload any number of files to client/user_1/exec_1/. I have been trying to implement this using requests, but with no luck.

with open('index.js', 'rb') as data:
  r = requests.put(s.get('url'), data=data, headers=s.get('fields'))
  print(r)

This Python script is executed on our clients' servers, so giving them credentials to our S3 bucket is not possible.

Folders in our S3 bucket would look like

client/user_1/exec_1
client/user_1/exec_2
client/user_2/exec_1
client/user_2/exec_2
client/user_2/exec_3
client/user_2/exec_4

Once the files are uploaded, we process them on our servers.

Note: Frontend is not involved in the whole process.

Can anyone suggest a function to upload file(s) to S3?


Function used to generate the signature:

function singleFilePublicUpload(key) {
  console.log("singleFilePublicUpload | Signature Request", key);
  const Bucket = 'package-bucket';
  const acl = "public-read";
  const params = { 
    Bucket, Fields: { acl }, Conditions: [['starts-with', '$key', key]]
  };
  const presignedPost = s3.createPresignedPost(params);
  presignedPost.fields.key = key;
  return presignedPost;
}

FAQ

Q: Where is the Python script being run?

A: The Python upload script runs on our clients' servers.

Q: Is it being run by you, or a user or customer of yours?

A: The Python upload script is run by our clients.

Q: Or is it the backend of a website that your users will access?

A: No. No website is involved in the process.

Q: How do you want them to authenticate to prove that they are permitted to upload to S3?

A: We want to give them access to an S3 folder using a wildcard signature.

Q: You say that you do not wish to use boto because it needs AWS permissions, so how do you intend to authorize the upload to S3?

A: We want our clients to submit a POST request directly to S3 using the provided signature. The v4 signature should authorize the direct upload to S3.

  • Is there a particular reason that you are not using the boto3 library? (Aside from potentially not knowing it existed?) Commented Jan 15, 2020 at 20:27
  • @JohnRotenstein: as boto needs AWS permissions, for security reasons we are trying not to use boto. Commented Jan 16, 2020 at 5:57
  • Could you please edit your question to tell us more about your use-case? For example, where is the Python script being run? Is it being run by you, or a user or customer of yours? Or is it the backend of a website that your users will access? How do you want them to authenticate to prove that they are permitted to upload to S3? You say that you do not wish to use boto because it needs AWS permissions, so how do you intend to authorize the upload to S3? (Please add this information to your question rather than answering via a comment.) Commented Jan 17, 2020 at 0:00
  • @JohnRotenstein I have added more details. Commented Jan 17, 2020 at 10:26
  • 1
    @Exelian Multiple reasons, 1. IAM comes under infra managed by terraform, can do it for 1000+ users. each user has multiple folders in the same bucket one for each process. We dont want them to mess it up for themselves. Commented Jan 17, 2020 at 10:49

2 Answers


A recommended architecture for this is:

  • Do not use IAM credentials for your users. They are only for your own internal use.
  • Have users authenticate to your web application. The application verifies their identity (using Cognito, or however you wish).
  • The application then presents a page where they can upload a file using a pre-signed URL. This is a time-limited URL that permits an upload within given constraints (e.g. where to store it, what name to use).

This is a secure means of only authorizing authenticated users to upload.

See: Uploading Objects Using Presigned URLs - Amazon Simple Storage Service

Variation for Python

If you specifically wish them to upload via Python, you could:

  • Have the Python app (running on client site) authenticate with your app
  • The app can then use the AWS Security Token Service (STS) to create temporary credentials with permission to upload to Amazon S3
  • The Python app then uses those credentials with boto3 to upload the file

This method also ensures that only authenticated users can upload to the bucket.
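The steps above can be sketched as follows. The scoped-policy builder is the concrete part; the get_federation_token call (shown commented) requires your server's real AWS credentials, and all bucket, prefix, and name values are illustrative:

```python
import json

def build_scoped_policy(bucket, prefix):
    """IAM policy allowing PutObject only under the caller's own prefix."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": [f"arn:aws:s3:::{bucket}/{prefix}*"],
        }],
    }

policy = build_scoped_policy("package-bucket", "client/user_1/exec_1/")

# Server side, after authenticating the client:
# import boto3
# sts = boto3.client("sts")
# creds = sts.get_federation_token(
#     Name="user_1-exec_1",
#     Policy=json.dumps(policy),
#     DurationSeconds=3600,
# )["Credentials"]
# Hand AccessKeyId / SecretAccessKey / SessionToken to the client, which
# then uses them with boto3:
#   s3 = boto3.client("s3",
#       aws_access_key_id=creds["AccessKeyId"],
#       aws_secret_access_key=creds["SecretAccessKey"],
#       aws_session_token=creds["SessionToken"])
#   s3.upload_file("index.js", "package-bucket", "client/user_1/exec_1/index.js")
```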

Bottom line: You can't provide a signature because that varies by the file being uploaded. However, you can provide temporary credentials or a pre-signed URL to permit the upload.


1 Comment

This is the perfect answer; AWS Security Token Service is the answer to our problem. Thanks a lot.

You can use the boto3 Python library:

import boto3

client = boto3.client('s3')
with open('your_file', 'rb') as data:
    response = client.put_object(Bucket='your_bucket', Key='your_key', Body=data)

1 Comment

The 1st statement was: "I have a use-case to upload files using an S3 v4 signature." I don't need a boto-based solution.
