
I have a Node 4.3 Lambda function in AWS. I want to be able to write a text file to S3 and have read many tutorials about how to integrate with S3. However, all of them cover how to invoke Lambda functions after something is written to S3.

How can I create a text file in S3 from Lambda using Node? Is this possible? Amazon's documentation doesn't seem to cover it.

5 Answers


Yes, it is absolutely possible!

var AWS = require('aws-sdk');

function putObjectToS3(bucket, key, data) {
    var s3 = new AWS.S3();
    var params = {
        Bucket: bucket,
        Key: key,
        Body: data
    };
    s3.putObject(params, function(err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else     console.log(data);           // successful response
    });
}

Make sure that you give your Lambda function the required write permissions to the target S3 bucket / key path by selecting or updating the IAM role your Lambda function executes under.

IAM Statement to add:

{
    "Sid": "Stmt1468366974000",
    "Effect": "Allow",
    "Action": "s3:*",
    "Resource": [
        "arn:aws:s3:::my-bucket-name-goes-here/optional-path-before-allow/*"
    ]
}
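The statement above grants every S3 action via `s3:*`. If the function only needs to write objects, a least-privilege sketch of the same statement (same bucket placeholder as above) narrows the action to `s3:PutObject`:

```json
{
    "Sid": "Stmt1468366974000",
    "Effect": "Allow",
    "Action": "s3:PutObject",
    "Resource": [
        "arn:aws:s3:::my-bucket-name-goes-here/optional-path-before-allow/*"
    ]
}
```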



3 Comments

If your Lambda function is executed inside a VPC, you will have to create a VPC endpoint for S3. Before finding this out, the callback of s3.putObject was never called. See this article about the S3 VPC endpoint: aws.amazon.com/blogs/aws/new-vpc-endpoint-for-amazon-s3 and this one about accessing resources from Lambda: aws.amazon.com/blogs/aws/…
Could you specify how you would call this function? I'm guessing bucket comes from process.env.BUCKET_NAME, but what exactly is the key path, and data? What if you're writing the file in the Lambda first, and then putting it in a bucket?
As a late response to bildungsroman, key is anything that goes after your bucket name. For example, for https://mybucket.s3.amazonaws.com/path/to/image.jpg, the key is path/to/image.jpg.
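The key semantics described in the last comment can be sketched in plain JavaScript. This is a hypothetical helper (not part of the AWS SDK) that uses the WHATWG `URL` class available as a global in modern Node.js:

```javascript
// Extract the S3 object key from a virtual-hosted-style URL such as
// https://mybucket.s3.amazonaws.com/path/to/image.jpg.
// The key is everything after the hostname, without the leading "/".
function keyFromS3Url(s3Url) {
    var url = new URL(s3Url);
    // pathname is "/path/to/image.jpg"; strip the leading slash.
    return url.pathname.replace(/^\//, '');
}

console.log(keyFromS3Url('https://mybucket.s3.amazonaws.com/path/to/image.jpg'));
// path/to/image.jpg
```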

After a long, long time of silent failures with 'Task timed out after X' and no useful error message, I went back to the beginning, to Amazon's default template example, and that worked!

> Lambda > Functions > Create function > Use a blueprint > filter: s3.

Here is my tweaked version of Amazon's example:

const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

async function uploadFileOnS3(fileData, fileName) {
    const params = {
        Bucket: "The-bucket-name-you-want-to-save-the-file-to",
        Key: fileName,
        Body: JSON.stringify(fileData),
    };

    try {
        const response = await s3.upload(params).promise();
        console.log('Response: ', response);
        return response;
    } catch (err) {
        console.log(err);
        throw err; // rethrow so the caller (and Lambda) can see the failure
    }
}



IAM statement for the Serverless Framework (serverless.com): write access to a specific S3 bucket

service: YOURSERVICENAME

provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
  region: eu-west-1
  timeout: 60
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - s3:PutObject
      Resource: "**BUCKETARN**/*"
    - Effect: "Deny"
      Action:
        - s3:DeleteObject
      Resource: "**BUCKETARN**/*"



You can upload a file to S3 using the aws-sdk package.

If you are using an IAM user, you have to provide an access key and secret key, and make sure you have granted the necessary permissions to that IAM user.

var AWS = require('aws-sdk');

AWS.config.update({ accessKeyId: "ACCESS_KEY", secretAccessKey: "SECRET_KEY" });
var s3bucket = new AWS.S3({ params: { Bucket: 'BUCKET_NAME' } });

function uploadFileOnS3(fileName, fileData) {
    var params = {
        Key: fileName,
        Body: fileData
    };
    s3bucket.upload(params, function (err, res) {
        if (err)
            console.log("Error in uploading file on s3 due to " + err);
        else
            console.log("File successfully uploaded.");
    });
}

Here I temporarily hard-coded the AWS access and secret keys for testing purposes. For best practices, refer to the documentation.
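As a sketch of the safer pattern (assuming the standard `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY` environment variable names), credentials can be read from the environment instead of being hard-coded. `readCredentialsFromEnv` is a hypothetical helper; inside Lambda you should rely on the execution role and pass no keys at all:

```javascript
// Hypothetical helper: build a credentials object from environment
// variables instead of hard-coding keys in source. The aws-sdk also
// picks these variables up automatically, so in practice you can
// often skip passing credentials entirely.
function readCredentialsFromEnv(env) {
    var accessKeyId = env.AWS_ACCESS_KEY_ID;
    var secretAccessKey = env.AWS_SECRET_ACCESS_KEY;
    if (!accessKeyId || !secretAccessKey) {
        throw new Error('AWS credentials are not set in the environment');
    }
    return { accessKeyId: accessKeyId, secretAccessKey: secretAccessKey };
}

// Usage with aws-sdk:
// AWS.config.update(readCredentialsFromEnv(process.env));
```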

2 Comments

Nothing in the question specifically requests that access keys and secrets be used for authentication. It is extremely bad practice to authenticate in this way, and as such I don't feel recommending it as a solution is wise. Lambda functions should be provisioned with an IAM role that is sufficient for the access the Lambda function requires to perform its function.
I know I put the secret key and access key hard-coded, but this method is only for small personal scripts or for testing purposes.

One more option (export the file as multipart form data): React > Node.js (AWS Lambda) > S3 Bucket https://medium.com/@mike_just_mike/aws-lambda-node-js-export-file-to-s3-4b35c400f484

