
This question concerns an AWS Lambda function that is invoked from another AWS Lambda function. In the invoked function I want to read a file from S3 and write it back to S3, but somehow the write is not working. The details are below:

I used a primary AWS Lambda function to invoke another function with this code:

import json
import boto3

def lambda_handler(event, context):
    invokerLam = boto3.client("lambda") # Defining an object for the lambda service
    BlastFoldername = '1.047. 2020 S3C 5009 P1'
    inputsForInvocation = {'BlastFoldername' : BlastFoldername }


    response = invokerLam.invoke(
        FunctionName='arn:aws:lambda:us-east-1:131394402205:function:DestinationLambda',
        InvocationType='Event',
        Payload=json.dumps(inputsForInvocation)
    )

The invoked AWS Lambda function then reads the payload, reads a file from a bucket, and writes the same file back to the bucket, with the following code:

import json
import boto3
import io
import pandas as pd

def lambda_handler(event, context):

    # Reading the payload
    BlastFolder = event['BlastFoldername']
    NewBlastFolder = BlastFolder + "_transformed"
    FileName = "Collar Coordiantes 2820 S3C 5007 P1"
    LocationKey1 = NewBlastFolder + "/" + FileName



    # Read the Excel file from the S3 bucket
    s3 = boto3.client('s3')
    bucket_name = 'shivay.aws108'
    file_name = 'Transformed/Blast/Collar Coordinates.xls'
    file_cor = s3.get_object(Bucket = bucket_name, Key = file_name)
    file_content = file_cor['Body'].read()
    read_excel_data = io.BytesIO(file_content)
    df = pd.read_excel(read_excel_data)
    print(df)



    # Writing the data frame back to the same S3 bucket as CSV
    csv_buffer = io.StringIO()
    df.to_csv(csv_buffer)
    s3_resource = boto3.resource('s3')
    HoleFile_save1 = "/Transformed/WritingBackCheck.csv"
    s3_path = HoleFile_save1
    s3_resource.Object(bucket_name, s3_path).put(Body = csv_buffer.getvalue())
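
One unrelated quirk worth noting: the leading slash in `HoleFile_save1` becomes part of the S3 key, so the object lands under an empty-named "folder" in the console. A small sketch of normalizing the key (pure string handling, no AWS calls):

```python
hole_file_save = "/Transformed/WritingBackCheck.csv"

# S3 keys should not start with "/" -- strip any leading slashes
# before putting the object, so it appears under Transformed/ as expected
s3_key = hole_file_save.lstrip("/")
print(s3_key)  # Transformed/WritingBackCheck.csv
```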

The third step, writing the file to S3, is not working; reading the file and creating the data frame works. The same code works in other AWS Lambda functions where the Lambda is triggered by an S3 object-creation event, but when the function is invoked from another Lambda, it does not write the file to S3.

For reference, the role for the first function is:


{
  "permissionsBoundary": {},
  "roleName": "trackore-lambda",
  "policies": [
    {
      "document": {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
          }
        ]
      },
      "name": "AmazonS3FullAccess",
      "id": "ANPAIFIR6V6BVTRAHWINE",
      "type": "managed",
      "arn": "arn:aws:iam::aws:policy/AmazonS3FullAccess"
    },
    {
      "document": {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Action": [
              "autoscaling:Describe*",
              "cloudwatch:*",
              "logs:*",
              "sns:*",
              "iam:GetPolicy",
              "iam:GetPolicyVersion",
              "iam:GetRole"
            ],
            "Effect": "Allow",
            "Resource": "*"
          },
          {
            "Effect": "Allow",
            "Action": "iam:CreateServiceLinkedRole",
            "Resource": "arn:aws:iam::*:role/aws-service-role/events.amazonaws.com/AWSServiceRoleForCloudWatchEvents*",
            "Condition": {
              "StringLike": {
                "iam:AWSServiceName": "events.amazonaws.com"
              }
            }
          }
        ]
      },
      "name": "CloudWatchFullAccess",
      "id": "ANPAIKEABORKUXN6DEAZU",
      "type": "managed",
      "arn": "arn:aws:iam::aws:policy/CloudWatchFullAccess"
    },
    {
      "document": {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Action": [
              "logs:CreateLogGroup",
              "logs:CreateLogStream",
              "logs:PutLogEvents"
            ],
            "Resource": "*"
          }
        ]
      },
      "name": "AWSLambdaBasicExecutionRole",
      "id": "ANPAJNCQGXC42545SKXIK",
      "type": "managed",
      "arn": "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
    },
    {
      "document": {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Action": [
              "lambda:InvokeFunction"
            ],
            "Resource": [
              "*"
            ]
          }
        ]
      },
      "name": "AWSLambdaRole",
      "id": "ANPAJX4DPCRGTC4NFDUXI",
      "type": "managed",
      "arn": "arn:aws:iam::aws:policy/service-role/AWSLambdaRole"
    }
  ],
  "trustedEntities": [
    "lambda.amazonaws.com"
  ]
}

The second function uses the same trackore-lambda role, with an identical policy set.

Please help.

Thanks

  • Any errors in CloudWatch logs? Commented May 6, 2020 at 4:36
  • @Marcin - no errors reported in the CloudWatch logs. It doesn't do anything when it reaches the write-back part; I tried printing something in the writing part to confirm it was working, and it doesn't print those lines either. Commented May 6, 2020 at 4:40
  • And the first function? Have you set up a dead-letter queue for the second function, since you invoke it asynchronously? Commented May 6, 2020 at 4:44
  • Sorry, what's a dead-letter queue? I just added permission in the role document as lambda:InvokeAsync. Commented May 6, 2020 at 4:47
  • The lambda:InvokeAsync permission is deprecated; please use lambda:InvokeFunction. Since you use async invocation (the Event type), a DLQ can be useful. Commented May 6, 2020 at 4:51

1 Answer


Based on the chat conversation, the issue turned out to be a timeout.

The second function had the default timeout of 3 seconds, which was too short. The solution was to increase the timeout.
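
For example, the timeout can be raised in the console (Configuration → General configuration) or, as a sketch, with the AWS CLI. The function name matches the ARN in the question, and 60 seconds is an arbitrary choice — pick a value that covers the Excel read plus the CSV write:

```shell
# Raise the invoked function's timeout from the 3-second default
aws lambda update-function-configuration \
    --function-name DestinationLambda \
    --timeout 60
```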


1 Comment

Use AWS SAM on your local machine for development; you can see this type of error in the SAM terminal.
