
I'm trying to write a lambda function that is triggered whenever a json file is uploaded to an s3 bucket. The function is supposed to parse the file and store it immediately in DynamoDB. I created a table called 'data' with the primary key set as 'date'. Here's what I have for the function so far:

import boto3
import json
s3_client = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    json_file_name = event['Records'][0]['s3']['object']['key']
    json_object = s3_client.get_object(Bucket=bucket, Key=json_file_name)
    jsonFileReader = json_object['Body'].read()
    jsonDict = json.loads(jsonFileReader)
    table = dynamodb.Table('data')
    table.put_item(Item=jsonDict)

Here is an example of a json file I'm trying to use:

{
 "date": "2020-06-07 21:00:34.284421",
 "ConfirmedCases": 7062067,
 "ActiveCases": 3206573,
 "RecoveredCases": 3450965,
 "Deaths": 404529
}

Unfortunately, whenever I test the code, it throws this error:

[ERROR] TypeError: string indices must be integers
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 7, in lambda_handler
    bucket = event['Records'][0]['s3']['bucket']['name']

Does anyone know how to resolve this issue? I've spent so much time trying to figure this out and still can't work out what's wrong :/


1 Answer


Your error comes from one of these two lines:

bucket = event['Records'][0]['s3']['bucket']['name']
json_file_name = event['Records'][0]['s3']['object']['key']

However, they are correct. This is the valid way of accessing the bucket name and object key from the event generated by S3 Notifications.
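For reference, a valid S3 notification event wraps the bucket and key inside a Records list. Below is a minimal sketch of such an event, stripped down to just the fields your handler reads (the bucket name and key are placeholders), which you could also pass to the handler locally as a quick check:

test_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-bucket"},        # placeholder bucket name
                "object": {"key": "covid-stats.json"}   # placeholder object key
            }
        }
    ]
}
# lambda_handler(test_event, None)  # rough local smoke test

If the event your function actually receives does not follow this structure, the lookups above will fail exactly as in your traceback.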

It seems to me that something else is triggering your function. Either you are using the Test option in the console and providing an incorrect event object, or some other trigger is invoking the Lambda with a non-S3 event.

As a quick fix, you can do the following. The code below will return early if the event object does not contain Records:

def lambda_handler(event, context): 

    if 'Records' not in event:
        print(event) # good to check what event you get  
        return

    # and the rest of code  
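Putting the guard together with your original logic, here is a minimal sketch of the whole handler. It assumes the 'data' table with 'date' as the partition key from your question and fetches the object through the boto3 client's get_object call; error handling is left out:

import json
import boto3

s3_client = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('data')  # table with 'date' as its primary key

def lambda_handler(event, context):
    # Bail out early if the function was not invoked by an S3 notification
    if 'Records' not in event:
        print(event)  # inspect what actually triggered the function
        return

    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']

    # Download the uploaded JSON file and parse it
    response = s3_client.get_object(Bucket=bucket, Key=key)
    item = json.loads(response['Body'].read())

    # Write the parsed document to DynamoDB; 'date' must be present in the item
    table.put_item(Item=item)

One caveat: DynamoDB rejects Python floats, so if your files ever contain decimal numbers you would need to parse them with json.loads(..., parse_float=decimal.Decimal). The sample file above only contains integers, so that is not an issue here.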


3 Comments

@AmeerNaqvi No, because you are getting the name correctly in json_file_name. The error (KeyError) is about the missing Records field in the event object. For some reason, your Lambda is invoked with an event that has no Records field. The probable cause is that it is being invoked by something other than S3.
@AmeerNaqvi The code you provided (lines 7 and 8) is correct. Can you update the question with the event structure that is causing the issue? Where is this event coming from, and what does it look like?
@AmeerNaqvi This is an incorrect event; it was not generated by S3 Notifications, which is why it fails.
