I am trying to create a Lambda function that automatically cleans CSV files from an S3 bucket. The bucket receives files every 5 minutes, so I have created an S3 trigger for the Lambda function. To clean the CSV files I use the pandas library to create a dataframe, and I have already installed a pandas layer. When creating the dataframe, I get an error message. This is my code:
import json
import boto3
import pandas as pd
from io import StringIO
# create an S3 client
client = boto3.client('s3')

def lambda_handler(event, context):
    # get bucket_name and object_name from the event record
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    object_name = event['Records'][0]['s3']['object']['key']
    # create a df from the object
    df = pd.read_csv('object_name')
This is the error message:
[ERROR] FileNotFoundError: [Errno 2] No such file or directory: 'object_name'
On CloudWatch it additionally says:
OpenBLAS WARNING - could not determine the L2 cache size on this system, assuming 256k
Has anyone experienced the same issues? Thanks in advance for all your help!
read_csv('object_name') - I hope you noticed that 'object_name' is a string here and not the actual variable declared two lines above.
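Even with the quotes removed, pd.read_csv(object_name) would still fail: the key is a path inside the S3 bucket, not a file on the Lambda's local filesystem, so pandas cannot open it directly. Fetch the object with get_object first and build the dataframe from its body. A minimal sketch of the handler, assuming the CSV is UTF-8 encoded:

import json
import boto3
import pandas as pd
from io import StringIO
from urllib.parse import unquote_plus

client = boto3.client('s3')

def lambda_handler(event, context):
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    # keys in S3 event notifications are URL-encoded, so decode them first
    object_name = unquote_plus(event['Records'][0]['s3']['object']['key'])
    # fetch the object from S3 instead of treating the key as a local path
    response = client.get_object(Bucket=bucket_name, Key=object_name)
    body = response['Body'].read().decode('utf-8')
    # build the dataframe from the in-memory CSV text
    df = pd.read_csv(StringIO(body))
    return {'statusCode': 200, 'body': json.dumps(f'Loaded {len(df)} rows')}

Alternatively, if your layer also includes s3fs, pandas can read the object directly via pd.read_csv(f's3://{bucket_name}/{object_name}'). The OpenBLAS warning in CloudWatch is harmless; it comes from the NumPy build bundled with the pandas layer and is unrelated to the FileNotFoundError.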