I want to upload my local CSV file to my AWS S3 bucket.
I tried:
s3 = boto3.resource('s3')
s3.meta.client.upload_file('verified_path_of_my_CSV', 'bucket_name', 'My_csv')
but I get a 0-byte CSV in my bucket.
I also tried:
s3 = boto3.resource('s3')
bucket = s3.Bucket('bucket_name')
s3.Object(bucket, 'My_csv').put(Body=open(csv_path, 'rb'))
but I get this error:
Traceback (most recent call last):
  File "test-text.py", line 109, in <module>
    main()
  File "test-text.py", line 88, in main
    upload_csv(bucket, 'name_of_my_csv', 'My_path')
  File "test-text.py", line 56, in upload_csv
    s3.Object(bucket, 'My_csv').put(Body=open(csv_path, 'rb'))
  ...
  File "/home/toto/.local/lib/python3.7/site-packages/botocore/handlers.py", line 219, in validate_bucket_name
    if VALID_BUCKET.search(bucket) is None:
TypeError: expected string or bytes-like object
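For what it's worth, that TypeError means s3.Object() received the Bucket resource itself where botocore expects the bucket *name* as a string. A minimal sketch of the distinction (FakeBucket is a hypothetical stand-in for s3.Bucket('bucket_name'), so this runs without AWS credentials):

```python
# boto3's bucket-name validation calls a regex on its argument, so passing
# a Bucket resource instead of a string raises the TypeError above.
class FakeBucket:
    name = 'bucket_name'  # boto3 Bucket resources expose .name as a string

def bucket_name_of(bucket):
    """Accept either a bucket-name string or a Bucket-like object."""
    return bucket if isinstance(bucket, str) else bucket.name

print(bucket_name_of('bucket_name'))  # already a string
print(bucket_name_of(FakeBucket()))   # extract .name from the resource
```

With a real Bucket object, writing s3.Object(bucket.name, 'My_csv') (or passing the bucket name string directly) avoids the error.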
How can I upload my CSV to S3?
Thanks in advance.
Solved:
When I created my CSV, I wrote this:
with open('job_file.csv', 'w') as csvfile:
    filewriter = csv.writer(csvfile, delimiter=',')
    text = process_text_analysis(bucket, file, True)
    filewriter.writerow(["doc1", text])
    upload_csv(bucket, key_s3, 'my_local_csv_path')
But I executed upload_csv() before the CSV was finished: inside the with block the data is still in the write buffer, so the uploaded file is 0 bytes. The correct way is to move the upload call out of the block:
with open('job_file.csv', 'w') as csvfile:
    filewriter = csv.writer(csvfile, delimiter=',')
    text = process_text_analysis(bucket, file, True)
    filewriter.writerow(["doc1", text])

upload_csv(bucket, key_s3, 'my_local_csv_path')
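The 0-byte upload can be reproduced without S3 at all: inside the with block a small row may still sit in Python's write buffer, so the file on disk stays empty until the block exits and the file is flushed and closed. A small local sketch (the file name and row contents are just examples):

```python
import csv
import os

# Write one small row; it stays in Python's internal write buffer because
# it is far smaller than the default buffer size.
with open('job_file.csv', 'w', newline='') as csvfile:
    filewriter = csv.writer(csvfile, delimiter=',')
    filewriter.writerow(["doc1", "some text"])
    # Uploading here would send the on-disk file, which is still empty.
    size_inside = os.path.getsize('job_file.csv')

# After the with block the file has been flushed and closed.
size_after = os.path.getsize('job_file.csv')
print(size_inside, size_after)  # size_inside is 0, size_after is not
```

Calling csvfile.flush() before an upload inside the block would also work, but moving the upload out of the block is cleaner.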
Stupid mistake, I know :), but it happens...
Hope this edit can help avoid this type of question in the future ;)
Thanks to all of you for trying to solve my issue.