
I am reading a CSV file and want to write multiple lines at once to DynamoDB. Is there a way to do this? I found an example in the AWS documentation, but it doesn't serve the purpose, as it constructs each item manually.

private static void testBatchSave(DynamoDBMapper mapper) {

    Book book1 = new Book();
    book1.id = 901;
    book1.inPublication = true;
    book1.ISBN = "902-11-11-1111";
    book1.pageCount = 100;
    book1.price = 10;
    book1.productCategory = "Book";
    book1.title = "My book created in batch write";

    Book book2 = new Book();
    book2.id = 902;
    book2.inPublication = true;
    book2.ISBN = "902-11-12-1111";
    book2.pageCount = 200;
    book2.price = 20;
    book2.productCategory = "Book";
    book2.title = "My second book created in batch write";

    Book book3 = new Book();
    book3.id = 903;
    book3.inPublication = false;
    book3.ISBN = "902-11-13-1111";
    book3.pageCount = 300;
    book3.price = 25;
    book3.productCategory = "Book";
    book3.title = "My third book created in batch write";

    System.out.println("Adding three books to ProductCatalog table.");
    mapper.batchSave(Arrays.asList(book1, book2, book3));
}

1 Answer


Yes, you need to use batch save and construct the objects to be saved programmatically. There is no tool like mongoimport to import the file directly.

However, you can use the AWS Data Pipeline service to import the data into DynamoDB tables.
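For the programmatic route, here is a minimal sketch of the CSV case. It assumes a simple `id,ISBN,title` column layout with no quoted commas (use a real CSV library otherwise); the class and helper names are hypothetical, and the `batchSave` call is commented out because it needs a configured `DynamoDBMapper` like the one in the question:

```java
import java.util.ArrayList;
import java.util.List;

public class CsvBatchLoader {

    // Minimal stand-in for the mapped Book class from the question.
    static class Book {
        int id;
        String ISBN;
        String title;
    }

    // Parse one CSV row of the form "id,ISBN,title" into a Book.
    // Assumes fields contain no embedded commas.
    static Book parseRow(String row) {
        String[] cols = row.split(",", 3);
        Book b = new Book();
        b.id = Integer.parseInt(cols[0].trim());
        b.ISBN = cols[1].trim();
        b.title = cols[2].trim();
        return b;
    }

    // Build one Book per CSV row.
    static List<Book> parseAll(List<String> rows) {
        List<Book> books = new ArrayList<>();
        for (String row : rows) {
            books.add(parseRow(row));
        }
        return books;
    }

    public static void main(String[] args) {
        List<String> rows = List.of(
                "901,902-11-11-1111,My book created in batch write",
                "902,902-11-12-1111,My second book created in batch write");
        List<Book> books = parseAll(rows);
        System.out.println(books.size() + " books parsed; first id=" + books.get(0).id);
        // With a configured mapper this is a single call:
        // mapper.batchSave(books);
    }
}
```

Note that `DynamoDBMapper.batchSave` splits the list into `BatchWriteItem` requests of at most 25 items per call, so you do not need to chunk the list yourself.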


1 Comment

I want to do this in a Lambda function; is there a way to link the pipeline to a Lambda function?
