I'm trying to dump approximately 2.2 million objects into MongoDB (using Mongoose). The problem is that when I save all the objects one by one, it gets stuck. I've kept a sample of the code below. If I run this code for 50,000 objects it works great, but if I increase the data size to approximately 500,000 it gets stuck. I want to know what is wrong with this approach, and I'd like to find a better way to do it. I'm quite new to Node.js. I've tried loops and everything else with no luck; finally I found this kind of solution. It works fine for 50k objects but gets stuck for 2.2 million, and after some time I get this:
FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory Aborted (core dumped)
var connection = mongoose.createConnection("mongodb://localhost/entity");

var entitySchema = new mongoose.Schema({
    name: String
  , date: Date
  , close: Number
  , volume: Number
  , adjClose: Number
});

var Entity = connection.model('entity', entitySchema);

var mongoobjs = ["2.2 million objects here, populated in code"]; // works completely fine till here
async.map(mongoobjs, function (object, next) {
  var obj = new Entity({ // `var` added: without it, `Obj` leaks into the global scope
      name: object.name
    , date: object.date
    , close: object.close
    , volume: object.volume
    , adjClose: object.adjClose
  });
  obj.save(next);
}, function () { console.log("Saved"); });
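For context, `async.map` starts a callback for every element before any of them finishes, so all 2.2 million `Entity` documents exist in memory at once. A minimal sketch of one way around that is to split the array into fixed-size batches first and insert a batch at a time (the batch size of 10,000, the use of `Entity.collection.insert`, and the `async.eachSeries` flow are my assumptions, not from the original code):

```javascript
// Split a large array into fixed-size batches so only one batch of
// documents needs to be materialized and sent at a time.
function chunk(arr, size) {
  var batches = [];
  for (var i = 0; i < arr.length; i += size) {
    batches.push(arr.slice(i, i + size));
  }
  return batches;
}

// Hypothetical usage with the question's `Entity` model and `async`:
// async.eachSeries(chunk(mongoobjs, 10000), function (batch, next) {
//   Entity.collection.insert(batch, next); // one round trip per batch
// }, function (err) {
//   console.log(err ? err : "Saved");
// });

// Small demonstration of the batching itself:
console.log(chunk([1, 2, 3, 4, 5], 2)); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```

`eachSeries` keeps only one in-flight batch, so memory stays bounded regardless of the total count; a bounded-parallel variant like `eachLimit` would be a middle ground between this and the original all-at-once `map`.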