
This is the input JSON file. The JavaScript code below iterates over it using the MongoDB shell.

{
  "Includes": {
    "Employees": {
      "14": {
        "name": "john",
        "age": 12,
        "activity": {
          "Count": 3502,
          "RatingValue": 5
        }
      },
      "17": {
        "name": "smith",
        "age": 23,
        "activity": {
          "Count": 232,
          "RatingValue": 5
        }
      }
    }
  }
}

JavaScript code:

var result = [];

db.details.find().forEach(function(doc) {
    // Guard against documents that lack the Includes subdocument.
    var Employees = doc.Includes && doc.Includes.Employees;
    if (Employees) {
        for (var key in Employees) {
            var Employee = Employees[key];
            var item = [];
            item.push(key);
            item.push(Employee.name);
            item.push(Employee.age);
            item.push(Employee.activity.Count);
            item.push(Employee.activity.RatingValue);
            result.push(item.join(","));
        }
    }
});

print(result);

I want the output written to a CSV file as 3 rows of 5 columns (a header row plus one row per employee), in this pattern:

Id name age count RatingValue

14 john 12 3502 5

17 smith 23 232 5

1 Answer

Change that final print(result); to the following:

print("Id,name,age,count,RatingValue");
print(result.join("\n"));

Note: The first line is just for the column headers; the second line prints each employee result on a separate line.
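Outside the mongo shell, the same iteration and formatting can be sketched in plain JavaScript against the sample document, with no database needed and `print` replaced by `console.log` (a minimal sketch, assuming the single document shown in the question):

```javascript
// Sample document, as it would be stored in the `details` collection.
var doc = {
  Includes: {
    Employees: {
      "14": { name: "john", age: 12, activity: { Count: 3502, RatingValue: 5 } },
      "17": { name: "smith", age: 23, activity: { Count: 232, RatingValue: 5 } }
    }
  }
};

var result = [];
// Guard against documents missing the Includes subdocument.
var Employees = doc.Includes && doc.Includes.Employees;
if (Employees) {
  for (var key in Employees) {
    var e = Employees[key];
    // Collect the fields in column order, then join them into one CSV row.
    result.push([key, e.name, e.age, e.activity.Count, e.activity.RatingValue].join(","));
  }
}

console.log("Id,name,age,count,RatingValue");
console.log(result.join("\n"));
```

Running this with Node prints exactly the CSV content shown below.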

Then call your script and direct the output to a CSV file like so:

mongo --quiet "full-path-to-script.js" > "full-path-to-output.csv"

Note: The --quiet arg suppresses the standard Mongo header output (shell version and initial database).

I created a details collection, added your JSON document to it, and running the modified script produced the following CSV file content:

Id,name,age,count,RatingValue
14,john,12,3502,5
17,smith,23,232,5
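One caveat: `item.join(",")` assumes no field value contains a comma, quote, or newline. If field values can contain those characters, a small quoting helper along the lines of RFC 4180 keeps the CSV valid (a hypothetical addition, not part of the script above):

```javascript
// Quote a field per RFC 4180 when it contains a comma, quote, or newline;
// embedded quotes are doubled.
function csvField(value) {
  var s = String(value);
  if (/[",\n]/.test(s)) {
    return '"' + s.replace(/"/g, '""') + '"';
  }
  return s;
}

// Example: build one CSV row from values that include a comma and quotes.
var row = [14, 'john "jr", smith', 3502].map(csvField).join(",");
```

Each pushed field would then go through `csvField` before joining.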