12

I am struggling to import data into MongoDB from a JSON file.
I can do this on the command line using the mongoimport command.
I have explored and tried a lot, but I am not able to import from a JSON file using Java.

sample.json

    { "test_id" : 1245362, "name" : "ganesh", "age" : "28", "Job" : 
       {"company name" : "company1", "designation" : "SSE" } 
    }

    { "test_id" : 254152, "name" : "Alex", "age" : "26", "Job" :
       {"company name" : "company2", "designation" : "ML" } 
    }

Thanks for your time. ~Ganesh~

2
  • Java? Do you need to insert the data with the Java driver? Commented Oct 29, 2014 at 6:46
  • Thanks stalk. Yes, I tried using the MongoDB Java driver, but the driver does not support importing data from a JSON file directly. Commented Oct 29, 2014 at 6:49

7 Answers

16

Suppose you can read the JSON strings one by one. For example, you read the first JSON document

{ "test_id" : 1245362, "name" : "ganesh", "age" : "28", "Job" : 
   {"company name" : "company1", "designation" : "SSE" } 
}

and assign it to a variable (String json1). The next step is to parse it:

DBObject dbo = (DBObject) com.mongodb.util.JSON.parse(json1);

Put each parsed DBObject into a list:

List<DBObject> list = new ArrayList<>();
list.add(dbo);

Then save them to the database:

new MongoClient().getDB("test").getCollection("collection").insert(list);
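
Putting that together for both documents from sample.json, a minimal sketch with the legacy API could look like this (the database and collection names are just examples, and the JSON is hard-coded only to keep the sketch self-contained):

import com.mongodb.DBObject;
import com.mongodb.MongoClient;

import java.util.ArrayList;
import java.util.List;

public class LegacyJsonImport {
    public static void main(String[] args) {
        // The two documents from sample.json, read into strings somehow (e.g. from the file).
        String json1 = "{ \"test_id\" : 1245362, \"name\" : \"ganesh\", \"age\" : \"28\", "
                + "\"Job\" : { \"company name\" : \"company1\", \"designation\" : \"SSE\" } }";
        String json2 = "{ \"test_id\" : 254152, \"name\" : \"Alex\", \"age\" : \"26\", "
                + "\"Job\" : { \"company name\" : \"company2\", \"designation\" : \"ML\" } }";

        List<DBObject> list = new ArrayList<>();
        list.add((DBObject) com.mongodb.util.JSON.parse(json1));
        list.add((DBObject) com.mongodb.util.JSON.parse(json2));

        MongoClient client = new MongoClient();
        client.getDB("test").getCollection("collection").insert(list);
        client.close();
    }
}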

EDIT:

In newer versions of the MongoDB Java driver you have to use Document instead of DBObject, and the methods for adding objects look different now. Here's an updated example:

Imports are:

import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

The code would look like this (referring to the text above the EDIT):

Document doc = Document.parse(json1);
new MongoClient().getDatabase("db").getCollection("collection").insertOne(doc);

You can also do it with a list as before, but then the list must hold Document objects and you need:

new MongoClient().getDatabase("db").getCollection("collection").insertMany(list);

But I think there is a problem with this solution. When you type:

db.collection.find()

in the mongo shell to get all objects in the collection, the result looks like the following:

{ "_id" : ObjectId("56a0d2ddbc7c512984be5d97"),
    "test_id" : 1245362, "name" : "ganesh", "age" : "28", "Job" :
        { "company name" : "company1", "designation" : "SSE" 
    }
}

which is not exactly the same as before, because MongoDB added a generated _id field to the document on insert.
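
To tie this back to the original question of importing straight from a file, here is a minimal sketch of my own using the newer Document API. It assumes sample.json has been reformatted so that each document sits on a single line; the database and collection names are placeholders:

import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class SampleJsonImport {
    public static void main(String[] args) throws IOException {
        MongoClient client = new MongoClient();
        try {
            MongoCollection<Document> collection =
                    client.getDatabase("db").getCollection("collection");

            // Assumption: every non-empty line of sample.json holds one complete JSON document.
            List<Document> docs = Files.readAllLines(Paths.get("sample.json"), StandardCharsets.UTF_8)
                    .stream()
                    .filter(line -> !line.trim().isEmpty())
                    .map(Document::parse)
                    .collect(Collectors.toList());

            collection.insertMany(docs);
        } finally {
            client.close();
        }
    }
}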


1

Had a similar "problem" myself and ended up using Jackson with POJO databinding, and Morphia.

While this sounds a bit like cracking a nut with a sledgehammer, it is actually very easy to use, robust, quite performant, and easy to maintain code-wise.

Small caveat: You need to map your test_id field to MongoDB's _id if you want to reuse it.

Step 1: Create an annotated bean

You need to give Jackson hints on how to map the data from the JSON file to a POJO. I shortened the class a bit for the sake of readability:

@JsonRootName(value="person")
@Entity
public class Person {

  @JsonProperty(value="test_id")
  @Id
  Integer id;

  String name;

  public Integer getId() {
    return id;
  }

  public void setId(Integer id) {
    this.id = id;
  }

  public String getName() {
    return name;
  }

  public void setName(String name) {
    this.name = name;
  }

}

As for the embedded document Job, please have a look at the POJO data binding examples linked.
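
Purely as an illustration (the original answer defers to the linked data binding examples, and the Morphia import path below assumes Morphia 1.x), the embedded Job bean could look roughly like this; Person would then also need a Job job field with a getter and setter:

import com.fasterxml.jackson.annotation.JsonProperty;
import org.mongodb.morphia.annotations.Embedded;

@Embedded
public class Job {

  // The sample JSON uses "company name" (with a space) as the key.
  @JsonProperty(value = "company name")
  String companyName;

  String designation;

  // Getters and setters omitted for brevity, analogous to Person.
}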

Step 2: Map the POJO and create a datastore

Somewhere during your application initialization, you need to map the annotated POJO. Since you should already have a MongoClient, I am going to reuse it ;)

Morphia morphia = new Morphia();
morphia.map(Person.class);

/* You can reuse this datastore */
Datastore datastore = morphia.createDatastore(mongoClient, "myDatabase");

/* 
 * Jackson's ObjectMapper, which is reusable, too,
 * does all the magic.
 */
ObjectMapper mapper = new ObjectMapper();

Step 3: Do the actual importing

Now importing a given JSON file becomes as easy as

public Boolean importJson(Datastore ds, ObjectMapper mapper, String filename) {

    try {           
        JsonParser parser = new JsonFactory().createParser(new FileReader(filename));
        Iterator<Person> it = mapper.readValues(parser, Person.class);

        while(it.hasNext()) {
            ds.save(it.next());
        }

        return Boolean.TRUE;

    } catch (JsonParseException e) {
        /* Json was invalid, deal with it here */
    } catch (JsonMappingException e) {
        /* Jackson was not able to map
         * the JSON values to the bean properties,
         * possibly because of
         * insufficient mapping information.
         */
    } catch (IOException e) {
        /* Most likely, the file was not readable.
         * This should rather be rethrown, but it is
         * caught here for the sake of showing what can happen.
         */
    }

    return Boolean.FALSE;
}

With a bit of refactoring, this can be converted into a generic importer for Jackson-annotated beans. Obviously, I left out some special cases, but that would be out of the scope of this answer.


1

With the 3.2 driver, if you have a Mongo collection and a collection of JSON documents, e.g.:

MongoCollection<Document> collection = ...
List<String> jsons = ...

You can insert individually:

jsons.stream().map(Document::parse).forEach(collection::insertOne);

or bulk:

collection.insertMany(
        jsons.stream().map(Document::parse).collect(Collectors.toList())
); 
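
If the JSON documents live in a file to begin with, one way to fill in the jsons placeholder is shown below (the file name is hypothetical, and one document per line is assumed; the comment below shows a Spring ClassPathResource variant of the same idea):

// Assumption: sample.json holds one JSON document per line.
List<String> jsons = Files.readAllLines(Paths.get("sample.json"), StandardCharsets.UTF_8);

Then either of the two insert snippets above can be used as-is.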

1 Comment

Nice, I did something similar: ClassPathResource classPathResource = new ClassPathResource("incidents.json"); List<String> jsons = new ArrayList<>(); Stream<String> stream = Files.lines(Paths.get(classPathResource.getFile().toURI()), StandardCharsets.UTF_8); stream.forEach(jsons::add); jsons.stream().map(Document::parse).forEach(i -> mongoTemplate.insert(i, "incidents"));
1

I just faced this issue today and solved it in a different way, since none of the answers here satisfied me, so enjoy my extra contribution. Performance is sufficient to export 30k documents and import them into my Spring Boot app for integration test cases (it takes a few seconds).

First, the way you export your data in the first place matters. I wanted a file where each line contains one document that I can parse in my Java app.

mongo db --eval 'db.data.find({}).limit(30000).forEach(function(f){print(tojson(f, "", true))})' --quiet > dataset.json

Then I get the file from my resources folder, read it line by line, parse each line, and insert the resulting documents with mongoTemplate. You could use a buffer here.

@Autowired
private MongoTemplate mongoTemplate;

public void createDataSet() {
    mongoTemplate.dropCollection("data");
    try (InputStream inputStream = Thread.currentThread().getContextClassLoader().getResourceAsStream(DATASET_JSON);
         BufferedReader br = new BufferedReader(new InputStreamReader(inputStream, Charset.forName("UTF-8")))) {

        // The export above wrote one JSON document per line, so each line can be parsed on its own.
        List<Document> documents = new ArrayList<>();
        String line;
        while ((line = br.readLine()) != null) {
            documents.add(Document.parse(line));
        }
        mongoTemplate.insert(documents, "data");

    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}


1
// Parse a JSON array string with json-lib and turn every element into a Document.
List<Document> jsonList = new ArrayList<Document>();
net.sf.json.JSONArray array = net.sf.json.JSONArray.fromObject(json);
for (Object object : array) {
    net.sf.json.JSONObject jsonStr = (net.sf.json.JSONObject) JSONSerializer.toJSON(object);
    Document jsnObject = Document.parse(jsonStr.toString());
    jsonList.add(jsnObject);
}
collection.insertMany(jsonList);
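
For context (this note is mine, not the original answerer's): the snippet assumes that json holds a JSON array string, for example both documents from sample.json wrapped in [ ... ], and that collection is a MongoCollection<Document> obtained from the driver, e.g.:

// Hypothetical setup for the snippet above, based on the sample data in the question.
String json = "[ { \"test_id\" : 1245362, \"name\" : \"ganesh\" }, { \"test_id\" : 254152, \"name\" : \"Alex\" } ]";
MongoCollection<Document> collection =
        new MongoClient().getDatabase("test").getCollection("collection");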

1 Comment

Please avoid code-only answers. Add some context, explanation, etc.
1

Runtime r = Runtime.getRuntime();

Process p = null;

// dir is the path to where your mongoimport binary lives.
File dir = new File("C:/Program Files/MongoDB/Server/3.2/bin");

// This line opens a shell in the given dir; the import command is exactly the same
// as when you run mongoimport at the command prompt.
p = r.exec("c:/windows/system32/cmd.exe /c mongoimport --db mydb --collection student --type csv --file student.csv --headerline", null, dir);
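
As a sketch of an alternative (mine, not part of the original answer): ProcessBuilder gives a bit more control over the working directory and process output than Runtime.exec; every path below is an assumption and needs to be adapted to your installation.

import java.io.File;
import java.io.IOException;

public class MongoImportRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Paths are assumptions: point them at your mongoimport binary and your data file.
        ProcessBuilder pb = new ProcessBuilder(
                "C:/Program Files/MongoDB/Server/3.2/bin/mongoimport.exe",
                "--db", "mydb", "--collection", "student",
                "--type", "csv", "--file", "student.csv", "--headerline");
        pb.directory(new File("C:/data"));   // working directory where student.csv is expected (assumption)
        pb.inheritIO();                      // forward mongoimport's console output to this process
        int exitCode = pb.start().waitFor(); // block until the import finishes
        System.out.println("mongoimport exited with code " + exitCode);
    }
}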


-1
public static void importCSV(String path) {

    try {
        List<Document> list = new ArrayList<>();
        MongoDatabase db = DbConnection.getDbConnection();
        db.createCollection("newCollection");
        MongoCollection<Document> collection = db.getCollection("newCollection");

        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] item = line.split(","); // the CSV file is comma-separated
                String id = item[0];             // read the column values of the current row
                String first_name = item[1];
                String last_name = item[2];
                String address = item[3];
                String gender = item[4];
                String dob = item[5];

                Document document = new Document(); // build one document per CSV row
                document.put("id", id);
                document.put("first_name", first_name);
                document.put("last_name", last_name);
                document.put("address", address);
                document.put("gender", gender);
                document.put("dob", dob);
                list.add(document);
            }
        }
        collection.insertMany(list);

    } catch (Exception e) {
        System.out.println(e);
    }
}
