I'm trying to insert 100,000 records into an Android SQLite database in one go. I'm using the following two methods.
private void bulkInsertDataBySavePoint(final List<User> users) {
    log.debug("bulkInsertDataBySavePoint()");
    DatabaseConnection conn = null;
    Savepoint savepoint = null;
    try {
        conn = userDao.startThreadConnection();
        savepoint = conn.setSavePoint("bulk_insert");
        for (User user : users) {
            userDao.create(user);
        }
    } catch (SQLException e) {
        log.error("Something went wrong in bulk Insert", e);
    } finally {
        if (conn != null) {
            try {
                conn.commit(savepoint);
                userDao.endThreadConnection(conn);
            } catch (SQLException e) {
                log.error("Something went wrong in bulk Insert", e);
            }
        }
    }
}
And
private void bulkInsertDataByCallBatchTasks(final List<User> users) {
    log.debug("bulkInsertDataByCallBatchTasks()");
    try {
        userDao.callBatchTasks(new Callable<Void>() {
            @Override
            public Void call() throws Exception {
                for (User user : users) {
                    userDao.create(user);
                }
                return null;
            }
        });
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Both methods work, but on average each takes around 140 seconds and uses 60-65% CPU, which doesn't seem acceptable to me.
The idea is that I have to consume an API that returns JSON data, parse it, and then insert it into the SQLite database for offline usage.
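For context, this is a minimal sketch of the parse-then-insert step I have in mind, assuming Gson for the JSON parsing; the parseAndStoreUsers name and the assumption that the payload is a flat JSON array of users are just placeholders, and the actual insert reuses bulkInsertDataByCallBatchTasks from above:

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.List;

private void parseAndStoreUsers(String json) {
    // Parse the JSON array returned by the API into User entities.
    Type listType = new TypeToken<List<User>>() {}.getType();
    List<User> users = new Gson().fromJson(json, listType);
    // Hand the whole list to the batch method so every create() runs in one batch.
    bulkInsertDataByCallBatchTasks(users);
}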
I'm looking for an efficient way to solve this issue.
Any thoughts?