
I'm trying to insert 100,000 records into an Android SQLite database in one go. I'm using the following two methods.

    private void bulkInsertDataBySavePoint(final List<User> users) {
        log.debug("bulkInsertDataBySavePoint()");
        DatabaseConnection conn = null;
        Savepoint savepoint = null;
        try {
            conn = userDao.startThreadConnection();
            savepoint = conn.setSavePoint("bulk_insert");
            for (User user : users) {
                userDao.create(user);
            }
        } catch (SQLException e) {
            log.error("Something went wrong in bulk insert", e);
        } finally {
            if (conn != null) {
                try {
                    conn.commit(savepoint);
                    userDao.endThreadConnection(conn);
                } catch (SQLException e) {
                    log.error("Something went wrong in bulk insert", e);
                }
            }
        }
    }

And

    private void bulkInsertDataByCallBatchTasks(final List<User> users) {
        log.debug("bulkInsertDataByCallBatchTasks()");
        try {
            userDao.callBatchTasks(new Callable<Void>() {
                @Override
                public Void call() throws Exception {
                    for (User user : users) {
                        userDao.create(user);
                    }
                    return null;
                }
            });
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

Both methods work fine, but on average they take 140 seconds and use 60-65% CPU, which I don't think is acceptable.

The idea is that I have to consume an API that provides JSON data. I have to parse that JSON and then insert it into the SQLite database for offline usage.

I'm looking for an efficient way to solve this issue.

Any thoughts?

  • When you used Traceview to determine exactly where your time is being taken, what did you learn? Commented Jul 3, 2013 at 19:28
  • I don't really understand Traceview. I got this time by calling System.nanoTime() before and after my bulk-insert method call. Commented Jul 3, 2013 at 20:12
  • I've used the second method in one of my apps and it boosted response time drastically Commented Nov 12, 2014 at 15:03
  • duplicate (except save points) of Ormlite Android bulk inserts Commented Dec 5, 2014 at 11:53
  • I have to agree with @voghDev, the second one decreased loading time from 16.5 seconds to 1.7 seconds on 1100 items. Thank you both. :) Commented May 25, 2016 at 13:01

3 Answers


I'm trying to insert 100000 records in android sqlite database at a time... On average they take 140 seconds and take 60-65% CPU which is not ok in my opinion.

Unfortunately, I don't have an easy answer for you. You may have to do this sort of insert directly using raw SQL to achieve faster performance on the limited Android CPU. Once the data has been inserted, you can turn to ORMLite to query or manipulate it.
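For illustration, a raw-SQL bulk insert might look like the sketch below. The `users` table and its `name`/`email` columns are hypothetical names for this example; adjust them to your schema. The key points are reusing one compiled `SQLiteStatement` across rows (so the SQL is parsed once) and committing everything in a single transaction:

```java
// Sketch: raw bulk insert with a compiled statement inside one transaction.
// Table and column names (users, name, email) are assumptions for illustration.
SQLiteDatabase db = myDbHelper.getWritableDatabase();
db.beginTransaction();
try {
    SQLiteStatement stmt =
            db.compileStatement("INSERT INTO users (name, email) VALUES (?, ?)");
    for (User user : users) {
        stmt.clearBindings();
        stmt.bindString(1, user.getName());
        stmt.bindString(2, user.getEmail());
        stmt.executeInsert();
    }
    db.setTransactionSuccessful(); // commit all rows at once
} finally {
    db.endTransaction();
}
```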


5 Comments

Thank you @Gray, I was thinking the same.
@Gray I am curious about the reason that a Dao method for bulk insert such as the following is not possible: int create(Collection<T> data). Thanks for a great ORM tool and for all the help you provide!
Hrm. Good idea @FarrukhNajmi. I've just added it to trunk. It will be in version 4.49.
Awesome responsiveness to community input. You and ORMLite rock!
Is it still a recommended way on Android ?

I've had the same problem, and found a reasonable workaround. This took insert time from 2 seconds to 150ms:

final OrmLiteSqliteOpenHelper myDbHelper = ...;
final SQLiteDatabase db = myDbHelper.getWritableDatabase();
db.beginTransaction();
try {
    // do ormlite stuff as usual, no callBatchTasks() needed

    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
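Concretely, the placeholder comment could just be the question's plain DAO loop; wrapping it in the `SQLiteDatabase` transaction is what batches the per-row commits. A sketch, assuming the question's `userDao`:

```java
// The ORMLite calls are unchanged; the surrounding transaction means
// SQLite commits once at the end instead of once per insert.
final SQLiteDatabase db = myDbHelper.getWritableDatabase();
db.beginTransaction();
try {
    for (User user : users) {
        userDao.create(user);
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
```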

5 Comments

Though I have 100% reproducible evidence that this code helps, it's not that simple and straightforward. See details here
This helped me a lot; it went from 15 seconds to about 2 seconds. Thank you!
I did, but this solution is much easier to implement and was successful
@sonique hmm... I really don't see how my solution is easier in code than plain callBatchTasks (if performance is OK) :) Did you use it as in the doc's examples (with a connection source, or without)?
I had many requests happening (parsing a JSON feed and inserting hundreds of objects from 3-4 classes into the SQLite database). I think this solution is easier because I just had to add db.beginTransaction(); db.setTransactionSuccessful(); and db.endTransaction(); around my code, whereas callBatchTasks is tied to a specific DAO object, which is a bit messy from my point of view. Finally, yeah, performance improved a lot, about x100 faster.

Hrm. Good idea @FarrukhNajmi. I've just added it to trunk. It will be in version 4.49.

@Gray Is it still unstable? When can we see it in Maven?

And if com.j256.ormlite.dao.ForeignCollection#addAll made only one request, that would be nice too.
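For what it's worth, once the batch-create method mentioned above shipped (ORMLite 4.49+), a bulk insert reduces to a single call. A sketch, assuming the question's `userDao`:

```java
// ORMLite 4.49+: Dao.create(Collection) inserts the whole batch in one call
// and returns the number of rows created.
int inserted = userDao.create(users);
log.debug("inserted " + inserted + " rows");
```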

