
I'm dealing with a large-scale DB that grows every day. Pulling the required data from this DB involves several joins, and because of the data volume the queries take too long. A friend suggested the following:

Once a day, pull all the required data from the DB and write it to a binary file kept in source control. Then create a DAL implementation that works against this binary file; this way things should work smoothly.

I'm not familiar with this methodology, so I'm wondering: is it good practice? What are the advantages and disadvantages of such an approach, and is there any reference for such an implementation (I'm currently using JPA)?

Thanks in advance

1 Answer


A few things:

You could probably optimize how you are querying your data. See http://java-persistence-performance.blogspot.com/2010/08/batch-fetching-optimizing-object-graph.html
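As a sketch of the batch-fetching idea from that link (the entity and field names here are hypothetical), EclipseLink's `@BatchFetch` annotation loads the related rows for a whole result set in one follow-up query instead of one query per parent, which avoids the classic N+1 problem:

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.OneToMany;
import java.util.List;
import org.eclipse.persistence.annotations.BatchFetch;
import org.eclipse.persistence.annotations.BatchFetchType;

// Hypothetical model: each Order has many OrderLines.
@Entity
public class Order {
    @Id
    private Long id;

    // Without batch fetching, reading N orders can trigger N extra
    // queries for their lines. With BatchFetch(JOIN), EclipseLink
    // loads the lines for all fetched orders in a single query.
    @OneToMany(mappedBy = "order")
    @BatchFetch(BatchFetchType.JOIN)
    private List<OrderLine> lines;
}
```

For a single query, a JPQL `JOIN FETCH` (`SELECT o FROM Order o JOIN FETCH o.lines`) achieves a similar effect without changing the mapping.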

If you enable a cache in JPA (or EclipseLink), you can avoid having to query the database for the data. See http://wiki.eclipse.org/EclipseLink/UserGuide/JPA/Basic_JPA_Development/Caching
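To illustrate, enabling the shared (second-level) cache is mostly configuration; a minimal `persistence.xml` sketch (the unit name is hypothetical):

```xml
<persistence-unit name="myUnit">
  <!-- Cache only entities annotated with @Cacheable(true);
       use ALL to cache every entity by default. -->
  <shared-cache-mode>ENABLE_SELECTIVE</shared-cache-mode>
</persistence-unit>
```

With this in place, repeated reads of the same entities are served from memory instead of hitting the database each time.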

You could archive older data in the database to keep it from growing too big, or partition the data so that only the current data is in the table.
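For example, range partitioning by date (Oracle-style DDL; the table and column names are hypothetical) keeps recent rows in their own partition, so queries filtered on the date column scan only the relevant partitions:

```sql
-- Hypothetical orders table partitioned by quarter; a query with a
-- WHERE clause on created_at only scans the matching partition(s).
CREATE TABLE orders (
  id         NUMBER PRIMARY KEY,
  created_at DATE NOT NULL
)
PARTITION BY RANGE (created_at) (
  PARTITION p_q1     VALUES LESS THAN (DATE '2011-04-01'),
  PARTITION p_q2     VALUES LESS THAN (DATE '2011-07-01'),
  PARTITION p_future VALUES LESS THAN (MAXVALUE)
);
```

Old partitions can then be dropped or moved to an archive table without touching the current data.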

You could use a local or in-memory database for your current data, or a product such as Oracle TimesTen, or a caching product such as Oracle Coherence.


1 Comment

Thank you James, I will surely look into those!
