
I am somewhat new to Core Data and have a general question.

In my current project, users can access data reported by various sensors in each county of my state. Each sensor is represented as a row in a table view that gathers its data from a web service call. Calling the web service could take some time, since this app may be used in rural areas with slow wireless connectivity. Furthermore, users will typically only need data from one or two of the state's 55 counties, and each county could have anywhere from 15 to 500 items returned by the web service. Since the sensor names and locations change rarely, I would like the app to cache the results of the web service call to make gathering the list of sensor locations faster (and offer a refresh button for cases where something has changed). The app already uses Core Data to store bookmarked sensor locations, so a Core Data stack is already set up.

My issue is whether to use Core Data to cache the list of sensors, or to use a separate SQLite data store. Since there is already a data model in place, I could simply add another entity to the model. However, I am not sure whether this would introduce unnecessary overhead, or perhaps none at all.

Being new to Core Data, it appears that all that is really happening is that objects are serialized and their properties added as fields in a SQLite DB managed by Core Data. If this is the case, it seems there really would not be any overhead from using the Core Data store already in place.
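
For concreteness, here is a rough sketch of the kind of caching I have in mind. The entity and attribute names (CachedSensor, sensorID, county, and so on) are placeholders, not part of my existing model:

```swift
import CoreData

// Plain value returned by the web service call (illustrative only).
struct SensorDTO {
    let id: String
    let name: String
    let latitude: Double
    let longitude: Double
}

final class SensorCache {
    let context: NSManagedObjectContext

    init(context: NSManagedObjectContext) {
        self.context = context
    }

    /// Returns cached sensors for a county, or nil if nothing is cached yet.
    func cachedSensors(for county: String) throws -> [NSManagedObject]? {
        let request = NSFetchRequest<NSManagedObject>(entityName: "CachedSensor")
        request.predicate = NSPredicate(format: "county == %@", county)
        let results = try context.fetch(request)
        return results.isEmpty ? nil : results
    }

    /// Replaces the cache for a county with freshly downloaded sensors
    /// (e.g. after the user taps the refresh button).
    func refreshCache(for county: String, with sensors: [SensorDTO]) throws {
        // Delete the stale rows for this county first.
        let request = NSFetchRequest<NSManagedObject>(entityName: "CachedSensor")
        request.predicate = NSPredicate(format: "county == %@", county)
        for stale in try context.fetch(request) {
            context.delete(stale)
        }
        // Insert the new rows.
        for sensor in sensors {
            let cached = NSEntityDescription.insertNewObject(
                forEntityName: "CachedSensor", into: context)
            cached.setValue(sensor.id, forKey: "sensorID")
            cached.setValue(sensor.name, forKey: "name")
            cached.setValue(sensor.latitude, forKey: "latitude")
            cached.setValue(sensor.longitude, forKey: "longitude")
            cached.setValue(county, forKey: "county")
        }
        try context.save()
    }
}
```

The idea would be to check the cache first and only call the web service when nothing is cached for that county or the user taps refresh.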

Can anyone help clear this up for me? Thanks!

1 Comment

I wouldn't worry, because it's actually the other way around: Core Data work is CPU- and disk-bound, while a slow network is network-bound. The slower the network operation, the cheaper Core Data looks in relation. On a system with multiple CPUs and a slow network, Core Data will probably not add anything to the download duration. But on a fast network, Core Data may contribute 90% of the total time! Commented Nov 12, 2013 at 16:48

2 Answers


it appears that all that is really happening is that objects are serialized and their properties added as fields in a SQLite DB managed by Core Data

You are right about that. Core Data does a lot more, but that's the basic functionality (if you tell it to use a SQLite store, which is what most people do).

As for the number of records you want to store in Core Data, that shouldn't be a problem. I'm working on a Core Data app right now that stores over 20,000 records, and I still get very fast fetch times, e.g. for autocompletion while typing.
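
For example, a prefix fetch like the following stays fast even with tens of thousands of rows (the Sensor entity and name attribute are just illustrative):

```swift
import CoreData

// Illustrative only: assumes an entity named "Sensor" with a "name" attribute.
func autocompleteMatches(for prefix: String,
                         in context: NSManagedObjectContext) throws -> [NSManagedObject] {
    let request = NSFetchRequest<NSManagedObject>(entityName: "Sensor")
    // Case- and diacritic-insensitive prefix match.
    request.predicate = NSPredicate(format: "name BEGINSWITH[cd] %@", prefix)
    request.sortDescriptors = [NSSortDescriptor(key: "name", ascending: true)]
    // Only pull back what the UI can show; keeps fetch times low while typing.
    request.fetchLimit = 10
    return try context.fetch(request)
}
```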

Core Data definitely adds some overhead, but if you only have a few entities and relationships, and are not creating/modifying objects in more than one context, it is negligible.


1 Comment

If a user stored data for all 55 counties, the total number of records would be around 4,000. This is unlikely; most users will only need data for one to three counties.

Being new to Core Data, it appears that all that is really happening is that objects are serialized and their properties added as fields in a SQLite DB managed by Core Data. If this is the case, it seems there really would not be any overhead from using the Core Data store already in place.

That's not always the case. Core Data hides its storage implementation from the developer. It is sometimes a SQLite database, but it can also be a different kind of store. If you need a comprehensive guide to Core Data, I recommend this objc.io article.
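
For example, the backing store is just a parameter you pass when the persistent store is added; swap the store type and the rest of your Core Data code is unchanged. A rough sketch (model loading and paths simplified):

```swift
import CoreData

// Force-unwrap for brevity; a real app would handle the failure.
let model = NSManagedObjectModel.mergedModel(from: nil)!
let coordinator = NSPersistentStoreCoordinator(managedObjectModel: model)
let storeURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("Sensors.sqlite")

do {
    // Swap NSSQLiteStoreType for NSInMemoryStoreType (or NSBinaryStoreType)
    // and the rest of the Core Data code stays the same.
    try coordinator.addPersistentStore(ofType: NSSQLiteStoreType,
                                       configurationName: nil,
                                       at: storeURL,
                                       options: nil)
} catch {
    print("Failed to add persistent store: \(error)")
}
```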

As @CouchDeveloper noted, Core Data is a disk-I/O- and CPU-bound process. If you notice performance hits, throw the work onto a background thread (yes, this is a pretty big headache), but it will always be faster than the average network.
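
Something along these lines, for instance; the entity and key names are placeholders, and mainContext is assumed to be your existing main-queue context:

```swift
import CoreData

// Sketch only: a private-queue context keeps the import off the main thread.
func importSensorsInBackground(rows: [[String: Any]],
                               mainContext: NSManagedObjectContext) {
    let backgroundContext = NSManagedObjectContext(concurrencyType: .privateQueueConcurrencyType)
    backgroundContext.parent = mainContext

    backgroundContext.perform {
        for row in rows {
            let sensor = NSEntityDescription.insertNewObject(
                forEntityName: "CachedSensor", into: backgroundContext)
            sensor.setValue(row["id"] as? String, forKey: "sensorID")
            sensor.setValue(row["name"] as? String, forKey: "name")
        }
        do {
            // Saving the child pushes changes up to the parent (main-queue) context,
            // which can then save to disk and refresh the UI.
            try backgroundContext.save()
        } catch {
            print("Background import failed: \(error)")
        }
    }
}
```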

