I need to move some data daily from a SQL Server database on one server to a SQL Server database on another server. I have full read access on the origin server. The destination database is read by an accounting system, which runs its own transforms; I have to transform the data and load it into the destination database through stored procedures. After my data is loaded into the destination database, a transform is triggered and my data is altered/moved.
We only want changed data sent to the destination database, so we intend to use a temp database (on a different SQL Server instance) for comparison before sending anything.
We were thinking about using Entity Framework for reading and caching, but I'm worried that this would mean creating two different models and comparing them before saving. That would be a pain, but it would let us transform/modify the data as objects and would greatly simplify our business logic.
Is it recommended to have separate processes for this portion, or to continue with two different data models?
To simplify:
- Load/compare data between databases
- Update temp database with new data
- Set flag
- Load all objects in the temp database with the flag set to true using Entity Framework
- Transform/modify temp objects per business logic
- Send to destination using stored procedures
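The compare/flag portion of this first option doesn't need EF at all; it can be done by hashing each row's business columns and flagging rows whose hash is new or different. Here's a minimal sketch of that logic in Python (illustrative only — your real implementation would be C#/EF or T-SQL, and the `Id`/`IsChanged` field names are assumptions):

```python
import hashlib
import json

def row_hash(row, key_field="Id"):
    """Hash all business columns (everything except the key and the
    flag) so one comparison detects any change in the row."""
    payload = {k: v for k, v in row.items()
               if k not in (key_field, "IsChanged")}
    blob = json.dumps(payload, sort_keys=True, default=str)
    return hashlib.sha256(blob.encode()).hexdigest()

def flag_changes(origin_rows, temp_rows, key_field="Id"):
    """Steps 1-3 of the first option: update temp rows from origin and
    set IsChanged only where a row is new or its hash differs."""
    temp_by_key = {r[key_field]: r for r in temp_rows}
    for src in origin_rows:
        tmp = temp_by_key.get(src[key_field])
        if tmp is None or row_hash(tmp) != row_hash(src):
            updated = dict(src)
            updated["IsChanged"] = True   # picked up by the later EF load
            temp_by_key[src[key_field]] = updated
        else:
            tmp["IsChanged"] = False
    return list(temp_by_key.values())
```

The same idea could stay entirely in T-SQL with `HASHBYTES` and a `MERGE`, which is one reason EF starts to look optional for this half of the pipeline.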
Or
- Load data from the origin database using Entity Framework
- Load data from the temp database using Entity Framework
- Compare
- Update temp objects with new data
- Transform/modify temp objects per business logic
- Send to destination using stored procedures
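In the second option, the Compare step happens entirely in memory after both EF loads: diff the two object sets field by field and keep only the rows that changed. A rough sketch of that diff (again illustrative Python rather than EF; the `Id` key name is an assumption):

```python
def changed_rows(origin_rows, temp_rows, key_field="Id"):
    """In-memory Compare step of the second option: return origin rows
    that are new or differ from their temp counterpart, ready for the
    transform step and the destination stored procedures."""
    temp_by_key = {r[key_field]: r for r in temp_rows}
    changed = []
    for src in origin_rows:
        tmp = temp_by_key.get(src[key_field])
        if tmp is None or any(tmp.get(k) != v
                              for k, v in src.items() if k != key_field):
            changed.append(dict(src))  # copy so later transforms are safe
    return changed
```

This is simple, but it does mean holding both full data sets in memory, which is the cost of keeping everything as objects.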
If I choose the first option, is it worth using Entity Framework at all anymore? Am I overthinking this, and is there a better way around this entirely?