
I need to move some data daily from a SQL Server database on one server to another SQL Server database on a different server. I have full read access on the origin server. The destination database is queried by an accounting system, which runs some transforms on it. I have to transform the data and use stored procedures to load the destination database. After my data is loaded into the destination database, a transform is triggered and my data is altered/moved.

We only want data changes sent to the destination database, so we intend to use a temp database (on a different SQL Server) to compare against before sending anything.

We were thinking about using Entity Framework for reading and caching, but I'm worried this would mean creating two different models and comparing them before saving. That would be a pain to do, but it would let us transform/modify the data as objects and would greatly simplify our business logic.

Is it recommended to keep this as separate processes, or to continue with two different data models?

To simplify:

  1. Load/compare data between databases
  2. Update temp database with new data
  3. Set flag
  4. Load all objects in the temp database with the flag set to true, using Entity Framework
  5. Transform/modify temp objects per business logic
  6. Send to destination using stored procedures
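The compare-and-flag portion of this first option (steps 1–3) could be sketched in plain C# like so. The `CustomerRow` shape and the column-by-column comparison are illustrative assumptions, not anything from the question; in practice the comparison would run against rows read from both servers:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical row shape shared by the origin and temp databases.
public record CustomerRow(int Id, string Name, decimal Balance, bool IsChanged = false);

public static class ChangeDetector
{
    // Steps 1-3: compare origin rows against temp rows and flag
    // anything that is new or different, so only changes move on.
    public static List<CustomerRow> FlagChanges(
        IEnumerable<CustomerRow> origin, IEnumerable<CustomerRow> temp)
    {
        var existing = temp.ToDictionary(r => r.Id);
        var result = new List<CustomerRow>();
        foreach (var row in origin)
        {
            bool changed = !existing.TryGetValue(row.Id, out var old)
                           || old.Name != row.Name
                           || old.Balance != row.Balance;
            result.Add(row with { IsChanged = changed });
        }
        return result;
    }
}
```

A row hash (or SQL Server's `rowversion` column) could replace the per-column comparison if the tables are wide.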

Or

  1. Load data from the origin database using Entity Framework
  2. Load data from the temp database using Entity Framework
  3. Compare
  4. Update temp objects with new data
  5. Transform/modify temp objects per business logic
  6. Send to destination using stored procedures
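The second option (steps 1–4) might look like the following EF Core sketch. The context names, connection strings, and `Customer` entity are all assumptions for illustration (classic EF 6 would be structured similarly):

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
    public bool IsChanged { get; set; }
}

// One DbContext per server; connection strings are placeholders.
public class OriginContext : DbContext
{
    public DbSet<Customer> Customers => Set<Customer>();
    protected override void OnConfiguring(DbContextOptionsBuilder o) =>
        o.UseSqlServer("Server=originServer;Database=Origin;Integrated Security=true;");
}

public class TempContext : DbContext
{
    public DbSet<Customer> Customers => Set<Customer>();
    protected override void OnConfiguring(DbContextOptionsBuilder o) =>
        o.UseSqlServer("Server=tempServer;Database=Staging;Integrated Security=true;");
}

public static class TempSync
{
    // Steps 1-4: load both sides, compare in memory, update the temp objects.
    public static void Run()
    {
        using var origin = new OriginContext();
        using var temp = new TempContext();

        // AsNoTracking: the origin side is read-only, so skip change tracking.
        var originRows = origin.Customers.AsNoTracking().ToDictionary(c => c.Id);

        foreach (var tempRow in temp.Customers)
        {
            if (originRows.TryGetValue(tempRow.Id, out var src) && src.Name != tempRow.Name)
            {
                tempRow.Name = src.Name;  // update temp object with new data
                tempRow.IsChanged = true;
            }
        }
        temp.SaveChanges();
    }
}
```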

If I choose the first option, is it even worth using Entity Framework any more? Am I overthinking this, and is there a better way around this entirely?

1 Answer


1: Create two database context models; let's say dataContext1 and dataContext2.

2: Create a DTO (Data Transfer Object) that closely mirrors the tables being transferred from dataContext1 to dataContext2.

3: Use AutoMapper to map the entities of dataContext1 to that DTO, and likewise map dataContext2's entities to the DTO.
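Steps 2 and 3 could be sketched like this. The entity and DTO shapes are assumptions; only the AutoMapper configuration pattern (`MapperConfiguration` / `CreateMap` / `ReverseMap`) is the point:

```csharp
using AutoMapper;

// Assumed entity from dataContext1.
public class Customer { public int Id { get; set; } public string Name { get; set; } = ""; }
// Assumed entity from dataContext2.
public class AccountingCustomer { public int Id { get; set; } public string Name { get; set; } = ""; }
// Step 2: one shared DTO that both entities map to.
public class CustomerDto { public int Id { get; set; } public string Name { get; set; } = ""; }

public static class Mapping
{
    // Step 3: both entity types map to and from the single DTO,
    // so the DTO is the only shape the transfer code ever sees.
    public static IMapper Build() =>
        new MapperConfiguration(cfg =>
        {
            cfg.CreateMap<Customer, CustomerDto>().ReverseMap();
            cfg.CreateMap<AccountingCustomer, CustomerDto>().ReverseMap();
        }).CreateMapper();
}
```

With `ReverseMap`, the same configuration covers both reading (entity → DTO) and saving (DTO → entity).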

4: Run it through a service layer with functions like `public Dto ReadData(Table1 row)` and `public void SaveData(Dto dto)`.

5: Read the data from dataContext1 and save it to dataContext2.
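Steps 4 and 5 together might look like the service below. `DataContext1`/`DataContext2`, their `Customers` sets, the destination entity `Customer2`, and `CustomerDto` are all assumed to exist as described in the steps above; this is a sketch of the shape, not a drop-in implementation:

```csharp
using System.Collections.Generic;
using System.Linq;
using AutoMapper;
using Microsoft.EntityFrameworkCore;

public class TransferService
{
    private readonly DataContext1 _source;       // assumed DbContext for database 1
    private readonly DataContext2 _destination;  // assumed DbContext for database 2
    private readonly IMapper _mapper;

    public TransferService(DataContext1 source, DataContext2 destination, IMapper mapper)
    {
        _source = source;
        _destination = destination;
        _mapper = mapper;
    }

    // The answer's "public dto readdata(table1)": read entities from context 1 as DTOs.
    public List<CustomerDto> ReadData() =>
        _source.Customers.AsNoTracking()
               .AsEnumerable()
               .Select(c => _mapper.Map<CustomerDto>(c))
               .ToList();

    // The answer's "public void savedata(dto)": map DTOs onto context 2
    // entities (Customer2 here, an assumption) and save.
    public void SaveData(IEnumerable<CustomerDto> dtos)
    {
        foreach (var dto in dtos)
            _destination.Customers.Add(_mapper.Map<Customer2>(dto));
        _destination.SaveChanges();
    }
}
```

Because both sides only ever touch `CustomerDto`, a schema change in either database is absorbed in the mapping configuration rather than in the transfer logic.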

This way, if anything changes in database1 or database2, the change is easy to handle.


8 Comments

I'm exploring this. In which project would you set up the mapping configuration?
The project that receives dataContext1 and dataContext2 and processes them is the one you'd want to set up the mappings in.
What if I wrapped the data contexts in repositories? I figured this would simplify things for future use. I also need to be able to run business logic aside from the copy.
Wherever the processing takes place is where you want AutoMapper. You can keep the DbContext in a repository, and if the processing happens in a layer such as a service, then that's where AutoMapper goes. You can also keep AutoMapper in the repository layer and reference it from any other layer that uses it, but if you don't need it in the repository, why put it there in the first place?
Does it make sense to expose my project to both the entity objects and the DTO objects? It seems counterintuitive. I was thinking of having a class in the repository project with a static method that would be called from the program entry point. Thanks for the answer and comments, I really appreciate it.