
I have two databases. One of them belongs to a CRM application and is the source.

The other one will be the destination used by a tool I'm developing.

The destination will contain a table ADDRESSES with a subset of the columns of a table of the same name in the source database.

What is the best (most efficient) way to copy the data between those databases? (By the way, they're on different SQL Server instances, if that's important.)

I could write a loop which does an INSERT into the destination for each row obtained from the source, but I don't think that would be efficient.

My thoughts and information:

  • The data won't be altered on its way from source to destination
  • It will be altered on its way back
  • I don't have the complete structure of the source, but I know which fields I need and that they're guaranteed to be in the source (hence, I can't access the columns of the retrieved rows by ordinal position, only by name).
  • I can't use LINQ.

Anything leading me in the right direction here is appreciated.

Edit:
I really need a C# way to copy the data. I also need to know how to merge the copied rows back into the source. Is it really necessary (or even best practice) to do this row by row?

3 Comments
  • Use SqlBulkCopy and a SqlDataReader as the input for that (a sketch follows these comments). Commented Oct 15, 2013 at 16:48
  • FYI, this might end up getting closed for being "too broad" or "off-topic." If you're looking for advice and concepts, try asking on programmers.stackexchange.com. Commented Oct 15, 2013 at 16:49
  • I don't know why you've ruled out just using a DB link and forgoing copying the data altogether, but I would probably consider using a Table-Valued Parameter. Commented Oct 15, 2013 at 18:03
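
To illustrate the SqlBulkCopy suggestion from the first comment, here is a minimal C# sketch that streams rows from the source with a SqlDataReader and bulk-loads them into the destination. The connection strings and the column names (AddressId, Street, City, ZipCode) are placeholders assumed for the example, not taken from the question.

    using System.Data.SqlClient;

    class AddressCopier
    {
        // Connection strings are placeholders; adjust server/database/credentials as needed.
        const string SourceCs = "Data Source=CrmServer;Initial Catalog=CrmDb;Integrated Security=True";
        const string DestCs   = "Data Source=ToolServer;Initial Catalog=ToolDb;Integrated Security=True";

        public static void CopyAddresses()
        {
            using (var source = new SqlConnection(SourceCs))
            using (var dest = new SqlConnection(DestCs))
            {
                source.Open();
                dest.Open();

                // Select only the columns the destination table needs, by name.
                var select = new SqlCommand(
                    "SELECT AddressId, Street, City, ZipCode FROM dbo.ADDRESSES", source);

                using (SqlDataReader reader = select.ExecuteReader())
                using (var bulk = new SqlBulkCopy(dest))
                {
                    bulk.DestinationTableName = "dbo.ADDRESSES";
                    bulk.BatchSize = 5000;          // commit in batches instead of one huge transaction

                    // Map by column name so ordinal positions in the source don't matter.
                    bulk.ColumnMappings.Add("AddressId", "AddressId");
                    bulk.ColumnMappings.Add("Street", "Street");
                    bulk.ColumnMappings.Add("City", "City");
                    bulk.ColumnMappings.Add("ZipCode", "ZipCode");

                    bulk.WriteToServer(reader);     // streams rows from the reader to the destination
                }
            }
        }
    }

Because WriteToServer consumes the reader directly, the rows are streamed rather than buffered in full, which is what makes this approach much faster than a row-by-row INSERT loop.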

4 Answers


Why write code to do this?

The single fastest and easiest way is just to use SQL Server's bcp.exe utility (bcp: Bulk Copy Program).

  • Export the data from the source server.
  • Zip it or tar it if it needs it.
  • FTP it over to where it needs to go, if you need to move it to another box.
  • Import it into the destination server.

You can accomplish the same thing via SQL Server Management Studio in a number of different ways. Once you've defined the task, it can be saved and it can be scheduled.

You can use SQL Server's Powershell objects to do this as well.

If you're set on doing it in C#:

  • write your select query to get the data you want from the source server.
  • execute that and populate a temp file with the output.
  • execute SQL Server's bulk insert statement against the destination server to insert the data.

Note: For any of these techniques, you'll need to deal with identity columns if the target table has them. You'll also need to deal with key collisions. It is sometimes easier to bulk load the data into a perma-temp table first, and then apply the prerequisite transforms and manipulations to get it to where it needs to go.
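
Building on the perma-temp (staging) table note above, here is a hedged C# sketch of that route: bulk load into a plain staging table on the destination (rather than a temp file plus BULK INSERT), then let one set-based MERGE handle updates, inserts, and key collisions. The staging table dbo.ADDRESSES_STAGING and the column names are assumptions for illustration.

    using System.Data.SqlClient;

    static class StagingLoader
    {
        // Assumes a plain staging table dbo.ADDRESSES_STAGING on the destination with the
        // same columns as dbo.ADDRESSES but no keys, constraints, or identity property,
        // and that both connections are already open.
        public static void LoadViaStaging(SqlConnection source, SqlConnection dest)
        {
            // 1) Empty the staging table.
            new SqlCommand("TRUNCATE TABLE dbo.ADDRESSES_STAGING", dest).ExecuteNonQuery();

            // 2) Stream the source rows into the staging table.
            var select = new SqlCommand(
                "SELECT AddressId, Street, City, ZipCode FROM dbo.ADDRESSES", source);
            using (var reader = select.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dest) { DestinationTableName = "dbo.ADDRESSES_STAGING" })
            {
                bulk.WriteToServer(reader);
            }

            // 3) One set-based statement moves the rows into the real table,
            //    updating existing rows and inserting new ones, so keys never collide.
            const string merge = @"
                MERGE dbo.ADDRESSES AS target
                USING dbo.ADDRESSES_STAGING AS src ON target.AddressId = src.AddressId
                WHEN MATCHED THEN
                    UPDATE SET Street = src.Street, City = src.City, ZipCode = src.ZipCode
                WHEN NOT MATCHED THEN
                    INSERT (AddressId, Street, City, ZipCode)
                    VALUES (src.AddressId, src.Street, src.City, src.ZipCode);";
            new SqlCommand(merge, dest).ExecuteNonQuery();
        }
    }

If AddressId is an identity column on the destination, the insert branch of the MERGE additionally needs SET IDENTITY_INSERT dbo.ADDRESSES ON/OFF around it (or the identity property can simply be dropped, since the key values originate in the CRM).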


3 Comments

  • Your approach with temp tables looks promising. I know that there are different kinds of temporary tables and different scopes in SQL Server. Could you give me a short example of how to build such a temporary table (using C# if possible)?
  • What I call a perma-temp table is just an ordinary [permanent] table. No keys. No unique indexes. No constraints. Just truncate it, load it with your data, and do the work you need to do to get it to its ultimate destination.
  • I'm currently in the middle of this article, which uses the DataTable class to manipulate data. It seems like that's exactly what I need. If this leads to the solution of my problem, I'm going to accept your answer because you set me on track ;)

According to your comment on Jwrit's answer, you want a two-way sync.

If so, you might want to look into Microsoft Sync Framework.

We use it to sync 200+ tables from on-premises SQL Server to SQL Azure and from SQL Azure to SQL Azure.

You can use it purely from C#. However, it might offer a lot more than you want, or it might be overkill for a small project.

I'm just mentioning it so that you have another option for your project.

Comments


If these databases exist on two servers, you can set up a link between them by executing sp_addlinkedserver; there are instructions for setting this up here. This may come in handy if you plan on regularly sharing data.

http://msdn.microsoft.com/en-us/library/ff772782.aspx

Once the servers are linked, a simple INSERT ... SELECT statement can copy the rows from one table to another:

INSERT INTO db1.tblA (Field1, Field2, Field3) SELECT Field1, Field2, Field3 FROM db2.tblB

If the databases are on the same instance, you only need to execute SQL similar to the above.
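
Since the question explicitly asks for a C# route, here is a small hedged sketch of how such a cross-server INSERT ... SELECT could be executed from code. The linked server name (CrmLinkedServer), database, and column names are placeholders; note that a table reached through a linked server is normally referenced with a four-part name.

    using System.Data.SqlClient;

    static class LinkedServerCopy
    {
        // Runs entirely on the destination server: the statement pulls the rows from the
        // linked source server and inserts them locally in one set-based operation.
        public static int CopyAcrossLink(string destinationConnectionString)
        {
            const string sql = @"
                INSERT INTO dbo.ADDRESSES (AddressId, Street, City, ZipCode)
                SELECT AddressId, Street, City, ZipCode
                FROM CrmLinkedServer.CrmDb.dbo.ADDRESSES;"; // four-part name: linkedserver.database.schema.table

            using (var conn = new SqlConnection(destinationConnectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.CommandTimeout = 600;     // large copies can take a while
                conn.Open();
                return cmd.ExecuteNonQuery(); // number of rows copied
            }
        }
    }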

5 Comments

  • There are multiple "destination" databases on laptops that will be carried around once they've got their data from the "source". I'd rather know a C# way to do this because copying from "source" to "destination" is only the first step. I have to sync them (SELECT -> UPDATE) back to the source while actually changing the rows instead of copying (SELECT -> INSERT) them.
  • OK, looks like you have updated your question with more detail. I did not originally realize you want to sync smaller client devices to a master DB. This task can get rather sticky if you haven't set one up before. I used a product called Sybase MobiLink which handles the conflict resolution and synchronization of devices that have their own DB. If you would like to write something like this on your own, you would have to build a service of some kind which executes SELECT procs on behalf of the client and transports the data over the network. Frankly, it's much easier to use a synchronization framework.
  • But if you're brave... here is a brief overview of a remote disconnected client DB setup. Each client has a local DB which tracks differences since the last sync (using a last_update column of sorts). The server has "sync scripts", which are procs for each table the clients will need to download. If you are doing download only, there will only be one proc per table. The client connects to a C# Windows service on the server and requests a sync of X tables. The server reads the device's last sync date and executes the procs associated with the tables the client requests, which retrieve only changed data (based on last update).
  • When a client needs to insert or update data, things get much more complex because of conflict resolution (what if two clients try to update/delete/insert the same row?). Custom logic must be put into a proc to determine the winner. The setup mentioned above will protect your DB from being pummeled directly by clients, adding security, and I have seen it used in several larger-scale deployments.
  • Thanks for your effort, but the sync rules are fortunately quite easy ;) I'm currently trying to implement both import and export with DataTable and a SqlBulkCopy insert/update, as proposed in the link I posted under Nicholas Curreys' answer. I think that will do it :)
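
As a rough illustration of the set-based write-back discussed in these comments (and in the question's edit about merging rows back), here is a hedged C# sketch: collect the changed rows into a DataTable, bulk copy them into a staging table on the CRM server, and apply them with a single UPDATE join. The ModifiedAt column, the dbo.ADDRESSES_INBOX staging table, the lastSync parameter, and all other names are assumptions for the example.

    using System;
    using System.Data;
    using System.Data.SqlClient;

    static class SyncBack
    {
        // Pushes locally modified rows from a laptop database back to the CRM server in
        // set-based steps instead of one UPDATE per row. Both connections are expected
        // to be open already; all table and column names are illustrative.
        public static void PushChangesBack(SqlConnection laptopDb, SqlConnection crmDb, DateTime lastSync)
        {
            // 1) Collect the rows changed since the last sync into a DataTable.
            var changed = new DataTable();
            using (var da = new SqlDataAdapter(
                "SELECT AddressId, Street, City, ZipCode FROM dbo.ADDRESSES WHERE ModifiedAt > @since",
                laptopDb))
            {
                da.SelectCommand.Parameters.AddWithValue("@since", lastSync);
                da.Fill(changed);
            }

            // 2) Bulk copy them into a staging table on the CRM server.
            new SqlCommand("TRUNCATE TABLE dbo.ADDRESSES_INBOX", crmDb).ExecuteNonQuery();
            using (var bulk = new SqlBulkCopy(crmDb) { DestinationTableName = "dbo.ADDRESSES_INBOX" })
            {
                bulk.WriteToServer(changed);
            }

            // 3) Apply every change with a single UPDATE join; no row-by-row loop needed.
            const string apply = @"
                UPDATE a
                SET    a.Street = i.Street, a.City = i.City, a.ZipCode = i.ZipCode
                FROM   dbo.ADDRESSES AS a
                JOIN   dbo.ADDRESSES_INBOX AS i ON i.AddressId = a.AddressId;";
            new SqlCommand(apply, crmDb).ExecuteNonQuery();
        }
    }

If conflict resolution is ever needed, the plain UPDATE in step 3 is the place to encode whichever rule decides the winner.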

If this is a one-time task, the best bet is normally SSIS (SQL Server Integration Services), unless there are complex data transformations - you can quickly and easily do column mappings and have it done (reliably) in 15 minutes flat.

1 Comment

  • I'd like to do this in C# for a few reasons (see my comment under @Jwit's answer).
