
Don't ask me why, but I have to migrate a database from 2008 to 2005. That isn't a problem in itself, but I have a very large table.

When I script the table's contents (using Generate Scripts), the resulting .sql file is over 4 GB, which is more than the server has available in RAM.

Is there any way to generate the INSERT commands split across multiple files?

Or is there a way to split a file into multiple files when the file is larger than the available RAM?
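To make that second idea concrete, the sort of thing I'm picturing is a small script like the sketch below: it streams the .sql file line by line and rolls over to a new output file at a GO batch separator once a size threshold is passed. The file names, the chunk size, and the assumption that the script is UTF-8 with GO separators are all just placeholders.

    CHUNK_BYTES = 500 * 1024 * 1024          # rough target size per output file (~500 MB)
    SOURCE = "bigtable_inserts.sql"          # hypothetical name of the generated script

    part = 1
    written = 0
    out = open(f"bigtable_part_{part:03d}.sql", "w", encoding="utf-8")

    with open(SOURCE, "r", encoding="utf-8") as src:
        for line in src:                     # streams line by line; never loads the whole file
            out.write(line)
            written += len(line)
            # only roll over on a batch separator so no INSERT is split across files
            if written >= CHUNK_BYTES and line.strip().upper() == "GO":
                out.close()
                part += 1
                written = 0
                out = open(f"bigtable_part_{part:03d}.sql", "w", encoding="utf-8")

    out.close()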

  • @OMG Ponies -- Are you thinking of DTS instead of scripting? Commented Sep 3, 2010 at 23:56
  • This isn't really a programming question. I think serverfault.com is a better venue for this question. Commented Sep 4, 2010 at 0:11
  • @Gabe: Script generation is programming? Commented Sep 4, 2010 at 0:24
  • You could make this into a programming project, but it's really a sysadmin/DBA question. Commented Sep 4, 2010 at 1:08

1 Answer


Why script the data out?

I'd use SSIS or some other programmatic method after scripting/generating my schema.

Or use something like Red Gate's compare tools.

I've almost never generated DML scripts this way.

However, the SSMS Tools Pack does offer batched INSERT generation, and it's free.
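If a hand-rolled programmatic route is acceptable, something along these lines would also work: stream the rows out with pyodbc and write plain single-row INSERTs across several files, so neither the table's data nor the script ever has to fit in memory. The connection string, table, and column names below are made up, and the literal formatting is deliberately naive; treat it as a sketch, not production code.

    import pyodbc

    # Hypothetical connection string, table, and columns; adjust to the real schema.
    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=Source2008;DATABASE=MyDb;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT Id, Name, CreatedOn FROM dbo.BigTable")

    ROWS_PER_FILE = 500_000                  # roll to a new .sql file after this many rows

    def sql_literal(value):
        # Rough literal formatting for the sketch; real code needs type-aware
        # handling (dates, binary, Unicode) and proper escaping.
        if value is None:
            return "NULL"
        if isinstance(value, (int, float)):
            return str(value)
        return "'" + str(value).replace("'", "''") + "'"

    part, rows_in_file = 1, 0
    out = open("bigtable_data_001.sql", "w", encoding="utf-8")
    while True:
        rows = cursor.fetchmany(10_000)      # stream in modest batches, never the whole table
        if not rows:
            break
        for row in rows:
            values = ", ".join(sql_literal(v) for v in row)
            # single-row INSERTs so the output also runs on SQL Server 2005
            out.write(f"INSERT INTO dbo.BigTable (Id, Name, CreatedOn) VALUES ({values});\n")
            rows_in_file += 1
            if rows_in_file >= ROWS_PER_FILE:
                out.write("GO\n")
                out.close()
                part += 1
                rows_in_file = 0
                out = open(f"bigtable_data_{part:03d}.sql", "w", encoding="utf-8")

    out.write("GO\n")
    out.close()
    conn.close()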


1 Comment

+1: I was looking to see if Linked Servers could be used...
