
I'm developing a simple utility that creates a *.bak file from a SQL Server database.

If the database is quite small, less than 50 MB, there is no problem and the utility works well. But I will potentially be working with large databases of 2-3 GB.

Since it's impossible (I guess) to keep that much data in memory while creating the *.bak file, would my utility still work in this case?

  • It's impossible to say without seeing the code. Commented Oct 30, 2012 at 17:39
  • You have not constructed your question well. Commented Oct 30, 2012 at 17:49
  • How does the SQL Server Management Studio backup of such large databases work? The backup process streams the data to disk, chunk by chunk; it does not keep the whole database in memory and then dump it to disk. Commented Oct 30, 2012 at 17:50
  • Insert junk data into the database and test it yourself. Commented Oct 30, 2012 at 17:53
  • Note: the default command timeout is 10 minutes; you can change it via the StatementTimeout property on the ServerConnection class, and setting it to zero disables the timeout entirely (a short sketch follows these comments). Commented Oct 30, 2012 at 18:03
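
The comment above refers to SMO's ServerConnection; a minimal sketch of disabling the timeout before a long-running backup (the instance name here is hypothetical):

```csharp
using Microsoft.SqlServer.Management.Common;

// Disable the default 10-minute statement timeout so a backup of a
// multi-gigabyte database is not cancelled partway through.
var connection = new ServerConnection("MYSERVER"); // hypothetical instance name
connection.StatementTimeout = 0; // 0 = wait indefinitely
```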

2 Answers

Yes, your utility will still work with databases that are 2-3 GB in size: SQL Server performs the backup server-side and streams it to disk, so the data never needs to be held in your utility's memory.

Yes! Behind the scenes, SQL Server Management Studio uses SMO (SQL Server Management Objects) for tasks such as backup and restore. SMO can handle backups of this size, so you can do your job using the same SMO functionality.
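
A minimal end-to-end sketch of what such an SMO-based backup utility could look like; the instance name, database name, and file path are hypothetical:

```csharp
using System;
using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;

class BackupUtility
{
    static void Main()
    {
        // Hypothetical instance; StatementTimeout = 0 disables the default
        // 10-minute limit, per the comment under the question.
        var connection = new ServerConnection(@".\SQLEXPRESS");
        connection.StatementTimeout = 0;
        var server = new Server(connection);

        var backup = new Backup
        {
            Action = BackupActionType.Database,
            Database = "MyDatabase" // hypothetical database name
        };
        backup.Devices.AddDevice(@"C:\Backups\MyDatabase.bak", DeviceType.File);

        // Progress reporting is optional; the server streams the backup to
        // disk itself, so the utility's memory use stays flat regardless of
        // database size.
        backup.PercentComplete += (sender, e) =>
            Console.WriteLine($"{e.Percent}% complete");

        backup.SqlBackup(server); // blocks until the backup finishes
    }
}
```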
