
I have generated a 200 MB data script from my local SQL Server via Tasks -> Generate Scripts in SQL Server Management Studio.

When I try to execute it from SQL Server Management Studio, I get a message saying there is not enough memory to run it, which is strange since I have 16 GB of RAM in my PC.

So my question is simple: how can I update my remote SQL Server with this data script?

Can I configure SQL Server Management Studio to use more memory, or is there a better way to execute a big script?

Thanks

1 Answer

Use SQLCMD:

sqlcmd -S myServer\instanceName -i C:\myScript.sql

or, to capture the output to a file:

sqlcmd -S myServer\instanceName -i C:\myScript.sql -o C:\EmpAdds.txt
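If the remote server requires SQL Server authentication rather than Windows authentication, sqlcmd's -U, -P, and -d switches supply the login, password, and target database; the server, database, and credential names below are placeholders:

```
sqlcmd -S myRemoteServer -d myDatabase -U myLogin -P myPassword -i C:\myScript.sql -o C:\output.txt
```

Because sqlcmd streams the file batch by batch instead of loading the whole script into an editor buffer, it avoids the SSMS memory ceiling described below.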

This link provides details on why you are seeing the issue:

There is no hard-coded limit on the size of file that SSMS can open or execute, so the question of how large a file SSMS can open is hard to answer concretely. However, SSMS is a 32-bit application, so its maximum address space is 2 GB. SSMS uses many managed code components that are garbage collected, so its working set is often over 1 GB without loading any files if you've been using any of the UI launched from Object Explorer. If SSMS has been running for a while, the free memory for the process tends to become fragmented, so the largest contiguous free address range can be less than 100 MB. The text editor uses Unicode (UTF-16) as its internal representation, so a 50 MB ASCII text file can take 100 MB of memory to store. The text editor allocates a single block of memory to store the document text, so if Windows can't allocate a block as big as the editor needs, the text editor can't even load the document.
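The UTF-16 doubling is easy to verify on the command line; a minimal sketch using iconv (the file paths are arbitrary):

```shell
# Write a small ASCII SQL fragment, then compare its size after
# converting to UTF-16LE, the editor's internal representation.
printf 'SELECT 1;' > /tmp/script.sql
ascii=$(wc -c < /tmp/script.sql)
iconv -f ASCII -t UTF-16LE /tmp/script.sql > /tmp/script_utf16.sql
utf16=$(wc -c < /tmp/script_utf16.sql)
echo "ASCII: $ascii bytes, UTF-16LE: $utf16 bytes"
```

Every ASCII character becomes two bytes in UTF-16LE, so the converted file is exactly twice the size, which is why a 50 MB script needs roughly 100 MB of buffer.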

When you try to execute a script, SSMS breaks the document up into SQL batches (which is one copy of the text), and then sends the batch to the server to execute (which is a second copy). The batches are much smaller than the document, so the free address block sizes aren't a problem, but having three copies of the document text in memory may not be possible if the script is large enough. The result is that after a restart, SSMS can usually load and edit documents that are larger than what it can execute.

After turning off IntelliSense, I've been able to load documents as large as 200 MB on my test machine just after starting SSMS, but I wasn't able to run them. That is probably close to the limit of what's possible in a lab environment. In the real world, with less beefy machines and SSMS actually being used to get your job done, the practical limit is probably more like loading and executing a 40 MB script.

