I have an 18GB database .sql file with over 200 million records. My server has a Xeon E5 processor, 16GB of ECC RAM, and SSDs, and runs SQL Server 2014 Web Edition on Windows Server 2012.
I know I need sqlcmd to run a script this large, so my command is this:
sqlcmd -S SQL-Server2014 -i C:\main_db.sql -d newtest -o D:\main_db_log.txt
When I start this command, the sqlcmd utility starts working its magic, but after a few minutes it uses the full 16GB of RAM and the server crashes.
main_db_log.txt shows this error:
Sqlcmd: Error: Scripting error.
I tried to restrict SQL Server to 10GB by limiting memory under Server Instance > Properties > Memory tab, but it didn't help.
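For reference, I understand the Memory tab just sets the max server memory option, so this T-SQL should be equivalent to what I tried (10240 MB for my 10GB cap; adjust the value as needed):

EXEC sp_configure 'show advanced options', 1;  -- expose advanced options
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 10240;  -- cap SQL Server's memory at ~10GB
RECONFIGURE;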
I researched around, but everything I found just said to use sqlcmd for large scripts, and it isn't working in my case.
Please help; our old server crashed, so I can't make a .bak file from it anymore.
Comments:

- What exactly is in main_db.sql? Also look at stackoverflow.com/questions/24665757/sqlcmd-scripting-error and dba.stackexchange.com/questions/16763/…
- I am a newbie with databases. I just want to load the whole database's records into newtest on MS SQL Server 2014, and I only used the single sqlcmd line above. I'm still getting the memory issues and the scripting error. The file has insert commands on each line with just two columns, the 1st is id and the second is sales_id, and the database size is 18GB. I saw the questions you recommended; should I use GO every 1000 lines?
- If the file is all INSERT statements, yes, it would make sense to break them into smaller batches.
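As a sketch of what that would look like (the table name sales is hypothetical; the columns id and sales_id are the ones described above): sqlcmd sends everything up to each GO to the server as one batch, so inserting GO every ~1000 rows keeps each batch small instead of treating the whole 18GB file as a single batch.

INSERT INTO sales (id, sales_id) VALUES (1, 101);
INSERT INTO sales (id, sales_id) VALUES (2, 102);
INSERT INTO sales (id, sales_id) VALUES (3, 103);
GO
-- ...next ~1000 rows...
INSERT INTO sales (id, sales_id) VALUES (1001, 1101);
INSERT INTO sales (id, sales_id) VALUES (1002, 1102);
GO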