I'm trying to import a 15 GB .sql file into a PostgreSQL or MySQL database. What is the fastest way or procedure to import such a big dataset in a short time?
Any suggestions would be greatly appreciated.
To start with, there's really no such thing as a ".sql file". It's like saying a ".dat file": it could be practically anything. A list of INSERTs, a script to create tables, a query that extracts information from an existing database, etc.
The file might contain table and index definitions (DDL) and other content, or it might just be a list of INSERT statements. It could be written to use custom vendor extensions like PostgreSQL's COPY command for fast data loading, too.
You need to look at the file and see what it is. Determine whether you need to create tables to hold the data first, and see whether any DDL needs changing to be compatible with the target database. Unfortunately, the standard names for SQL data types aren't followed all that consistently by database vendors, and there are vendor extensions for things like key generation, etc.
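As a rough sketch (assuming a hypothetical dump file named dump.sql with one statement per line), you can get an idea of what the file contains without opening all 15 GB in an editor:

    # Peek at the beginning of the file to see DDL, SET statements, COPY or INSERT blocks
    head -n 200 dump.sql

    # Count the kinds of statements it contains
    grep -c '^INSERT' dump.sql
    grep -c '^CREATE TABLE' dump.sql
    grep -c '^COPY ' dump.sql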
If it's plain INSERTs into a single table and the inserts don't depend on each other, the fastest way to load it into PostgreSQL is to split it into several chunks and run each chunk with psql -1 -v ON_ERROR_ROLLBACK=1 -f chunk.sql.
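A minimal sketch of that approach, assuming GNU split, a file named inserts.sql with each INSERT on its own line, a database named mydb, and four parallel connections (all of these names and the degree of parallelism are placeholders):

    # Split the file into 4 roughly equal pieces without breaking lines
    split -n l/4 inserts.sql chunk_

    # Load each chunk over its own connection, 4 at a time
    ls chunk_* | xargs -P 4 -I {} psql -1 -v ON_ERROR_ROLLBACK=1 -d mydb -f {}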
Otherwise you'd just have to run psql -1 -v ON_ERROR_ROLLBACK=1 -f thefile.sql.
The fastest way to load data into PostgreSQL is to use pg_bulkload, but that's quite disruptive and I don't think it'll take pre-formatted SQL input. The next-best option is the COPY command, but that works with CSV/TSV-style data, not with SQL-formatted data written as INSERTs.
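For completeness, a sketch of the COPY route, assuming you've already extracted the data to a CSV file named data.csv and created a matching table mytable (both names are placeholders):

    # \copy reads the file on the client side, so it doesn't need to live on the database server
    psql -d mydb -c "\copy mytable FROM 'data.csv' WITH (FORMAT csv, HEADER true)"

If the dump you were given is pure INSERTs, you'd first have to convert it to CSV (or re-export from the source database) before COPY is an option, which is why the chunked-psql approach above is usually the practical answer for a .sql file.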