How do you import really huge CSV files very fast, in one shot, into a remote MySQL database server?
SOLUTION: The LOAD DATA command in MySQL
The LOAD DATA command in MySQL is a very fast and reliable way to import a huge CSV file (containing several million lines/rows) into a remote MySQL server. Here is the syntax:
LOAD DATA LOCAL INFILE 'c:/csv/eventsbig.csv' IGNORE INTO TABLE `testdb`.`eventsbig` FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' LINES TERMINATED BY '\n' IGNORE 1 LINES (`evt_type`, `evt_name`, `billing_date`);
Please refer here for further reading on the LOAD DATA command in MySQL.
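Keep in mind that LOAD DATA LOCAL only works when the local_infile option is enabled on both the server and the client. A minimal sketch of connecting from the command line with LOCAL INFILE enabled (the host name, user, and database below are placeholders):

# Enable LOCAL INFILE on the client and connect to the remote server
# (db.example.com, dbuser and testdb are placeholder names)
mysql --local-infile=1 -h db.example.com -u dbuser -p testdb
# On the server side, local_infile must also be ON, e.g.:
#   SET GLOBAL local_infile = 1;   (requires sufficient privileges)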
The mysqlimport Utility:
Chris Eiffel wrote here about how he used the mysqlimport utility, a command-line interface to the LOAD DATA command, to import a large CSV file into a remote MySQL server.
https://dev.mysql.com/doc/refman/8.0/en/mysqlimport.html
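For reference, here is a sketch of how the same import could be run with mysqlimport over the network. Note that mysqlimport derives the table name from the CSV file name (eventsbig.csv loads into the table eventsbig); the host, user, and file path below are placeholders:

# mysqlimport is a command-line front end to LOAD DATA INFILE;
# --local reads the CSV from the client machine
mysqlimport --local \
  --fields-terminated-by=',' --fields-optionally-enclosed-by='"' \
  --lines-terminated-by='\n' --ignore-lines=1 \
  --columns='evt_type,evt_name,billing_date' \
  -h db.example.com -u dbuser -p testdb c:/csv/eventsbig.csv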
NOTE: When using the mysqlimport command over SSH, keep the following in mind:
- Be sure that you have an SSH account/terminal access on the remote server/computer that runs the MySQL server (one way to run the import over SSH is sketched after this list).
- If your database is big, remember to create the index/primary/unique keys on the tables before you upload the CSV file to populate the database.
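A minimal sketch of one way to run the import over SSH: copy the CSV file to the remote machine and run mysqlimport there, so the bulk data does not have to stream over a slow client connection (host names, paths, and credentials are placeholders):

# Copy the CSV to the remote host, then run mysqlimport on that host over SSH
scp c:/csv/eventsbig.csv dbuser@db.example.com:/tmp/eventsbig.csv
ssh -t dbuser@db.example.com "mysqlimport --local --fields-terminated-by=',' \
  --ignore-lines=1 -u dbuser -p testdb /tmp/eventsbig.csv"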
A PHP script posted by Sanath can also help you upload a large CSV file to a remote MySQL database server. You can download it from here.
Happy programming!
Another pure PHP approach: a PHP script that imports a huge CSV file into a MySQL database (asynchronously and fast!).
This code was tested with 400,000 rows and the import finished in seconds.
https://stackoverflow.com/questions/32504778/how-to-import-huge-csv-file-with-200-00-rows-to-mysql-asynchronous-and-fast