Importing Large Database Dumps

I recently acquired a client with an existing site that I needed to copy locally and set up under version control. Usually this isn't an issue, but this particular site has 50,000+ entries and a database over 1.1 GB in size. I was able to take a database dump with both Navicat and phpMyAdmin, but importing it proved problematic and failed several times. Importing over SSH also failed, with no error message.
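
For reference, this is roughly the command-line import I was attempting over SSH; the user, host, and database names below are placeholders, and the MySQL client will prompt for the password.

    # Copy the dump to the server, then feed it to mysql from the shell.
    # user, example.com, and mydb are placeholders.
    scp dump.sql user@example.com:~/
    ssh user@example.com
    mysql -u user -p mydb < dump.sql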

I did some research and decided to give BigDump a shot. BigDump was super easy to set up and started working right away. It looked like it was going to take a long time, but at least I'd have the database imported. However, about five minutes in, the import failed, and I was informed that it was due to extended inserts. The solution was to take a new dump with extended inserts turned off and try again.
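
For context, extended inserts pack many rows into a single INSERT statement, which is what tripped up BigDump; with the option off, each row gets its own statement that a chunked importer can process. A simplified illustration (the table and values here are made up):

    -- Extended insert: thousands of rows in one giant statement.
    INSERT INTO entries VALUES (1,'a'), (2,'b'), (3,'c') /* ...many more... */;

    -- With extended inserts off: one statement per row.
    INSERT INTO entries VALUES (1,'a');
    INSERT INTO entries VALUES (2,'b');
    INSERT INTO entries VALUES (3,'c');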

That's easy enough in phpMyAdmin. When taking the dump, go to Export and choose "Custom - display all possible options". Then scroll down to "Syntax to use when inserting data" and choose the first option.
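
If you have shell access, mysqldump can do the same thing; its --skip-extended-insert flag writes one INSERT statement per row (user and mydb are placeholders):

    # Dump with one INSERT per row instead of multi-row extended inserts.
    mysqldump -u user -p --skip-extended-insert mydb > dump.sql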

This is fine, but I prefer to use Navicat, which lets me work on the database remotely and has a much better interface than phpMyAdmin. I had the same problem here, and it took some looking, but I was able to find where to turn off extended inserts. Open your database, right-click, and choose Data Transfer. Set up the connection to transfer from and the target connection (you can choose either another connection or a file), then go to the Advanced tab and uncheck "Use extended insert statements" (it's checked by default). Now everything works as expected.