
If you are a developer, no doubt you have a local development environment in which you build your website projects, testing every function and every change so that once it’s ready to go live, you can simply upload the files in preparation for launch.

Once you get a few projects under your belt, and especially when using fully featured Content Management Systems (CMS) like Joomla! or WordPress, where you can make 60-70% of the changes you need directly through the back-end administration, you’ll find your local development version of a website quickly falls out of date compared to the live site.

This is the point at which I recently found myself (again). What I normally do at this point is a quick update of my local site by copying all the files directly from the live site to my testing server, but that’s the easy bit. The hard part is copying the database from the live site to the local site, because that’s where much of the configuration is stored, rather than in the static files you’ve already copied to your hard drive.

If you have been working with MySQL for any length of time, I hope you’ve come across phpMyAdmin, which, for the uninitiated, is quite possibly the best open source tool written in PHP: it lets you administer MySQL databases over the Internet in a web browser.

Well, using phpMyAdmin, you can quite easily export the entire database to a .SQL dump file, which for my particular project was well over 50MB of pure data. The download will take a few minutes, depending on your connection speed, but the real problem is importing a file that size into the local MySQL database where you do all your testing. Now, I don’t have exact limits, but uploading large files through a web browser is fraught with problems, most notably because the scripts handling the upload tend to time out or hit an upload size limit, and for files much larger than 50MB, that presents a real problem.
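As an aside, if you happen to have shell access on your live server, mysqldump will give you the same .SQL file without going through phpMyAdmin at all; a minimal sketch, with the username and database name as placeholders:

mysqldump -u [USERNAME] -p [DATABASE] > dump.sql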

I’ll admit that there are ways around this, such as raising the timeout and upload limits for the script handling the upload or changing the browser’s own settings, but there is an easier way to do this, and I’m going to explain how (hopefully).
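For reference, those timeout and upload limits live in your server’s php.ini, and raising them looks something like this (the values here are purely illustrative, and on shared hosting you often can’t change them at all):

upload_max_filesize = 128M
post_max_size = 128M
max_execution_time = 600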

The easiest way to do it is via the command line on your computer. Now, I know the mere mention of the command line makes many beginners freak out, but it’s easier than you might think.

Step 1: Fire up the Command Prompt: go to Start > Run, type cmd and hit Enter/Return.

Step 2: Navigate to the directory where mysql.exe is located. For XAMPP users, it’s probably:
C:\xampp\mysql\bin\
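If the command line is unfamiliar, this one command gets you there (adjust the path if XAMPP is installed somewhere else):

cd C:\xampp\mysql\bin\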

Step 3: Copy the SQL file (let’s call it dump.sql) that you exported from your live database into the same folder where mysql.exe is located. Any folder is fine, but keeping it in the same folder makes the commands easier.
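You can do that in Explorer, or with the copy command in the same Command Prompt window (the source path below is hypothetical, so substitute wherever your browser saved the download):

copy C:\Users\Me\Downloads\dump.sql C:\xampp\mysql\bin\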

Step 4: Type the following and hit Enter/Return:

mysql -u [USERNAME] -p -h [HOSTNAME] [DATABASE] < [FILE]

As an example, the command for Step 4 may look something like this:

mysql -u root -p -h localhost employees_database < new_employees.sql
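To unpack that: -u root is the MySQL username, -p tells the client to prompt you for that user’s password, -h localhost is the host the MySQL server is running on (for a local XAMPP install, it will be localhost), employees_database is the database you’re importing into, and the < feeds the contents of new_employees.sql to the client. If you skipped Step 3, just replace new_employees.sql with the full path to your dump file.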

Step 5: Once you hit Enter/Return, you’ll be prompted for the password associated with the [USERNAME] you gave in the command.

Step 6: Wait. If it’s a large file you won’t see anything happen; what you’re waiting for is a new command prompt to appear, which is how you know the command has been processed, e.g.: C:\xampp\mysql\bin>
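If watching a blank window makes you nervous, adding -v (verbose) to the Step 4 command makes the mysql client echo each statement as it runs, so you can at least see that something is happening, though it will slow the import down a little:

mysql -u root -p -v -h localhost employees_database < new_employees.sql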

Well, that’s pretty much it. Head over to your local installation of phpMyAdmin to check that all the tables are there. Remember to make backups of backups before you export, import or change anything. It costs nothing to be a little more cautious when handling large volumes of data.
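And if you’ve warmed to the command line by now, you can do that check without leaving it; this lists the tables in the example database from Step 4 (swap in your own database name):

mysql -u root -p -e "SHOW TABLES;" employees_database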