Manual database / website backup (export)

Hey everybody :)

Nice of you to check out this thread. This is a simple reminder not to forget to do a manual backup once in a while. You should still have your automatic backups, but a manual backup never hurts. It gives you a better impression of how big your website is file-wise, and how large your database has grown. That is especially handy right before doing some maintenance.

These commands are how I used them today on our Ubuntu Linux setup; for macOS/Windows and some other Linux distributions this might differ, but the principle is the same. If you have alternative instructions, please don't hesitate to share them.

The most important data lies in the database itself, so I prefer to at least get a dump of that, even before I run a repair and optimize on any tables. I just want the data as it is on the system right now; I can always do some maintenance on the system afterwards and make another dump to compare the results.
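If you do want to run that repair and optimize step afterwards, the mysqlcheck tool that ships with MySQL can do it from the same shell. A minimal sketch, using the same placeholder names as the rest of this post:

Code:
# Check all tables in the database and repair any that turn out corrupted
mysqlcheck -u sqlUsername -p --auto-repair sqlDatabase
# Then optimize the tables
mysqlcheck -u sqlUsername -p --optimize sqlDatabase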

Tip: Once you have a backup, don't forget to test it!
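What does testing mean here? As a quick sanity check you can confirm the dump ends with mysqldump's completion marker, and for a real test you restore it into an empty scratch database (testdb below is a hypothetical database you'd create just for this) and poke around in it:

Code:
# A healthy dump ends with a line like: -- Dump completed on 2018-03-14 ...
tail -n 1 backup-sqlDatabase.sql

# The real test: restore into an empty scratch database and query it
mysql -u sqlUsername -p testdb < backup-sqlDatabase.sql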

I usually connect with ssh to the system the MySQL database is running on, and you can (should) do this as a regular user; you don't need to be root/sudo for this one. Once connected, I run the mysqldump command with the user and database name, and tell it to write the result to an .sql file. That file I then compress with the tar command into a gzipped tarball, which I download offsite and archive properly.

Tip: Make sure your filenames are accurate, and your archived file is consistent with the rest of the archive and at the least mentions what kind of file it is!

Code:
mysqldump -u sqlUsername -p sqlDatabase > backup-sqlDatabase.sql

Tip: Do not include the password in the command; there's no need for it to end up in your bash history.
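If you want to skip the password prompt entirely without putting the password on the command line, the MySQL client tools read a ~/.my.cnf file from your home directory. A sketch with placeholder credentials; just make sure only you can read the file:

Code:
# Contents of ~/.my.cnf (placeholder credentials)
[client]
user=sqlUsername
password=yourSecretPassword

# Lock it down so only your own user can read it
chmod 600 ~/.my.cnf

With that in place, mysqldump sqlDatabase > backup-sqlDatabase.sql works without the -u and -p flags.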

Okay, let's walk through that command, and then I'll show you what it looks like when you use it.

mysqldump
This is the program we're using to export sql data.

-u sqlUsername
The -u basically means: use this username.
The sqlUsername part is the username that has access to the database we want to do something with.
For example, the omgboards.com database user might be called omgforums, so then this would be: -u omgforums

-p sqlDatabase
Please note! This might be confusing, but the -p is ONE thing, and sqlDatabase is ANOTHER thing. The sqlDatabase is NOT the password.
The -p basically means: ask me for the password when I press Enter.

sqlDatabase
The sqlDatabase part refers to the database we want to do something with. Again, it shows up right after the -p parameter, but it's not the password.
For example, on omgboards.com we might call our database omgdb, so it would look like: -u omgforums -p omgdb. I hope that makes sense.

> backup-sqlDatabase.sql
The > character basically means: redirect the output of this command into the file that follows.
The backup-sqlDatabase.sql part is the filename we wish to use; this is where we write to. Note that you don't have to create this file first, it will be created if it doesn't exist.
For example, on omgboards.com we might call this backup file: sqlbackup-omgboards.sql

Okay, so adjust those values to your specific database with its specific user, and press Enter. It will ask for a password. The password is not your system password, but the password for the database user you've provided.

For example, this is how it looks:

Code:
userx@omgboards:~/backups$ mysqldump -u omgforums -p omgdb > sqlbackup-omgboards.sql
Enter password:
userx@omgboards:~/backups$ ls -alh
drwxr-xr-x 2 userx userx 4.0K Mar 14 13:52 .
drwxr-xr-x 7 userx userx 4.0K Mar 14 13:44 ..
-rw-rw-r-- 1 userx userx 900M Mar 14 13:52 sqlbackup-omgboards.sql

Okay, let's take a step back. This seems quite confusing, but it really isn't.
Usually you have the database details properly written down (most web apps have a config.php file, like the XenForo software we're currently using for our website here), and those details are a copy/paste away from the command. Basically all you have to do is enter a command to dump the database to a single file, then compress it and download it to your archive.
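You can even combine the dump and the compression into a single step by piping mysqldump straight into gzip, so the uncompressed .sql file never touches the disk. A sketch with the same placeholder names:

Code:
mysqldump -u sqlUsername -p sqlDatabase | gzip > backup-sqlDatabase.sql.gz

# And to restore it later, pipe the other way around:
gunzip < backup-sqlDatabase.sql.gz | mysql -u sqlUsername -p sqlDatabase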

Let's get to that part, the compressing of the backup and archiving it.

To archive things properly, we should at least zip up the .sql dump. Why? Because it's usually just text. Most database .sql dumps consist of threads, posts, user data, forum structures, etc. This is all text, which compresses quite nicely. The other thing we want to do is create a tarball (tar is an archiving utility which also supports compression).

I personally prefer gzip archives, so I will compress it into .gz, but you can look up the manual for tar and use bzip2 or xz, etc.
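For reference, with GNU tar the compressor is just a different flag; the c and f are the same ones explained below:

Code:
tar -czf backup.tar.gz  backup-sqlDatabase.sql   # gzip  (fast, good ratio)
tar -cjf backup.tar.bz2 backup-sqlDatabase.sql   # bzip2 (slower, smaller)
tar -cJf backup.tar.xz  backup-sqlDatabase.sql   # xz    (slowest, smallest)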

The command for our example file, backup-sqlDatabase.sql, would be:

Code:
tar -czf backup-mysql-march-14th-2018-backup-sqlDatabase-export.tar.gz backup-sqlDatabase.sql

Please note: the file we have is "backup-sqlDatabase.sql", so I drop the .sql from it and include the "backup-sqlDatabase" part inside the resulting filename. That's the way I do things; it works for me.

Okay, let's walk through that tar command.

tar
The archiving command we're using to create tarball gz files.

-czf
Three flags telling the tar command we're not extracting, but creating an archive (c), compressing it with gzip (z), and writing it to the file named next (f).

backup-mysql-march-14th-2018-backup-sqlDatabase-export.tar.gz
This part is chosen by me; it's the resulting filename we end up with. I strongly recommend ending it with .tar.gz (regardless of what name you give the file).
To break down the filename: I always prefix my archivable exports with backup-, then I add a readable timestamp (you can use whatever you want of course), then the filename, and since this was an sql export, I note that it's an export.
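If you don't want to type the date by hand, the date command can build that timestamp for you; +%F prints it as, for example, 2018-03-14:

Code:
tar -czf "backup-mysql-$(date +%F)-backup-sqlDatabase-export.tar.gz" backup-sqlDatabase.sql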

For example, this is how it looks:
Code:
userx@omgboards:~/backups$ tar -czf backup-mysql-march-14th-2018-backup-sqlDatabase-export.tar.gz backup-sqlDatabase.sql
userx@omgboards:~/backups$ ls -alh *.gz
-rw-rw-r-- 1 userx userx 250M Mar 14 13:54 backup-mysql-march-14th-2018-backup-sqlDatabase-export.tar.gz
-rw-r--r-- 1 userx userx 980G Mar 14 13:40 backup-omgboards-com-march-14th-2018-backup-omgWebfiles-export.tar.gz
userx@omgboards:~/backups$

As you can see here, the compressed tarball ends up as a 250 megabyte file. That takes up way less space, it's faster and easier to download, and you can fit more exports in your archive without filling up the drives so quickly.
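Before you delete the uncompressed .sql file, it doesn't hurt to verify the tarball is actually readable. Listing its contents to /dev/null stays silent on success and prints an error if the archive is damaged:

Code:
tar -tzf backup-mysql-march-14th-2018-backup-sqlDatabase-export.tar.gz > /dev/null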

What you will also spot in that listing is the webfiles export, which was already there. We will get into that in the second post of this thread :)

Before we do that, let's quickly summarize what we've done.

TLDR
Backing up a website is important; make a manual backup and test it. Using the mysqldump command you can tell the system to dump a specific database to the file system. Before downloading and archiving your backup, you can use the tar command to create a compressed tarball for easy downloading and storing. Being able to quickly dump your database and archive it makes it easier to roll back to previous working versions, clone and test upgrades of software/sites, fix problems, or move to another host.
mysqldump -u [username] -p [database name] > [database name].sql
tar -czf [filename].tar.gz [database name].sql
 

Now that we know how to use tar, we can also use it to back up the web files. These are not just the files from the software you're using (which are easy to replace if you have the original release files): they might have been customized, there might be uploads/ or data/ directories filled by users, there might be attachments/ or profile/ pictures, or simply files for caching, mod_rewrite instructions, and other things that get added through the years and don't quite fit in an sql database.

We're going to assume you're using Apache's default /var/www/ directory for your website. Check your Apache configuration and vhost files to see where you store your website files. For example, for omgboards.com we have it under /var/www/omgboards.com/public_html/*
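Not sure where that is on your system? On Ubuntu with Apache the enabled vhosts usually live under /etc/apache2/sites-enabled/ (paths may differ on other distributions), so a quick grep shows every DocumentRoot:

Code:
grep -Ri documentroot /etc/apache2/sites-enabled/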

To use tar to compress all the files from that directory, we will be using this command:

Tip: We log in as userx, a regular user, not as root or via sudo, and we have the correct permissions to read the website files and write the archive.

Code:
tar -czf backup-omgboards-com-march-14th-2018-backup-omgWebfiles-export.tar.gz /var/www/omgboards.com

Please note: we use the omgboards.com/ directory, and not just omgboards.com/public_html/, because for security reasons we've moved some important files that the software includes outside of the public html directory, and of course we wish to include those in our backups.
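If your web directory contains big cache or temp directories you don't care about, GNU tar can skip them with --exclude. The patterns below are hypothetical examples; adjust them to whatever your software actually uses:

Code:
tar -czf backup-omgboards-com-march-14th-2018-backup-omgWebfiles-export.tar.gz \
  --exclude='*/cache/*' --exclude='*/temp/*' \
  /var/www/omgboards.com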

tar -czf
See the previous post: this tells tar to create an archive (c), compress it with gzip (z), and write it to the file named next (f). The f basically means 'a file on the file system, not some tape storage device from back in the day'. And yes, what you archive can also be a directory, as we do here.

backup-omgboards-com-march-14th-2018-backup-omgWebfiles-export.tar.gz
Again, I prefix it with backup-, then I note which website these files are for, then the date, and a name for what I am compressing (in this case there wasn't really a single source file first, but it still tells me this is a backup of the omg web files), and that it's an export. Ending with the postfix .tar.gz.

You're done. Unless there was an error, which can occasionally happen when you don't have permission to read a sub directory or a file within it. To resolve that, run the command with sudo.

Now all you have to do is transfer the .tar.gz file to your archive.
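From your own machine, scp (or rsync) will pull the archive down. Run this locally, not on the server; the hostname and paths are of course placeholders:

Code:
scp userx@omgboards.com:backups/backup-omgboards-com-march-14th-2018-backup-omgWebfiles-export.tar.gz ~/archive/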

TLDR
To back up thousands of files and directories from a website, you can use the tar command to archive them into a single file, compress it, and download it to your archive.

Problem solving:
Really having permission issues?
Then you could sudo su - into the root user account, compress it with tar, and move it to your regular user's directory. After that you can chown userx:userx file.tar.gz and then download it.
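Spelled out as commands, that workflow looks roughly like this; the filenames are examples:

Code:
sudo su -
tar -czf /tmp/backup-webfiles.tar.gz /var/www/omgboards.com
mv /tmp/backup-webfiles.tar.gz /home/userx/backups/
chown userx:userx /home/userx/backups/backup-webfiles.tar.gz
exit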

Not including hidden files?
If you run tar from inside the directory you're backing up and use a shell wildcard (*), it won't include hidden files like .htaccess. Either point tar at the directory path itself (tar -czf blah.tar.gz /path/backup/) or, from inside the directory, use a . (dot) as the source: tar -czf blah.tar.gz .
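You can verify the hidden files made it in by listing the archive's contents and grepping for one of them:

Code:
tar -tzf blah.tar.gz | grep htaccess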
 