Initial Backup
Now we take an initial backup of your site. During the process, you’ll be able to view real-time updates on its progress.
The initial website or database backup is a complete retrieval of every file that CodeGuard has access to. Depending on the number of files and their total size, the initial backup can take up to 72 hours. After this first backup, future backups are differential, both in the files that are transferred and in the files that are stored on your behalf. CodeGuard relies on a queueing system, and the backup process is not unlike FedEx package pickup and delivery; in fact, we modeled our user experience after that process.
The four main steps of the initial backup are (i) process initiation, (ii) file pickup, (iii) file transit, and (iv) final delivery. Process initiation consists of verifying your credentials and transmitting them to CodeGuard. File pickup begins with an analysis of the file structure and the creation of a Git repository within Amazon's Elastic Compute Cloud (EC2). The transit process begins once the file structure has been analyzed and the list of files to be transmitted is finalized.
Files are then transferred to EC2. Once all files have arrived, they are committed to the Git repository on EC2, but they do not remain there for long. As soon as the Git commit is finished, the files are compressed and sent to Amazon Simple Storage Service (S3), where they are encrypted using industry-leading AES-256 techniques. The last step of the process is the deletion of the files from their temporary storage on EC2.
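To make the final delivery step concrete, here is a minimal Python sketch. It is not CodeGuard's actual code; the repository path, bucket name, and object key are hypothetical. It only illustrates the sequence described above: commit to Git, compress the repository, upload to S3 with AES-256 server-side encryption, and remove the temporary copies.

```python
import os
import shutil
import subprocess
import tarfile

import boto3  # AWS SDK for Python

REPO_PATH = "/tmp/backup-repo"            # hypothetical Git repository on EC2
ARCHIVE_PATH = "/tmp/backup-repo.tar.gz"  # temporary archive to upload
BUCKET = "example-backup-bucket"          # hypothetical S3 bucket
KEY = "site-backups/initial.tar.gz"       # hypothetical object key

# Commit everything that was downloaded into the Git repository.
subprocess.run(["git", "-C", REPO_PATH, "add", "-A"], check=True)
subprocess.run(["git", "-C", REPO_PATH, "commit", "-m", "Initial backup"], check=True)

# Create a tarred and gzipped archive of the repository.
with tarfile.open(ARCHIVE_PATH, "w:gz") as tar:
    tar.add(REPO_PATH, arcname="backup-repo")

# Upload the archive to S3 and ask S3 to encrypt it at rest with AES-256.
s3 = boto3.client("s3")
s3.upload_file(
    ARCHIVE_PATH,
    BUCKET,
    KEY,
    ExtraArgs={"ServerSideEncryption": "AES256"},
)

# Finally, delete the temporary copies from the EC2 instance.
shutil.rmtree(REPO_PATH)
os.remove(ARCHIVE_PATH)
```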
To find out which plan is best for you, and for pricing details, please contact us.
Detailed Walkthrough
Static Content (FTP/SFTP)
We capture all file content, and the process can take up to two days depending on the size of your site, the number of files it contains, and the wait in the queue. Subsequent backups will not take as long or be as resource-intensive. The initial backup proceeds through the following steps (a brief code sketch of the connection and download portion appears after the list):
- Test the connection to the site using the same protocol (FTP, SFTP) that the backup will use
- Create a Git repository on the local server instance (i.e. EC2)
- Build a list of files for the site
- Add all of those files to the download queue
- Download each file to the Git repository (on the local machine)
- Commit all the downloaded files to the Git repository
- Create a tarred and gzipped archive of the Git repository
- Upload the archive to Amazon S3, which simultaneously encrypts the archive and all of its contents
- Build the file mix
- Record the statistics from the backup
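As a rough illustration of the first few steps, here is a hedged Python sketch using the standard ftplib module. The host, credentials, and paths are hypothetical, directory detection and path handling vary between FTP servers, and this is not CodeGuard's actual code; it only shows testing the connection, building the file list, downloading into a local Git repository, and committing.

```python
import os
import subprocess
from ftplib import FTP, error_perm

HOST = "ftp.example.com"      # hypothetical site details
USER = "backup-user"
PASSWORD = "secret"
REPO_PATH = "/tmp/site-repo"  # local Git repository on the backup server


def is_directory(ftp, name):
    """Crude check: treat a path as a directory if we can cwd into it."""
    original = ftp.pwd()
    try:
        ftp.cwd(name)
        ftp.cwd(original)
        return True
    except error_perm:
        return False


def list_files(ftp, path="."):
    """Recursively build the list of files for the site."""
    files = []
    for name in ftp.nlst(path):
        if name in (".", ".."):
            continue
        if is_directory(ftp, name):
            files.extend(list_files(ftp, name))
        else:
            files.append(name)
    return files


# 1. Test the connection using the same protocol the backup will use.
ftp = FTP(HOST)
ftp.login(USER, PASSWORD)

# 2. Create a Git repository on the local server instance.
os.makedirs(REPO_PATH, exist_ok=True)
subprocess.run(["git", "init", REPO_PATH], check=True)

# 3./4. Build the list of files for the site (the download queue).
queue = list_files(ftp)

# 5. Download each file into the Git repository.
for remote_path in queue:
    local_path = os.path.join(REPO_PATH, remote_path.lstrip("/"))
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    with open(local_path, "wb") as handle:
        ftp.retrbinary("RETR " + remote_path, handle.write)

# 6. Commit the downloaded files; archiving and upload then follow as above.
subprocess.run(["git", "-C", REPO_PATH, "add", "-A"], check=True)
subprocess.run(["git", "-C", REPO_PATH, "commit", "-m", "Site backup"], check=True)

ftp.quit()
```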
Dynamic Content Overview (FTP/SFTP)
Database backups are stored as flat text files whose content is a list of executable SQL statements which, when run, will recreate the database in its entirety. The tables are stored this way so that when a user initiates a restore, no transformation of the data is needed; the file can be run as-is to restore the database.
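Because the backup is plain SQL, restoring it amounts to replaying the statements against the database. The following is a minimal sketch, not CodeGuard's restore tooling, assuming a hypothetical dump file and MySQL connection details:

```python
import subprocess

# Hypothetical locations and credentials -- adjust for your environment.
DUMP_FILE = "database-backup.sql"  # the flat text file of SQL statements
DB_NAME = "example_db"
DB_USER = "example_user"
DB_HOST = "localhost"

# Feed the stored SQL statements straight into the mysql client.
# Because the backup is already executable SQL, no transformation is needed.
# The -p flag prompts interactively for the database password.
with open(DUMP_FILE, "rb") as dump:
    subprocess.run(
        ["mysql", "-h", DB_HOST, "-u", DB_USER, "-p", DB_NAME],
        stdin=dump,
        check=True,
    )
```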
We connect and export the entire contents of the database using mysqldump. This provides us with the database schema (the list of all of the database tables and the columns they contain). The content of the database tables is then transformed into a format that allows us to use version control (Git) to track the changes made to your data. The specific flow of the process is as follows; it is sequential, and a failure at any step will stop the process:
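Before the step-by-step flow, here is a rough Python sketch of the export and Git-friendly transformation just described. The connection details are hypothetical, and the --skip-extended-insert option is only one common way to produce a line-oriented dump that diffs well under version control; the article does not state which options CodeGuard uses.

```python
import subprocess

# Hypothetical connection details -- not CodeGuard's actual configuration.
DB_NAME = "example_db"
DB_USER = "example_user"
DB_HOST = "db.example.com"
DUMP_FILE = "database-backup.sql"

# Export the entire database as executable SQL statements.
# --skip-extended-insert writes one INSERT per row, keeping the dump
# line-oriented and therefore friendly to Git's line-based diffing.
with open(DUMP_FILE, "wb") as dump:
    subprocess.run(
        [
            "mysqldump",
            "--skip-extended-insert",
            "--single-transaction",
            "-h", DB_HOST,
            "-u", DB_USER,
            "-p",          # prompts for the database password
            DB_NAME,
        ],
        stdout=dump,
        check=True,
    )

# The dump can then be committed to version control so that changes between
# backups are tracked (assuming the current directory is a Git repository).
subprocess.run(["git", "add", DUMP_FILE], check=True)
subprocess.run(["git", "commit", "-m", "Database backup"], check=True)
```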