
Large MySQL database backup solutions

Jawn

Straight from Sweden · Established Member
Heya,
I'm looking for a ready-made backup solution. I currently run two dedicated servers with fairly large databases, some up to 400 MB, and I need a program or script that downloads large MySQL databases to my computer.

I found a PHP script that takes a mysqldump and uploads it to an FTP server, but I'd guess it would time out pretty quickly.

So I'm wondering if someone knows of something that would do this for me.

Best regards,
Jawn
 
The views expressed on this page by users and staff are their own, not those of NamePros.
The best option would probably be MySQL's mysqldump. The resulting SQL script can then be zipped and transferred wherever you want.

I could see potential problems during the backup if the database is being accessed simultaneously, as well as during a lengthy transfer.

If you really want to do it via a PHP script, you'd need to make sure the code doesn't run with a time limit.
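Compressing the dump as it streams keeps the transfer small. A minimal sketch in Python, run server-side (e.g. from cron) so no web-server time limit applies; the user, database, and output path are placeholders, and `--single-transaction` assumes InnoDB tables:

```python
import shlex
import subprocess

def dump_command(user, db, outfile):
    """Build a mysqldump pipeline that gzips the dump on the fly.

    user/db/outfile are placeholders; adjust to your own setup.
    """
    # Streaming through gzip means a 400 MB database becomes a
    # much smaller file before it is ever written to disk.
    return "mysqldump -u {} --single-transaction {} | gzip > {}".format(
        shlex.quote(user), shlex.quote(db), shlex.quote(outfile))

def run_dump(user, db, outfile):
    # shell=True is needed here because the command is a pipeline.
    subprocess.run(dump_command(user, db, outfile), shell=True, check=True)
```

The resulting `.sql.gz` file can then be fetched over SFTP/FTP instead of generating the dump inside a PHP request.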
 
If you have access to your MySQL server, you could do an initial dump (while locking the database) and enable the binary logs. From then on, you only need to back up the dump once, plus the binary logs generated after each rotation.

It may take some time to restore your system, but at least your data is safe.

Here's an example:
1. Prepare the MySQL server by enabling binary logs (if not already done) in my.cnf:
server-id = 1
log_bin = /var/log/mysql/mysql-bin.log
max_binlog_size = 100M

This will create some files in /var/log/mysql/:
mysql-bin.000001
mysql-bin.000002
mysql-bin.000003
...
mysql-bin.index

Then restart MySQL. (You may want to limit the binary logs to a single database.)

2. Execute the following MySQL command:
FLUSH TABLES WITH READ LOCK;

3. Dump your database. At this point you can also delete the old binary log files.

4. Execute the following MySQL commands:
FLUSH LOGS;
UNLOCK TABLES;

From this point on, you only need to keep your dump plus the binary logs generated since that dump.
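One caveat: the read lock from step 2 only holds for as long as the client session that took it stays open, so steps 2–4 really need to run in a single session. As a hedged alternative, mysqldump can perform the lock/dump/flush sequence itself; a sketch of the invocation (user and database names are placeholders, and `--single-transaction` assumes InnoDB):

```python
def binlog_dump_command(user, db):
    """Build a mysqldump invocation roughly equivalent to steps 2-4 above.

    user and db are placeholders for your own credentials and schema.
    """
    return [
        "mysqldump",
        "-u", user,
        "--flush-logs",          # rotate the binary log at dump time (step 4's FLUSH LOGS)
        "--master-data=2",       # record the binlog coordinates as a comment in the dump
        "--single-transaction",  # consistent snapshot without a global lock (InnoDB only)
        db,
    ]
```

Because `--flush-logs` rotates the log at dump time, the newest binary log starts exactly where the dump ends, which is what makes the incremental scheme below work.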

I usually do a full dump every 3 months and then archive the binary logs daily, which saves me from transferring 5 GB of database dumps every day; the trade-off is that it takes three hours to regenerate the database on a server.
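For the restore, you load the dump first and then replay the binary logs in the order they were written. A small helper to put rotated logs into replay order, assuming the default `mysql-bin` naming from the my.cnf above:

```python
import re

def binlogs_in_replay_order(filenames):
    """Return binary log files in the order mysqlbinlog should replay them.

    Skips the index file and sorts by the six-digit numeric suffix,
    which is how the server numbers rotated logs.
    """
    logs = [f for f in filenames if re.fullmatch(r"mysql-bin\.\d{6}", f)]
    return sorted(logs, key=lambda f: int(f.rsplit(".", 1)[1]))
```

The replay itself is then something like `mysqlbinlog mysql-bin.000001 mysql-bin.000002 | mysql -u root -p` after loading the dump.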

HTH,
 