Script for automatic back up of database for cron jobs?

Started by Farmacija, February 16, 2007, 12:24:07 PM


Farmacija

does anybody have a script suitable for backing up the SMF database at regular intervals?
smf 1.1.2
linux server
www.farmaceuti.com
www.farmaceuti.com/tekstovi

Daniel15

After the backup is done, what should it do with the backup? Do you want it emailed to you, stored in your home directory, uploaded to an external server, or something else?
Daniel15, former Customisation team member, resigned due to lack of time. I still love everyone here :D.
Go to smfshop.com for SMFshop support, do NOT email or PM me!


Daniel15

February 17, 2007, 12:26:11 AM #3 Last Edit: February 17, 2007, 12:28:02 AM by Daniel15
Quote from: Farmacija
daniel, i want to store it in my home directory
OK, first, create a new database username and password (in the MySQL section of cPanel, under "Current Users:"). Then, under "Add Users To Your Databases:", add the user to your database (for security reasons, only give it the "SELECT" permission).

Then, in the Cron section of cPanel, create a cron job with the following command:
mysqldump -ucpuser_username -ppassword cpuser_database > /home/cpuser/backup.sql

cpuser_username: Database username (the one you created earlier)
password: Database password
cpuser: cPanel username
backup.sql: File to store the backup in
As an example:
mysqldump -udaniel15_backup -pOIHsa3dfa23SDGa daniel15_test > /home/daniel15/backup.sql

If you want to compress the backup using gzip, try a command line like this:
mysqldump -ucpuser_username -ppassword cpuser_database | gzip > /home/cpuser/backup.sql.gz

I haven't tried this, but it should work :)
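In case it helps, a raw crontab entry pairs a schedule with that command. A hypothetical line running the backup every night at 3:00 AM (same placeholder names as above; the time is just an example):

```shell
# minute hour day-of-month month day-of-week  command
# Run the backup every night at 3:00 AM.
0 3 * * * mysqldump -ucpuser_username -ppassword cpuser_database > /home/cpuser/backup.sql
```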

Mayhem30

... or you can just download the AutoMySQLBackup script and add that to cron.

http://sourceforge.net/project/showfiles.php?group_id=101066

A script to take daily, weekly and monthly backups of your MySQL databases using mysqldump. Features: backup multiple databases, a single backup file or a separate file for each DB, compress backup files, backup remote servers, e-mail logs, and more.

I use it and it works great!  :)


Farmacija

i've got this error

mysqldump: Got error: 1045: Access denied for user 'xxxxxx'@'localhost' (using password: YES) when trying to connect

Daniel15

Quote from: Farmacija on April 12, 2007, 11:25:46 AM
i've got this error

mysqldump: Got error: 1045: Access denied for user 'xxxxxx'@'localhost' (using password: YES) when trying to connect
That means the username and password you used were incorrect. Double-check the settings you specified.

Farmacija

Quote
mysqldump: Got error: 1044: Access denied for user 'farmaceu_blabla'@'localhost' to database 'farmaceu_wrdp2' when using LOCK TABLES

Daniel15

Make sure that user has full permissions on that database. You'll need to grant "ALL" permissions when adding the user to the database.

Farmacija

you said above that I should give it just the SELECT permission!

Daniel15

Sorry about that... I thought SELECT would cover everything, but it looks like it tries to LOCK the tables. SELECT worked for me using MySQL 4.1, but I guess it doesn't work like this for all versions.
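For what it's worth, mysqldump can also be told not to lock tables at all, which should let a SELECT-only user work. This is an untested variation of the earlier cron command (same placeholder names):

```shell
# --skip-lock-tables stops mysqldump from issuing LOCK TABLES,
# so the SELECT-only user from earlier should suffice (untested sketch).
mysqldump --skip-lock-tables -ucpuser_username -ppassword cpuser_database > /home/cpuser/backup.sql
```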

Farmacija

No one is perfect :)
and where is the backup saved? I can't find it in my home directory...

Daniel15

The > /home/daniel15/backup.sql at the end tells it where to write the file. I assume you changed this to suit your setup? You'll need to look in your home directory; it won't be in the "public_html" directory...


Farmacija

how can we do this for several databases on one server?

Daniel15

I haven't tried this, but I just came up with a script that should do it properly. First, make a file called "database-backup.php" in your home directory, and put something like this in it:

#!/usr/bin/php -q
<?php
// Where do we want to store the backups?
$output = '/home/username/backups';

// Some MySQL stuff
$host = 'localhost';
$username = 'mysql_username';
$password = 'mysql_password';

$databases = array(
	'database1',
	'database2',
	// ... add more if you need to, like this:
	// 'database name',
);

foreach ($databases as $database)
{
	exec('mysqldump -u' . $username . ' -p' . $password . ' ' . $database . ' | gzip > ' . $output . '/' . $database . '.sql.gz');
}
?>



Create a "backups" directory to store the backups (make sure $output in the script above is correct!), make the script executable (chmod +x database-backup.php), and then change the cron job to something like:
/home/username/database-backup.php

Hope this works for you :)
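If you would rather avoid PHP entirely, the same loop can be sketched as a plain shell function. This is untested against a real server; the credentials, output directory, and database names are all placeholders to adjust:

```shell
#!/bin/sh
# Placeholder credentials -- replace with the backup user created earlier.
DB_USER=mysql_username
DB_PASS=mysql_password

# Dump each named database into its own gzipped file under the given directory.
backup_databases() {
	outdir="$1"; shift
	for db in "$@"; do
		mysqldump -u"$DB_USER" -p"$DB_PASS" "$db" | gzip > "$outdir/$db.sql.gz"
	done
}

# Example call (uncomment and adjust before pointing cron at this script):
# backup_databases /home/username/backups database1 database2
```

Saved as, say, database-backup.sh and made executable, it can be called from cron the same way as the PHP version.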

SleePy

I use one similar to this


<?php
// Ok.. Get our vars in here.
$db['host'] = 'localhost';
$db['user'] = 'root';
$db['pass'] = 'access';
$db['goto'] = 'you@domain.com';
$db['from'] = 'cron@site.com';
$db['date'] = date("Y-m-d");
$db['temp'] = '/home/cpaneluser/tmp';

// What to call this sucker.
$mailfile = 'FULL-backup-' . $db['date'] . '.tar.gz';

// MySQL we need ya..
$link = mysql_connect($db['host'], $db['user'], $db['pass']);
if (!$link)
	die('Unable to connect to mysql server.. Could not create database backup.');

// Get the list of databases.
$result = mysql_query('show databases') or die('Couldn\'t show DBs: ' . mysql_error());

// For each database, put it into a nice array for use.
while ($row = mysql_fetch_array($result))
	$databases[] = $row['Database'];

// Dump each of the databases.
foreach ($databases as $dbs)
	passthru("mysqldump -u " . $db['user'] . " --password=" . $db['pass'] . " " . $dbs . " > " . $db['temp'] . '/' . $dbs . ".sql");

// Now we compress the whole temp folder.. wow...
@passthru('tar -czf ' . $mailfile . ' ' . $db['temp'] . ' -p > /dev/null 2>&1');

// Deletion time..
foreach ($databases as $dbs)
	if (file_exists($db['temp'] . '/' . $dbs . '.sql'))
		unlink($db['temp'] . '/' . $dbs . '.sql');

// The fun part.. Mail preparation..
$attachmentname = array_pop(explode("/", $mailfile)); // If a path was included, strip it out for the attachment name
$message = "Compressed database backup file $attachmentname attached.";
$mime_boundary = "<<<:" . md5(time());
$data = chunk_split(base64_encode(implode("", file($mailfile))));
$headers = "From: " . $db['from'] . "\r\n";
$headers .= "MIME-Version: 1.0\r\n";
$headers .= "Content-type: multipart/mixed;\r\n";
$headers .= " boundary=\"" . $mime_boundary . "\"\r\n";
$content = "This is a multi-part message in MIME format.\r\n\r\n";
$content .= "--" . $mime_boundary . "\r\n";
$content .= "Content-Type: text/plain; charset=\"iso-8859-1\"\r\n";
$content .= "Content-Transfer-Encoding: 7bit\r\n\r\n";
$content .= $message . "\r\n";
$content .= "--" . $mime_boundary . "\r\n";
$content .= "Content-Disposition: attachment;\r\n";
$content .= "Content-Type: Application/Octet-Stream; name=\"$attachmentname\"\r\n";
$content .= "Content-Transfer-Encoding: base64\r\n\r\n";
$content .= $data . "\r\n";
$content .= "--" . $mime_boundary . "\r\n";

// Now we finally mail it..
mail($db['goto'], 'MySQL backup file - ALL', $content, $headers, '-f' . $db['from']);

// Delete the big backup we just made..
unlink($mailfile);
?>
Jeremy D — Site Team / SMF Developer
Support the SMF Support team!
Profiles:GitHub
Join us on IRC Libera.chat/#smf

Farmacija

hm, which command should I then put in the cron job?
this one: mysqldump -ucpuser_username -ppassword cpuser_database > /home/cpuser/backup.sql ?

SleePy

You could do that if you want to back up a single database (or add one such command for each database).
If you are using the PHP script, the cron command is:

php -q /home/cpuser/path/to/phpscript.php
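As a concrete example, a full crontab entry pairs that command with a schedule. A hypothetical line running the script daily at 2:00 AM (the time is just an example; keep your own path):

```shell
# Run the PHP backup script every day at 2:00 AM.
0 2 * * * php -q /home/cpuser/path/to/phpscript.php
```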
