
Solved

Mysqldump backup script and crontab error

Posted on 2014-01-10
Medium Priority
727 Views
Last Modified: 2014-01-15
Hi,

I've got the following mysqldump backup script:

#/bin/sh
mysqldump -u username -pmypass --all-databases > DB.sql
gzip DB.sql
NOW=$(date +"mysql-backup-%H%M-%d-%m-%y")
mv DB.sql.gz /home/user/BACKUPS/"$NOW.gz"

It works fine when I start it manually. When I put it in crontab, I get the following error:

mysqldump: Couldn't execute 'SHOW TRIGGERS LIKE 'data\_29'': Got error 28 from storage engine (1030)

I researched this and error 28 means there is not enough storage. That was the case, but I freed up some space and I'm still getting the error. Also, under crontab the backup file is nearly half the size.
Question by:jackal077
3 Comments
 
LVL 9

Accepted Solution

by:
Red-King earned 2000 total points
ID: 39771644
Which user's crontab are you using to run the script, root or your own user?

I noticed you are missing the '!' from the first line of your script; it should be #!/bin/sh

If you are running the script from root's crontab, the working directory will be the root user's home directory, i.e. /root.
The DB.sql file will be created in the /root directory, so you will need to be sure that you have enough space there to hold the dump file.
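A quick way to confirm is to check the free space on the filesystems involved. Error 28 can also come from the MySQL server's tmpdir (often /tmp) filling up mid-dump, so it's worth watching that too. The paths here are examples; adjust them to your setup:

df -h /root /tmp /home/user/BACKUPS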

If you want to change the working directory you can simply add a line at the start of the script to change into it, e.g.:

#!/bin/sh
# timestamped name for this backup
NOW=$(date +"mysql-backup-%H%M-%d-%m-%y")

# work in the backup directory so the dump lands on the right filesystem
cd /home/user/BACKUPS
mysqldump -u username -pmypass --all-databases > DB.sql
gzip DB.sql
mv DB.sql.gz ./"$NOW.gz"



I imagine the backup is half the size because it is not completing.
You can also tell gzip to output to a specific filename to cut out the extra mv command; see the sketch after the tar example below.
I think the tar command would be (double-check the syntax as this is off the top of my head):

tar -czf ./"$NOW.tar.gz" DB.sql
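And for writing the gzip output straight to the timestamped filename, something like this should work (again untested; -c writes to stdout and leaves DB.sql in place, hence the rm):

# compress straight to the final name, then remove the uncompressed dump
gzip -c DB.sql > ./"$NOW.gz" && rm DB.sql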
 
LVL 26

Expert Comment

by:Tomas Helgi Johannsson
ID: 39773426
Hi!

To take a backup and compress the file in one line of code, do this:
#!/bin/sh
backfile=$(date +"mysql-backup-%H%M-%d-%m-%y")
myuser=yourusername
mypass=yourpassw
cd /home/user/BACKUPS
# dump everything and compress on the fly; nothing uncompressed ever hits disk
mysqldump -u "$myuser" -p"$mypass" --all-databases | gzip -9 > "$backfile.sql.gz"



This will compress your backup as much as possible on the fly and save storage.
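If you save that as a script, a root crontab entry along these lines will run it nightly. The script path and schedule are just examples; also note that cron runs with a minimal PATH, so you may need the full path to mysqldump (often /usr/bin/mysqldump):

# run the backup at 02:30 every night, logging output for troubleshooting
30 2 * * * /bin/sh /home/user/BACKUPS/mysql-backup.sh >> /home/user/BACKUPS/backup.log 2>&1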

Regards,
      Tomas Helgi
 

Author Comment

by:jackal077
ID: 39781959
Hi,
Yes, I must have missed the '!'. Cron runs under root. I didn't realize I had to set the working directory inside the script. That was it. Thx.
