Solved

rm huge backup file freezes web server

Posted on 2014-12-10
232 Views
Last Modified: 2014-12-15
Hi experts,

I found a problem on an Ubuntu server (ext3) when deleting huge backup files (50GB).
While deleting with "rm -r", the Apache web server freezes for about one minute.
I think it is because of the I/O load.

On a forum I found that using "ionice" might help, so I could try the following:

ionice -c2 -n7 rm -r /path/to/the/backup_file


Do you think it is fine to use this?
Do you have an idea how long the deletion process would take? Without ionice it takes about 4 minutes to delete a 50GB file.

The goal is to delete the huge file while the server is running, without affecting the website.
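For reference, a minimal sketch of the proposed command against a small stand-in file (the path is an example, not the real backup location). Note that ionice -c2 selects the "best effort" class and -n7 its lowest priority, and that these priorities only take effect under the CFQ I/O scheduler:

```shell
#!/bin/sh
# Sketch: delete a stand-in file at the lowest "best effort" I/O priority.
# -c2 = best-effort class, -n7 = lowest priority within that class.
# This only throttles I/O when the disk uses the CFQ scheduler.
f=/tmp/ee_backup_stub1                              # example path, not from the question
dd if=/dev/zero of="$f" bs=1M count=1 2>/dev/null   # small stand-in for the 50GB backup
ionice -c2 -n7 rm -r "$f"
```

You can check the active scheduler for a disk via /sys/block/<device>/queue/scheduler.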
Question by:Systemadministration
4 Comments
 
LVL 7

Expert Comment

by:Thomas Wheeler
ID: 40493208
Sounds like you need to figure out what's going on with your disk. What type of setup is this? Is it a single drive, a RAID, or an iSCSI mount? Deleting this file should not cause this result. Are there errors in the logs?
 

Author Comment

by:Systemadministration
ID: 40493212
This seems to be a known problem:
http://serverfault.com/questions/480526/deleting-very-large-file-without-webserver-freezing

My question is only about ionice.
 
LVL 14

Accepted Solution

by:
Phil Phillips earned 500 total points
ID: 40493222
A better strategy might be to store the backups on another server, or on another disk array. That way, disk-intensive operations won't affect the files that Apache is trying to serve.

That said, ionice should be fine as a workaround. Another option would be to truncate the file, which can be much faster than the rm command:

> /path/to/the/backup_file

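A gentler variant of this idea, not from the thread itself, is to shrink the file in steps with truncate -s, so ext3 frees its blocks in small batches instead of one long burst. A sketch against a small stand-in file (path and step size are examples):

```shell
#!/bin/sh
# Sketch: shrink a large file incrementally instead of unlinking it at
# once, so the filesystem frees blocks in small batches.
f=/tmp/ee_backup_stub2                                # example path
dd if=/dev/zero of="$f" bs=1M count=10 2>/dev/null    # stand-in for the real 50GB file
size=$(stat -c %s "$f")
step=$((2 * 1024 * 1024))                             # shrink 2MB per iteration
while [ "$size" -gt 0 ]; do
    size=$((size - step))
    [ "$size" -lt 0 ] && size=0
    truncate -s "$size" "$f"                          # release blocks beyond $size
done
rm "$f"                                               # the file is now empty and cheap to remove
```

A sleep between iterations would spread the I/O out even further if needed.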
If you *really* want to get a feel for how long it will take, I recommend (if possible) testing it out first on a similar (but non-production) instance.
 

Author Closing Comment

by:Systemadministration
ID: 40500010
Great!
Truncating the 56GB file took only a few seconds. After that, the space was free again and I could delete the truncated file without risk.
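The sequence described in this closing comment can be sketched as follows, against a small stand-in file (the path is an example). ": > file" is the portable spelling of the bare "> file" redirection suggested in the accepted answer:

```shell
#!/bin/sh
# Sketch: truncate the file to zero length (frees its data blocks
# almost instantly), then unlink the now-empty file.
f=/tmp/ee_backup_stub3                                # example path
dd if=/dev/zero of="$f" bs=1M count=10 2>/dev/null    # stand-in for the 56GB backup
: > "$f"      # truncate to zero length; same effect as "> $f"
rm "$f"       # removing the empty file causes almost no I/O
```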
