So we have 544 Linux machines.
They all send their audit.log to the syslog server.
Some of the logs are getting quite big and need to be tarred/compressed and stored on the backup server so they get backed up.
So I wrote a script that tells me which folders are over 1 GB:
du -h --max-depth=1 /mnt/rsyslog/ | grep '[0-9]G\>' | sort -hr >/mnt/rsyslog/audit/Over1G.txt
That gives me a listing of the folders that are over 1G and writes it to the text file (example entries below).
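For reference, du -h prints the size, a tab, then the path, so the entries in Over1G.txt look roughly like this (the hostnames here are made up for illustration):

4.2G	/mnt/rsyslog/web01
1.8G	/mnt/rsyslog/db03

Note that du also reports the /mnt/rsyslog/ top level itself, so once the total passes 1G that line ends up in the file as well.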
Currently I have to go into each folder, tar it up and copy it to the backup server, remove the original file, then remove the directory.
Then I repeat those steps for the next machine.
Is there a way to write a script that automates all of that, roughly along the lines of the sketch below?
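Something like this rough sketch is what I have in mind. I'm assuming each machine logs to its own directory under /mnt/rsyslog/ and that scp is how the tarballs would get to the backup server; BACKUP_DEST and the tarball naming are just placeholders, not our real setup:

#!/bin/bash
# Rough sketch only -- BACKUP_DEST is a placeholder, not the real backup server.
BACKUP_DEST="backupuser@backupserver:/backups/rsyslog"
LIST=/mnt/rsyslog/audit/Over1G.txt

# Over1G.txt lines are "SIZE<tab>PATH"; take the path column
awk '{print $2}' "$LIST" | while read -r dir; do
    # skip the /mnt/rsyslog/ total line that du also reports
    [ "$dir" = "/mnt/rsyslog/" ] && continue

    host=$(basename "$dir")
    tarball="/tmp/${host}-$(date +%F).tar.gz"

    # tar and compress the machine's log directory
    tar -czf "$tarball" -C /mnt/rsyslog "$host" || continue

    # copy to the backup server; only clean up if the copy succeeded
    if scp "$tarball" "$BACKUP_DEST/"; then
        rm -f "$tarball"
        rm -rf "$dir"
    fi
done

The if around scp is there so the original logs only get deleted when the copy actually succeeded.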