I have a requirement to compress Oracle database archive log files (named in the format 'arch_%t_%s.arc') and transfer them to another server; both servers run Linux. I am planning to do it like this:
1. Copy the archive log files to another location on the same server using rsync.
2. Compress the log files in that location. A script will run every hour and compress all the uncompressed files using 'tar'.
3. Transfer the compressed log files to the other server using SCP, also every hour.
The big question is: at steps 2 and 3, how will the script identify which files have already been compressed (and copied) and which have not? Please also let me know if there is another way to achieve this.
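One common way to answer the "which files are done?" question is to let the filenames and directories carry the state: compress each file in place with gzip (so a plain `.arc` file is by definition not yet compressed), and move each `.gz` into a "sent" directory only after a successful transfer (so anything still sitting in the staging directory has not been shipped). Below is a minimal sketch of steps 2–3 along those lines; all paths are hypothetical, it seeds one demo file so a dry run has something to process, and it uses `cp` to a local directory as a stand-in for the real `scp`:

```shell
#!/bin/sh
# Hypothetical layout -- adjust to your environment.
STAGE=/tmp/arch_stage      # local copy made by rsync in step 1
SENT="$STAGE/sent"         # compressed files already shipped
mkdir -p "$STAGE" "$SENT"

# Demo only: seed one sample archive log so the run below has
# work to do. Remove this line in production.
printf 'demo' > "$STAGE/arch_1_100.arc"

# Step 2: gzip every .arc not yet compressed. gzip renames
# arch_x_y.arc to arch_x_y.arc.gz and removes the original, so
# any remaining plain .arc file is, by definition, uncompressed.
for f in "$STAGE"/arch_*.arc; do
    [ -e "$f" ] || continue        # glob matched nothing
    gzip "$f"
done

# Step 3: ship each .gz, then move it into $SENT so the next
# hourly run skips it. Replace the 'cp' with something like
#   scp -q "$f" oracle@otherhost:/some/dir/
# for the real transfer; the mv only runs if the copy succeeded,
# so a failed transfer is retried on the next run.
DEST=/tmp/arch_dest            # stand-in for the remote server
mkdir -p "$DEST"
for f in "$STAGE"/arch_*.arc.gz; do
    [ -e "$f" ] || continue
    if cp "$f" "$DEST"/; then
        mv "$f" "$SENT"/
    fi
done
```

A note on the plan itself: 'tar' alone does not compress (you would need `tar -z` or a separate gzip), and compressing per-file with gzip keeps a one-to-one mapping to the original logs, which makes the bookkeeping above trivial. As an alternative to steps 2–3 entirely, a single `rsync -az --remove-source-files` from the staging directory to the remote server transfers only new files, compresses data in transit (though files land uncompressed), and deletes sources after a successful copy.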