Hi Experts,
I have a log file which gets appended to every day by a program.
FTPerror.log
I want to keep 60 days of backups of this log file using a shell script on Linux. There is only one log file; it is not split by date.
The log file's timestamp is updated whenever the program writes to it.
How do I keep 60 days of backups of this file?
A second set of logs lives inside a folder, and new log files are copied into that folder every day. How do I keep 60 days of backups of those as well, using a shell script on Linux?
Thanks in Advance!
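For the plain shell-script route you describe, a minimal sketch might look like the following. The function name and all paths in the example invocation are placeholders, not anything from your system:

```shell
#!/bin/sh
# Sketch: daily backup of a single log, plus 60-day retention for both
# the dated backups and the folder that receives new log files each day.
#
# Usage: backup_ftplog SRC_LOG BACKUP_DIR DAILY_DIR
backup_ftplog() {
    src=$1; backup_dir=$2; daily_dir=$3

    mkdir -p "$backup_dir"

    # Copy today's log with a date stamp so each day is kept separately.
    cp "$src" "$backup_dir/$(basename "$src").$(date +%F)"

    # Delete dated backups older than 60 days (by modification time).
    find "$backup_dir" -type f -mtime +60 -delete

    # Same retention for the folder that collects a new log file every day.
    find "$daily_dir" -type f -mtime +60 -delete
}

# Example invocation -- these paths are assumptions, adjust to your layout:
# backup_ftplog /var/log/FTPerror.log /backup/ftplogs /var/log/ftp_daily
```

Scheduled from cron (e.g. a daily entry such as `0 1 * * * /usr/local/bin/ftplog_backup.sh`), this keeps a rolling 60-day window with no manual cleanup.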
Translating the data into an easily accessible form, i.e. loading the log data into a searchable database, might simplify the reporting as well as the cleanup: deletion becomes a simple query that removes all rows more than 60 days older than the current day.
Another option is to configure logrotate to rotate the log when it reaches a specific size (say 500 MB). Based on your current 60-day volume, set the rotation count to keep 60 files, and enable compression so the rotated files consume less space. Logrotate will then delete the older rotated files automatically.
Check whether the program needs to be sent a signal (HUP, etc.) after rotation, so that it creates a new file and reattaches to it instead of continuing to write to the rotated one.
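Putting the two suggestions above together, a logrotate stanza could look like this. The drop-in path, the log location, and the process name `ftp_uploader` are all assumptions; check whether your program actually honors SIGHUP before relying on the postrotate step:

```
# Assumed location: /etc/logrotate.d/ftperror
/var/log/FTPerror.log {
    size 500M          # rotate once the file reaches 500 MB
    rotate 60          # keep 60 rotated files
    compress           # gzip rotated copies to save space
    missingok
    notifempty
    postrotate
        # Assumed: the writing program reopens its log on SIGHUP.
        # Replace "ftp_uploader" with the real process name.
        pkill -HUP -x ftp_uploader || true
    endscript
}
```

If the program cannot be signaled at all, logrotate's `copytruncate` directive is the usual fallback, at the cost of possibly losing a few lines written during the copy.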