Solved

cleaning log files

Posted on 2006-07-13
8
2,248 Views
Last Modified: 2012-06-27
Hello

My /var is full. I need to list all the files with their sizes so I can clean up unnecessary files manually.

Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/sda3              4032124   4032120         0 100% /var

I need to redirect the whole output to a file (> file.txt) but I don't know which Linux command to use.

Any help ?

Thank you. It is urgent, so I'm giving max points.

thank you
tg
0
Comment
Question by:tgunduz
8 Comments
 
LVL 8

Accepted Solution

by:
Autogard earned 100 total points
ID: 17103563
You mean "ls -lh /var > file.txt"?
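A sketch of what that looks like in practice, with the listing sorted by size so the biggest entries stand out (file.txt is just an example name):

```shell
# List /var with human-readable sizes (-h), sorted by size (-S),
# reversed (-r) so the largest entries come last, saved for review.
ls -lhSr /var > file.txt
tail file.txt   # the biggest entries are at the bottom
```

Note that for directories, ls shows the size of the directory entry itself, not the disk space used beneath it; use du for that (see below).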
0
 
LVL 34

Assisted Solution

by:Duncan Roe
Duncan Roe earned 100 total points
ID: 17103606
This is the command script I use to find large directories:

06:54:24$ cat `type -p vss`
#!/bin/sh
find . -mount -mindepth 1 -maxdepth 1 -type d -exec du -s "{}" \;

It shows the size of each directory immediately below the current one (at a guess, the largest will be log in your case if you run it in /var).
cd to the largest directory and repeat.
When vss no longer shows a directory as large as the one you saw at the previous level, you're in the directory with the big files. Use "ls -lSr" to show the largest files last (the small ones scroll off the top of the screen).
If they're system log files, it's best to truncate them with "cat /dev/null > file" rather than delete them. After you've cleaned up the mess, implement file size limiting in your logrotate config file.
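The truncate-in-place step can be sketched like this (the path is hypothetical; substitute whichever large log you found). Truncating rather than deleting matters because a daemon that still holds the file open keeps the space allocated until it is restarted:

```shell
# Truncate a large log file in place. If syslogd still has the file open,
# "rm" would not free the space until the daemon reopens its logs;
# truncation frees the space immediately.
cat /dev/null > /var/log/messages   # hypothetical path - use your large log
```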
0
 
LVL 4

Assisted Solution

by:bytta
bytta earned 100 total points
ID: 17108566
A simpler way to do the same (it works on both files and directories, except hidden ones such as /var/.name):

du -sc /var/* > file.txt
sort -nr file.txt | head -20   # show the 20 largest entries

cd to that large directory and repeat.
-c adds a "total" line at the end of the du output, which helps when comparing against subdirectories.

For ALL files and folders (errors ignored):
cd /var
du -sc * .[^.]* 2>/dev/null
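The two steps above can also be combined into a single pipeline with no intermediate file (same idea, run against /var or whichever directory you are drilling into):

```shell
# Summarise each entry under /var, sort by size descending, show the top 20.
# 2>/dev/null discards "Permission denied" noise from du.
du -sc /var/* 2>/dev/null | sort -nr | head -20
```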
0

 
LVL 16

Assisted Solution

by:xDamox
xDamox earned 100 total points
ID: 17110092
Hi,

For cleaning your logs I would recommend editing the /etc/logrotate.conf file, where you can have your logs compressed and rotated daily, weekly, etc.
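A minimal sketch of what such an entry might look like (the path and limits are examples, not taken from the question; check your distro's logrotate.conf for the real layout):

```
# Keep 4 compressed copies of /var/log/messages, rotating whenever
# the file grows past 100 MB (size takes precedence over time-based
# schedules when both are given).
/var/log/messages {
    rotate 4
    compress
    size 100M
    missingok
    notifempty
}
```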
0
 
LVL 40

Assisted Solution

by:noci
noci earned 100 total points
ID: 17121742
If you don't have logrotate, you can probably find it in your distro's packages. There's a good description of it at:

http://kavlon.org/index.php/logrotate
0

