Sridhar Cherukuri

asked:

squidGuard help

Hi, I have a transparent Squid running smoothly. I am trying to use squidGuard with Squid. I followed the instructions from squidguard.org and everything went well, but the sites that should be blocked are still opening.

When I tried a dry run, I got the message below:
2008-11-10 22:15:57 [31912] New setting: dbhome: /usr/local/squidGuard/db
2008-11-10 22:15:57 [31912] New setting: logdir: /usr/local/squidGuard/log
2008-11-10 22:15:57 [31912] init domainlist /usr/local/squidGuard/db/porn/domains
2008-11-10 22:15:57 [31912] loading dbfile /usr/local/squidGuard/db/porn/domains.db
2008-11-10 22:15:57 [31912] init urllist /usr/local/squidGuard/db/porn/urls
2008-11-10 22:15:57 [31912] loading dbfile /usr/local/squidGuard/db/porn/urls.db
2008-11-10 22:15:57 [31912] squidGuard 1.3 started (1226344557.711)
2008-11-10 22:15:57 [31912] squidGuard ready for requests (1226344557.714)
2008-11-10 22:15:57 [31912] source not found
2008-11-10 22:15:57 [31912] no ACL matching source, using default

2008-11-10 22:15:57 [31912] squidGuard stopped (1226344557.717)


Is there anything wrong with these lines?
2008-11-10 22:15:57 [31912] source not found
2008-11-10 22:15:57 [31912] no ACL matching source, using default

What could be the problem? Please help.

woolmilkporc

Hi,
how did you install squidGuard? I have heard that building squidGuard from source, rather than installing it from an RPM, can help.

wmp
 
Did you configure any of the ACLs to match your internal IP addresses? You need to specify where your lists are.

Below is part of one of my squidGuard.conf files. I added several comments to explain what is needed.

Hope it helps ;-)

 
#
# CONFIG FILE FOR SQUIDGUARD
#
# See http://www.squidguard.org/config/ for more examples
#
 
dbhome /var/lib/squidGuard/blacklists
logdir /var/log/squidGuard
 
# SOURCE IPs are needed. You can add them using a file with one IP per row. If your Squid is not
#      properly configured, you may need to specify sourceip/netmask, as in 192.168.10.219/255.255.255.255
src permit_level1 {
   iplist      /etc/trufirewall/forward_permit_level_1
}
 
src permit_level2 {
   iplist      /etc/trufirewall/forward_permit_level_2
}
 
# Now Destinations that will be blocked. add as many as you want
dest ads {
    log        ads
    domainlist    ads/domains
    urllist        ads/urls
}
 
dest aggressive {
    log        aggressive
    domainlist    aggressive/domains
    urllist        aggressive/urls
}
 
# At the end, the rules. WHO can access WHAT? here:
acl {
    permit_level1 {
        pass all
    }
    permit_level2 {
      pass local-ok !local-block !aggressive !drugs !gambling !hacking !porn !proxy !violence !warez all
      redirect 302:http://ip.address.internal.server/cgi-bin/squidGuard?clientaddr=%a&clientname=%n&clientident=%i&srcclass=%s&targetgroup=%t&url=%u
    }
}

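A note on the dry run, in case it helps others: the "source not found" message usually means the client address in the test line did not match any src block, so squidGuard falls back to the default ACL. The test line must use Squid's redirector request format. A minimal sketch, with the IP and conf path taken from this thread (adjust to your setup):

```shell
# Build a request line in the format Squid hands to redirectors:
#   URL client_ip/fqdn ident method
# CLIENT_IP must match an "ip" entry in one of your src blocks, otherwise
# squidGuard logs "source not found" and applies the default ACL.
CLIENT_IP="192.168.1.111"    # the foo-clients address from this thread
URL="http://www.example.com"
REQ="$URL $CLIENT_IP/- - GET"
echo "$REQ"

# With squidGuard installed you would then run:
#   echo "$REQ" | squidGuard -c /usr/local/squidGuard/squidGuard.conf -d
```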

Sridhar Cherukuri (Asker)

Hi Redimido, thanks for the suggestion. I used the example file from the squidguard.org site and did not get any errors while compiling. My file is below; I don't understand what "dmz" is for in the config file. After compiling I ran a dry run:
echo "http://www.sex.com 192.168.1.111/ - - GET" | squidGuard -c /usr/local/squidGuard/squidGuard.conf -d

Result

2008-11-12 17:32:35 [4966] New setting: dbhome: /usr/local/squidGuard/db
2008-11-12 17:32:35 [4966] New setting: logdir: /usr/local/squidGuard/log
2008-11-12 17:32:35 [4966] Added User: root
2008-11-12 17:32:35 [4966] Added User: foo
2008-11-12 17:32:35 [4966] init domainlist /usr/local/squidGuard/db/porn/domains
2008-11-12 17:32:35 [4966] loading dbfile /usr/local/squidGuard/db/porn/domains.db
2008-11-12 17:32:35 [4966] init urllist /usr/local/squidGuard/db/porn/urls
2008-11-12 17:32:35 [4966] loading dbfile /usr/local/squidGuard/db/porn/urls.db
2008-11-12 17:32:35 [4966] squidGuard 1.3 started (1226500355.464)
2008-11-12 17:32:35 [4966] Info: recalculating alarm in 10645 seconds
2008-11-12 17:32:35 [4966] squidGuard ready for requests (1226500355.485)
http://www.google.com 192.168.1.111/- - -


echo "http://www.example.com 192.168.1.111/ - - GET" | squidGuard -c /usr/local/squidGuard/squidGuard.conf -d

Result
2008-11-12 17:38:01 [4993] New setting: dbhome: /usr/local/squidGuard/db
2008-11-12 17:38:01 [4993] New setting: logdir: /usr/local/squidGuard/log
2008-11-12 17:38:01 [4993] Added User: root
2008-11-12 17:38:01 [4993] Added User: foo
2008-11-12 17:38:01 [4993] init domainlist /usr/local/squidGuard/db/porn/domains
2008-11-12 17:38:01 [4993] loading dbfile /usr/local/squidGuard/db/porn/domains.db
2008-11-12 17:38:01 [4993] init urllist /usr/local/squidGuard/db/porn/urls
2008-11-12 17:38:01 [4993] loading dbfile /usr/local/squidGuard/db/porn/urls.db
2008-11-12 17:38:01 [4993] squidGuard 1.3 started (1226500681.269)
2008-11-12 17:38:01 [4993] Info: recalculating alarm in 10319 seconds
2008-11-12 17:38:01 [4993] squidGuard ready for requests (1226500681.271)

2008-11-12 17:38:01 [4993] squidGuard stopped (1226500681.332)
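One thing worth noting about reading this output, assuming squidGuard 1.3's redirector behavior: squidGuard answers each request line on stdout with the rewritten line when a redirect rule matched, and with an empty line when the request passes through untouched. So the google.com line after the first test actually shows the porn redirect firing, while the blank output of the second test means www.example.com passed. A small sketch (`interpret` is an illustrative helper, not a squidGuard tool):

```shell
# squidGuard, acting as a Squid redirector, answers each request line with
# either the rewritten request line (redirect matched) or an empty line
# (request passes through unchanged).
interpret() {
  if [ -n "$1" ]; then
    echo "REDIRECTED to: ${1%% *}"
  else
    echo "PASSED (no redirect)"
  fi
}

interpret "http://www.google.com 192.168.1.111/- - -"   # first dry run above
interpret ""                                            # second dry run above
```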

I have cross-checked the redirect_program setting as well and everything seems fine, but when I try to access the blocked sites nothing is blocked. Can you have a look at the config file and advise me, please?

#
# CONFIG FILE FOR SQUIDGUARD
#
 
dbhome /usr/local/squidGuard/db
logdir /usr/local/squidGuard/log
 
#
# TIME RULES:
# abbrev for weekdays:
# s = sun, m = mon, t = tue, w = wed, h = thu, f = fri, a = sat
 
time workhours {
        weekly mtwhf 08:00 - 20:30
        date *-*-01  08:00 - 20:30
}
 
#
# REWRITE RULES:
#
 
rew dmz {
        s@://admin/@://admin.foo.bar.de/@i
        s@://foo.bar.de/@://www.foo.bar.de/@i
}
 
#
# SOURCE ADDRESSES:
#
 
src admin {
        ip              1.2.3.4 1.2.3.5
        user            root foo
        within          workhours
}
 
src foo-clients {
        ip              192.168.1.111
}
 
 
#
# DESTINATION CLASSES:
#
 
dest porn {
        domainlist      porn/domains
        urllist         porn/urls
        redirect        http://www.google.com
}
 
 
acl {
        admin {
                pass     any
        }
 
        foo-clients within workhours {
                pass     !porn any
        } else {
                pass any
        }
 
 
        default {
                pass     none
                rewrite  dmz
                redirect http://www.yahoo.com
        }
}                                                 

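Since the logs above show the db files loading, another thing worth ruling out when nothing gets blocked is list files that exist in the conf but are missing or unreadable under dbhome. A quick sketch (`check_lists` is an illustrative helper run against a throwaway directory; point it at your real dbhome):

```shell
# Sanity-check that the list files referenced in squidGuard.conf exist and
# are readable under dbhome; unreadable lists are a common reason nothing
# gets blocked.
check_lists() {
  dbhome=$1; shift
  for f in "$@"; do
    if [ -r "$dbhome/$f" ]; then
      echo "ok: $f"
    else
      echo "MISSING: $f"
    fi
  done
}

# Self-contained demo against a throwaway directory:
tmp=$(mktemp -d)
mkdir -p "$tmp/porn"
touch "$tmp/porn/domains" "$tmp/porn/urls"
check_lists "$tmp" porn/domains porn/urls

# Against the conf in this thread you would run:
#   check_lists /usr/local/squidGuard/db porn/domains porn/urls
```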

Hi

I would delete any reference to dmz, since it is just a rewrite rule.

As you may know, not all sites are included in the squidGuard lists, so I usually add local-block and local-permit destinations to the rule, as in the config below.


One more thought: change the "else" part to pass none, to identify whether the times you set are part of the problem.
I ran into a problem some time ago where the times were interpreted as UTC, so I had to adjust the workhours rule accordingly.
#
# CONFIG FILE FOR SQUIDGUARD
#
dbhome /usr/local/squidGuard/db
logdir /usr/local/squidGuard/log
 
#
# TIME RULES:
# abbrev for weekdays:
# s = sun, m = mon, t = tue, w = wed, h = thu, f = fri, a = sat
time workhours {
        weekly mtwhf 08:00 - 20:30
        date *-*-01  08:00 - 20:30
}
 
 
#
# SOURCE ADDRESSES:
# 
src admin {
        ip              1.2.3.4 1.2.3.5
        user            root foo
        within          workhours
}
 
src foo-clients {
        ip              192.168.1.111
}
 
#
# DESTINATION CLASSES:
# 
dest porn {
        domainlist      porn/domains
        urllist         porn/urls
}
dest local-block {
        domainlist      local-block/domains
        urllist         local-block/urls
}
dest local-permit {
        domainlist      local-permit/domains
        urllist         local-permit/urls
}
 
acl {
        admin {
                pass     any
        }
 
        foo-clients within workhours {
                pass local-permit !local-block !porn any
                redirect http://www.yahoo.com
        } else {
                pass none
                redirect http://www.yahoo.com
        }
 
        default {
                pass     none
                redirect http://www.yahoo.com
        }
}


I have sorted out the issue already: I did not have write permission on the log folder, hence the error.

I used tail -f /var/log/squid/cache.log, restarted Squid, and then saw what the issue was.
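For anyone hitting the same symptom: squidGuard runs as Squid's effective user, and if that user cannot write to logdir, squidGuard typically drops into its emergency mode and passes everything through, which would explain why nothing was blocked. A small sketch of the check (`check_writable` is an illustrative helper; the real log path from this thread is /usr/local/squidGuard/log, and the owning user varies by distribution, e.g. squid or proxy):

```shell
# Report whether the current user can write to a directory -- the condition
# squidGuard needs for its logdir.
check_writable() {
  if [ -w "$1" ]; then
    echo "writable: $1"
  else
    echo "NOT writable: $1"
  fi
}

# Self-contained demo against a throwaway directory:
d=$(mktemp -d)
chmod a-w "$d"
check_writable "$d"   # NOT writable (unless running as root)
chmod u+w "$d"
check_writable "$d"   # writable again

# In practice, check the real logdir and fix ownership for Squid's
# effective user (see cache_effective_user in squid.conf), e.g.:
#   check_writable /usr/local/squidGuard/log
#   chown -R <squid-user> /usr/local/squidGuard/log
```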
Great. I had not considered that Squid itself might not have been fully set up.
ASKER CERTIFIED SOLUTION
Computer101
