bryanlloydharris

asked on

squid not working

I am trying to configure squid as an accelerator (a reverse proxy), but it's not working.  I have a web server and a squid server set up like this:

10.0.0.2         web1 (the apache)
10.0.0.3         s2 (the squid)

I can visit http://web1/ in my browser, but when I visit http://s2/ it says this:

While trying to process the request:

GET / HTTP/1.1
Host: s2
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.7) Gecko/20060909 Firefox/1.5.0.7
Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Cache-Control: max-age=0

The following error was encountered:

    * Invalid Request

Here is my config.

visible_hostname s2

http_port 80
defaultsite=web1

hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log /usr/local/squid/var/logs/access.log squid
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access allow CONNECT localhost
http_access allow all
http_reply_access allow all
icp_access allow all
cache_effective_user nobody
cache_effective_group nobody
cache_dir aufs /usr/local/squid/var/cache1 100 16 256
cache_dir aufs /usr/local/squid/var/cache2 800 16 256

Can someone point me in the right direction for setting up squid as a first-time user?  I am having trouble with the new syntax introduced in squid 2.6.  Should I just use 2.4 instead?
ravenpl

A small guide can be found at http://www.deckle.co.za/squid-users-guide/Accelerator_Mode
> 10.0.0.2         web1 (the apache)
> 10.0.0.3         s2 (the squid)
> I can visit http://web1/ in my browser, but when I visit http://s2/ it says this:
Why do you want to visit http://s2/ ?  Is the squid itself serving any webpages?  No, so it shows you an error.
The question is whether visiting http://web1/ goes through the squid or not.
bryanlloydharris

ASKER

I want to set up squid as an accelerator, not a regular proxy.  That's the reason I go to s2 instead of web1.

An accelerator helps a slow web server by caching its pages.  If it doesn't have a page cached, it fetches it from the real web server.  This is why I go to http://s2/ instead of http://web1/.
But then again, I could be doing this wrong..
Have you read the guide?
Squid in accelerator mode does not serve any webpages itself anyway.  It therefore always acts as a transparent proxy (it mimics the server, but requires a valid URL).
So it usually works this way: the client requests http://web1/ but gets redirected to squid.  Squid serves as much as it can from cache, then connects to the real server and caches the answer (if possible).  If squid works as an accelerator for a single server, you may put that server's name in its config; otherwise (many servers) squid uses the Host: value from the HTTP request.
So you see, you can't put http://s2/ in your browser, as that is not a valid site name.
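For reference, the squid 2.6 accelerator setup described above is usually expressed with the `accel` and `defaultsite=` options on the `http_port` line itself (not on a separate line, as in the config posted in the question), plus a `cache_peer ... originserver` line pointing at the backend.  A minimal sketch, assuming the web1/s2 addresses from the question:

```
# squid 2.6 accelerator sketch (addresses taken from the question)
# accel + defaultsite= must be options on the http_port line itself
http_port 80 accel defaultsite=web1

# forward cache misses to the real origin server (the apache on web1)
cache_peer 10.0.0.2 parent 80 0 no-query originserver
```

With this in place, requests arriving at s2 are answered from cache when possible and otherwise fetched from 10.0.0.2.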
"So usually it works that way: client requests http://web1/ but gets redirected to squid."

Are you saying the client connects to web1 but web1 forwards the packets to the squid?  Wouldn't I need to set up iptables to forward the packets with a firewall?
Ah yes, I've read the guide, but I think I'm missing something.  I thought it worked like this:

client -> squid -> webserver

But are you saying it works like the following?

client -> webserver -> squid
    '~----------------------> squid
In fact it works: client -> squid -> webserver
But it's transparent to the client.  The client thinks there is no squid on the way - that's why it puts http://web1/ in its browser.

> Wouldn't I need to set up iptables to forward the packets with a firewall?
Yes!
What's more, the usual configuration is:
internet ---- firewall with a global IP running squid that intercepts any HTTP requests ---- farm of servers on local IPs
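The interception described above is typically done with an iptables REDIRECT rule on the firewall box, sending incoming HTTP traffic to the local squid listener.  A sketch, assuming squid listens on its default port 3128 and traffic arrives on eth0 (both are assumptions, not from the question):

```
# on the firewall: redirect inbound HTTP to the local squid listener
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
```

Note this applies to the interception setup ravenpl describes; in a plain accelerator setup where clients are pointed directly at the squid box, no iptables rule is needed.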
"In fact it works: client -> squid -> webserver"
That's how I'm trying it, I think.  But I guess I need to rename the entries in /etc/hosts so it's web1 instead of s2?
ASKER CERTIFIED SOLUTION
ravenpl
Maybe that's why it's not working...