StevePimer asked:

How do I access a router behind a firewall for remote management?

I have a client site where we are setting up a VPN tunnel using a Linksys BEFVP41 router directly off the cable modem. Behind that I have a Linksys WRT54G router set up to allow wireless connectivity. I can perform remote management functions on the BEFVP41 router by entering the external IP address of the router and :8080 (such as http://x.x.x.x:8080). I then get the web-based management screen where I enter my user name and password. All works well.

But I also need to perform remote management on the WRT54G wireless router behind the BEFVP41 VPN router. I assumed that if I changed the remote management port on the WRT54G router to 8081, I could access it by entering the external IP address of the BEFVP41 router (such as http://x.x.x.x:8081). No success. I even tried adding a forwarding entry to the BEFVP41 router's table to route any port 8081 traffic to the internal LAN address of the WRT54G router (192.168.1.2). The BEFVP41 router uses the 192.168.1.1 address. Again, no success.
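
For anyone who wants to check from outside whether those forwarded ports are actually answering, a quick connect test works. This is only a minimal Python sketch, assuming it runs from a machine outside the site; the external address below is a placeholder, not the real one.

```python
# Minimal sketch: check whether the router's remote-management ports answer.
# EXTERNAL_IP is a placeholder (TEST-NET address); substitute the site's real address.
import socket

EXTERNAL_IP = "203.0.113.10"
PORTS = [8080, 8081]           # BEFVP41 management, intended WRT54G forward

for port in PORTS:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(5)
    try:
        sock.connect((EXTERNAL_IP, port))
        print(f"Port {port}: open (something is answering)")
    except OSError as exc:
        print(f"Port {port}: no answer ({exc})")
    finally:
        sock.close()
```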

Can anyone tell me how to access the WRT54G router for remote management functions in this configuration?  Thanks.
GinEric

Basically, you've made both routers the broadcast source for one gateway. Broadcast is on 192.168.1.255, and you've given that to both routers by assigning them 192.168.1.1 and 192.168.1.2.

That's a mistake.

First, make the second router 192.168.2.1; note where the "2" is.

Now you can access the second router, provided you specify the proper gateway on one of your machines. The first router's gateway is 192.168.1.1 and the second router's gateway is 192.168.2.1.

With two machines, preferably, assign two different IPs: make machine one 192.168.1.x, where x is something like 47 or 100, and assign the second machine 192.168.2.x, again where x is one of those numbers. This depends on the initial settings of the router; some start at 47 [usually the 192.168.2.1 router] while others start at 100 [usually the 192.168.1.1 router]. There is also another range for wireless; you'll have to find out from the specific model what the range is. When both routers can be logged into, then you begin the real work. The problem here is to get the two gateways to communicate. Most home routers have real problems in Class C operation, that is, when the third octet of the IP is different, as in 192.168.1.1 and 192.168.2.1; the "1" and "2" in that position are the problem for home routers.
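
To make the two ranges concrete, here is a small illustration using Python's standard ipaddress module; the host numbers are just examples, and a /24 prefix is the same thing as a 255.255.255.0 netmask.

```python
# Sketch of the two separate /24 ranges, their gateways and broadcast addresses.
import ipaddress

router1_lan = ipaddress.ip_network("192.168.1.0/24")   # /24 == netmask 255.255.255.0
router2_lan = ipaddress.ip_network("192.168.2.0/24")

print("Router 1 gateway:", router1_lan[1])                    # 192.168.1.1
print("Router 1 broadcast:", router1_lan.broadcast_address)   # 192.168.1.255
print("Router 2 gateway:", router2_lan[1])                    # 192.168.2.1
print("Router 2 broadcast:", router2_lan.broadcast_address)   # 192.168.2.255

# Example client addresses, one per segment (47 and 100 are just example hosts):
client_a = ipaddress.ip_address("192.168.1.100")
client_b = ipaddress.ip_address("192.168.2.47")
print(client_a, "in router 1 range:", client_a in router1_lan)  # True
print(client_b, "in router 2 range:", client_b in router2_lan)  # True
```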

On our network, we have assigned 192.168.1.106 to an XP Home machine with gateway 192.168.2.101, which is a Linux Server's Ethernet card. Basically, the XP Home box sees the Linux box as the gateway, and it is connected to Router2, whose WAN address is 192.168.2.101 [gateway 192.168.1.1] and whose LAN IP is 192.168.1.1, while the other Router3 has an Internet Connection static IP of 192.168.1.1 [gateway 192.168.1.1]. As you can see, it's not a simple setup to get XP Home to work. The Linux Server is plugged into Router2 on one NIC and the Internet on the other NIC. The Linux Server acts as both the Server and the firewall between the Intranet and the Internet.

Some will suggest DMZ and other methods; they are not secure.

To route between two routers and the Internet/Intranet, you must make the routers unique in their broadcast range.

The second problem is getting everybody to talk to each other. Currently, we use Apache for this, as it works better than the other methods such as Samba and various forms of login. Users simply go to either the Internet Web Site or the Intranet Web Site. We don't use port 8080 at all on the Internet server, as it is more of an interference than a safeguard. Port 80 is fine for the Apache Web Server on the Linux Web Server. For the internal Web Servers, we have one on the Linux Server, using the 192.168.2.101 NIC, and one on the XP Home box. You can use port 8080 on the XP box, but there really is no point if you're behind a solid Server like Linux and routers. Apache will run on any XP Home box.
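
As an aside, if you only want to prove the internal-web-server idea before setting up Apache, Python's built-in http.server module can serve a directory on port 8080. This is a throwaway test sketch, not a substitute for a properly configured Apache.

```python
# Minimal stand-in for an internal web server on port 8080 (testing only).
# Serves the current directory to the LAN; stop with Ctrl-C.
from http.server import HTTPServer, SimpleHTTPRequestHandler

ADDRESS = ("0.0.0.0", 8080)   # listen on all interfaces, port 8080

httpd = HTTPServer(ADDRESS, SimpleHTTPRequestHandler)
print(f"Serving on http://{ADDRESS[0]}:{ADDRESS[1]} - press Ctrl-C to stop")
try:
    httpd.serve_forever()
except KeyboardInterrupt:
    httpd.server_close()
```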

There's really no need for all the VPN stuff.  It overcomplicates things and has no more security than normal network operations.

Your real problem is the gateway and broadcast addresses; solve that problem first. You might also want to download and install Ethereal on at least two machines to troubleshoot TCP/IP connections between machines and routers. It's free. Install it on the two machines you're first going to test, 192.168.1.100 and 192.168.2.47.

Read and learn how to use it.
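
If you'd rather script the capture on the two test machines instead of (or alongside) using the Ethereal GUI, something like the third-party scapy package can do a rough equivalent. A minimal sketch, assuming scapy is installed and the script runs with root/administrator privileges; the port filter matches the management traffic discussed above.

```python
# Rough scripted equivalent of watching the management traffic in Ethereal.
# Requires the third-party "scapy" package and root/administrator privileges.
from scapy.all import sniff

def show(packet):
    # Print a one-line summary per captured packet (source, destination, flags).
    print(packet.summary())

# Capture 20 packets to or from the remote-management ports, then stop.
sniff(filter="tcp port 8080 or tcp port 8081", prn=show, count=20)
```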

Most routers are accessible via a simple Web interface, which is to say, a simple Apache interface.  Most of the web is run by Apache.  You can download that too and install it to see why.

StevePimer (Asker)

Thanks for the quick comeback. Our plan is to use the VPN router to set up a tunnel to two other sites.

GinEric

PPS: One last note:

In a professionally secured system, you don't need Firewall software. The Server is the firewall; the router should never be used as a firewall.

I have done everything to allow attacks on our servers and they haven't failed yet. We do run a good Antivirus on the XP machines and firewall software on each machine; that is the proper use of firewall software, only because it is nearly always Microsoft that is vulnerable.

We set aside a separate Mail Server with Antivirus to filter email; email is forwarded directly to that machine, where virus detection takes place.

We do have what may be considered a Virtual Private Network, but not of the usual form.  We use Split Brain and Chrooted DNS to isolate the Internet from the Intranet.

This is much more effective than firewalls and firewall software.

Firewall software is for client machines, not a Server.

These programs have never failed us or given a false positive:

http://www.guardianprograms.com/

We think they are the best there is.

We've been around a long time and our Private Network is an assigned Enterprise Number Network.  You can get one of these if you look hard enough for it at IANA.  This too is used for security within the network.

ASKER CERTIFIED SOLUTION
this213

I should point out, for your actual netmask, you would enter 255.255.255.0 on those routers.

GinEric

All taken as constructive. No problem. But I am allowed to disagree. ZoneAlarm has a lot of false positives; depending on a name brand is often the pride that goeth before the fall. ZoneAlarm itself, as I recall, has been virused. A Split Brain, chrooted jail sets permissions for the DNS so that it cannot be hacked without physical site access. Speaking of which, it would take five minutes to hack the "bridging firewall," and it has been done already. Never say "never."

Guardian is now part of Broderbund, and older than ZoneAlarm. Some call it AVK.

There are basically two approaches to security, government and corporate.  The government one works better.  See the NSA site for guidelines on security, if you really want a secured system.

Oddly enough, your answer looks just like mine, using different octets in the third position, and the use of the same subnet range for DHCP questions the need for the second router. The second router, in the same range, means the internal and external routers are not isolated, as they would be with two different DHCP blocks. Most corporations I've worked for, like AT&T, Merrill Lynch, and others, use an internal Web Server for communication because it restricts access to only the Web Server of the Server. They do this at over 1,000,000 offices, with over 30,000,000 clients worldwide. Works for them!

I don't think Cisco sees the job of their routers as firewalling; rather, it is to route packets on the network and Internet.

Encryption eats up both bandwidth and CPU time; therefore, it should be limited to proprietary data, for which you don't have to encrypt every packet. If you use encryption with overkill zealotry, eventually the network is going to crash very, very, very hard, and masses of encrypted data will be lost. It only takes one instance, and there's no law that says it will only come from the outside. Sometimes it's better to log and surveil your enemies than to bury yourself in the sand.

That would be fear of sniffers; as Franklin Delano Roosevelt said, "There is nothing to fear except fear itself." Too much fear paralyzes action when action is needed because, having never had any action, the victim does not know how to act or react.

Most of the security guys in security magazines and in industry tend to ignore things like viruses once they have a patch for them. I recently published a memo for Eric Howes arguing that this approach was flat-out wrong and that detecting, educating, and eliminating the threat was the correct path, not burying heads in the sand.

I suggested a popup be served to the offending client telling them they had just sent a virus to the Web Server, and who they could contact at the offended server to get help in fixing their computer. That's part of education; detection was by the server, and eliminating is up to the informed user.

A corporation's network does not operate in a vacuum, which basically means there really is no such thing as a Virtual Private Network; that's why it's "Virtual," or almost "true." Corporations need both customers and employees to operate on the web; it IS where the business is. It's no longer just at HQ or the office; in fact, even employees won't let it be, because they want to use the web for its ease of use, its efficiency, and its functionality in all situations.

I cited Apache because it is the leading Web Server, and will continue to be so. With HTML, the entire concept of networking is changing radically, as httpd and HTML take over more and more of the tasks of the network and operating system. The day may soon come when they are both incorporated into a new Operating System with all kinds of built-in database functionality, security, etc. It's not "just a web server."

Firewall software on a server is self-defeating; once breached, the flood cannot be stopped. Behind all security and firewalls there should be human intelligence. And that should be the real firewall: the human master of both the hardware and the software. Machines can't think and they never will. Some things just have to be accepted for what they are, the truth. All of the software giants have been virused for the very reason that they depended on software rather than their own intelligence. And you can't gather intelligence while walled off in a welded-shut box.

I did say every Windows box should have both AV and Firewall software installed, but I also said that the server should not be bogged down with AV and Firewall software clogging its CPU time. Instead, a human brain that knows how to configure it properly and then monitor it is still better than any AV or Firewall software on a server. If you don't think that is true, you've abdicated to the superior intelligence of the machine, which it does not possess.

The DMZ is used in most small routers these days as a radio check box item, and all the experts agree that it provides a false sense of security. The term, derived from the De-Militarized Zones of various wars, is a good example, as those zones also provided a false sense of security; we had bases way over on the other side of the DMZ, hence it was a facade.

There are always at least two ways to do something.  You should never consider it an insult to read or hear the other way from someone else.  You shouldn't consider such sage advice as flaming, I know I don't.  Then again, you don't know if someone needs to "learn how to use" something and it normally sounds a little abrasive when you tell someone to do that; they may have a doctorate in the subject, in which case the remark would be quickly dismissed as an "uninformed" remark.

I came here to answer a few questions to test the waters.  To see if it would be worth my time to answer some questions, and, to continue working on my book, which is filled with examples from these conversations; from a collegium of names that you'd recognise, if you've been in the computer industry and know a lot about it, by education, experience, or both.  As far as I'm concerned, I'm here to gather information myself.  I don't believe in the existence of either flamers or trolls; those are stories we tell to children in Faerie Tales, like Grimm's.  I hold that a conversation can be both intelligent and diverse, even in disagreement, without losing one's sense of manhood.  It is, after all, up to the men to teach the boys.

this213

I must say, quite well put.

Split brain chrooted DNS simply separates external DNS queries from internal DNS queries; being chrooted means the DNS daemon is locked into its own directory jail and doesn't run as root (which it never should). That doesn't say anything about the rest of the system and its security.
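
To illustrate what split-brain looks like from a client's point of view, the same name can resolve differently depending on which resolver you ask. A minimal sketch using the third-party dnspython package; the resolver addresses and hostname are placeholders, not anyone's actual setup.

```python
# Illustration of split-brain DNS: the same name answered by two different views.
# Requires the third-party "dnspython" package; addresses and hostname are placeholders.
import dns.resolver

INTERNAL_DNS = "192.168.1.1"     # placeholder: resolver serving the internal view
EXTERNAL_DNS = "198.51.100.53"   # placeholder: resolver serving the public view
NAME = "www.example.com"         # placeholder hostname

for label, server in [("internal", INTERNAL_DNS), ("external", EXTERNAL_DNS)]:
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [server]
    try:
        answers = resolver.resolve(NAME, "A")
        print(label, "view:", [rr.address for rr in answers])
    except Exception as exc:
        print(label, "view: lookup failed:", exc)
```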

Also, a properly configured bridging firewall has no IP address (on either side) and passes everything it gets to somewhere else. Because of this, it's a completely transparent device on the network. The systems behind it can't see it, nor can the systems in front of it. If someone were to crack one of these, they'd first have to do it by the MAC address of the NIC they were going to try to get in on (which would be a feat in itself to obtain), then they'd have to get through the firewall running on it to get it to actually process packets instead of passing or dumping them. In other words, one would have to be the cream of the crop to get into one of these, and then it would still take quite a bit longer than 5 minutes. Frankly, if you can show me documentation of where someone has done this, I'd really like to see it (I'm not being sarcastic, I really would like to see it). Note that I'm not talking about a bridge device here, I'm talking about a bridging firewall - a server (usually) running some flavor of *nix (usually - well, always that I've seen) that's configured as a bridge and a firewall.
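
For what it's worth, the core of such a transparent bridge is just two NICs enslaved to a bridge device that never receives an IP address. A minimal sketch of that one step, using standard iproute2 commands driven from Python; the interface names are placeholders, it must run as root on a Linux host, and it is not anyone's actual configuration (the firewall rules themselves come on top of this).

```python
# Sketch: build an IP-less bridge between two NICs, the basis of a bridging firewall.
# Interface names are placeholders; must run as root on a Linux host with iproute2.
import subprocess

OUTSIDE_NIC = "eth0"   # placeholder: faces the Internet/router
INSIDE_NIC = "eth1"    # placeholder: faces the protected segment

commands = [
    ["ip", "link", "add", "name", "br0", "type", "bridge"],
    ["ip", "link", "set", OUTSIDE_NIC, "master", "br0"],
    ["ip", "link", "set", INSIDE_NIC, "master", "br0"],
    ["ip", "link", "set", OUTSIDE_NIC, "up"],
    ["ip", "link", "set", INSIDE_NIC, "up"],
    ["ip", "link", "set", "br0", "up"],
    # Note: no "ip addr add" for br0 - the bridge itself stays unaddressed,
    # which is what keeps it invisible to hosts on either side.
]

for cmd in commands:
    subprocess.run(cmd, check=True)
```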

I know of both Guardian and Broderbund. I also know that any firewall software running on Windows is vulnerable simply because it runs on Windows. But then, I see all firewalls as being vulnerable; it's simply a question of how vulnerable and how easy to exploit. I mention ZoneAlarm because it's (very) user friendly (one doesn't need a CISSP to use it) and it works as well as or better than any other software firewall. Plus, every exploit I know of for it can be fixed by upgrading (or patching).

As to there being two different approaches to security - there are actually hundreds, if not thousands, of approaches. The internet is still quite new and everyone is still figuring things out: this is why we have things like spam and hackers to even worry about. The NSA's view on security is actually such that if you really want your data protected, unplug your machine from the network - in fact, they have people who do nothing but manually enter data from one computer (network) to another. This is known in the industry as "Air Separation" - though the term isn't as meaningful with wireless networking now available. Most corporations are now starting to follow the NSA guidelines for security - due mostly to Sarbanes-Oxley.

The isolation of networks was on purpose. By far the best approach to security, virus control, and incident control is and always has been to segment your network as much as possible. This gives you better control over what goes in and comes out of each network segment. Web servers are for one-way communication used by (usually) division heads, PR departments and so on. Most corporations I've worked for (I won't get into slinging names) demand some level of file sharing. Some of these do indeed use Samba; most use AD services. With the exception of Unix NFS, that's about it for file sharing. File sharing is different from the one-way communication offered by web servers and critical to the productivity of most corporate computing environments. I will state here also that if the second router were to be on the same network, then you don't need a router there, you need a switch. If the router is a requirement (for wireless access, say) then it should either be configured as an access point (not a gateway), or it should remain isolated from the wired network for security purposes (which is what I would do anyway).

I think Cisco does want their routers to be seen as firewalls; this is why they built the PIX line. Cisco has a powerful vision for network appliances to handle everything from firewalling, routing, and NAT to virus protection, all from one device, as do other network device manufacturers. However, that wasn't the point I was trying to make there. What I was saying was that there should be *something* in front of the servers as a first line of defense, rather than having a server itself doing all of the firewalling for the entire network.

Encryption eats up CPU time on the encrypting and decrypting systems - the in-between systems don't do anything but pass packets. Encryption does eat up more bandwidth; that's why you have to decide if the data you're protecting is worth it or not. If that data falling into the wrong hands could cost you a fortune, you'd be an idiot not to encrypt it. If it doesn't really matter if that data escapes, why bother encrypting it? I've never had a network crash because it was passing encrypted data - the network as a whole simply sees packets; it doesn't care what's in them. I never said to stick your head (or your computer) in the sand, but if you're running any sizable network, you'd better be aware of the threats that face it, and with new regulations on the horizon, you'd better use due and proper care in protecting your data. We don't know about every type of threat that exists for networks as a whole (as I mentioned, this is all fairly new), and we can't simply "not act" - what would be the point in having a network that you can't use? However, we can protect against those threats that we do know about, and we can attempt to be as prepared as possible for those threats we don't know about yet.
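
The endpoint cost is easy to measure for yourself. A rough sketch using the third-party cryptography package; the payload size and iteration count are arbitrary, and Fernet base64-encodes its output, so the size overhead shown is larger than a typical VPN's would be.

```python
# Rough measurement of encryption cost and size overhead on the encrypting endpoint.
# Requires the third-party "cryptography" package; numbers are illustrative only.
import os
import time
from cryptography.fernet import Fernet

payload = os.urandom(1400)            # roughly one Ethernet-sized chunk of data
fernet = Fernet(Fernet.generate_key())
iterations = 10_000

start = time.perf_counter()
for _ in range(iterations):
    token = fernet.encrypt(payload)
elapsed = time.perf_counter() - start

print(f"{iterations} encryptions of {len(payload)} bytes took {elapsed:.2f}s")
# Fernet output is base64-encoded and authenticated, hence the large size increase.
print(f"ciphertext is {len(token)} bytes ({len(token) - len(payload)} bytes larger)")
```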

There's quite a bit more to worry about on a corporate network than "fear itself".

Your popup idea is already in use; look at Symantec, McAfee and Trend Micro for network-based virus scanners. "IT Guys" (of which I am one) ignore a patched virus because it's patched and we don't have to worry about it anymore. Once it's patched, any computer scanned by the network scanner becomes immune because the virus gets caught (by the patch). A patch applies (usually) to a virus scanner, enabling it to catch more viruses.

You're right about Apache being the leading web server, but again, Apache has nothing to do with anything but serving web pages. Apache does not do networking, and it doesn't run applications (these are done with C++, Perl, PHP, Java and the client's browser). HTML stands for "HyperText Markup Language," and httpd is the "HyperText Transfer Protocol Daemon." The key word here is "HyperText"; while it can do linking, image displays and embedded applications, that's pretty much all it does - after all, it's just text. It doesn't even do the databasing (usually handled by MySQL). While I agree that one day we will all use an internet-based operating system, it will have to be far more robust than just hypertext. As an aside, Google is working on just that - an internet-based operating system.

Firewall software on a server is the last line of defense. There is a reason servers have built-in firewalling capabilities. If your server is being hit so hard the firewall is impacting its performance, the reason it's being hit so hard should be dealt with (if it's an attack), or another server should be set up and load balancing applied (if it's not an attack - also with its own firewall), or, in rare occurrences (the served data needs to come from the same source for some strange reason), it should have a dedicated firewall in front of it. Note that before this "last line" firewall has to do anything, every firewall between it and the outside world would have had to be breached. This also means that if the server is protected by only one firewall in front of it and that system goes down or gets taken offline for any reason, the server isn't left unprotected.

You're right, machines can't think. They do what we tell them to, they know what we tell them to know. All of the software giants have been virused for this reason: We're human, as such we all make mistakes. In the early years of the internet, we found that we were way too trusting, that if someone finds a hole in your network they will exploit it. If someone finds a hole in your application, they will exploit it. Humans write all the software and as such, it's prone to error.

As to gathering intelligence, that's what honeypots are for. Systems designed specifically to attract crackers and log everything they do.

There's no way a human brain can sit at a terminal and scan 8 million packets per second, logging where each one came from, where it went to, how it made the request, whether it was a crafted packet, and so forth. If your servers aren't doing any logging, there's nothing a human can do, except (perhaps) find out about it later. If your servers are logging, they can be dumping packets as well. A properly configured Linux box running iptables is firstly more secure than one not, and secondly faster than one not, because it's not processing anything but what it thinks are good packets and it's dumping everything else. On top of stateful inspection, it's also doing traffic shaping, resulting in zero collisions on the NICs, meaning packets don't have to be resent, resulting in lower bandwidth use and faster response times. I'd like to see a human do that. You talk about a human knowing how to configure the machine properly; that's exactly what I'm saying - configuring a server properly includes setting up the firewall on the server. Besides, as I said before, it was a human (or a group of them) who wrote the software to begin with. If you don't have that firewall set up, how will you know if the unspeakable happened and someone actually got in? Or are you going to wait until the home page of your dot com has been replaced with something horrific?
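
To make the "properly configured iptables" point concrete, the core of such a policy is only a handful of rules: default-deny inbound, accept replies to established traffic, open the services you actually run, and log what gets dropped. A minimal sketch driven from Python; the open ports are placeholders, it must run as root, and it is not a production ruleset.

```python
# Sketch of a minimal stateful iptables policy: default-deny inbound, allow
# replies to outbound traffic, open selected services, log everything dropped.
# Must run as root on a Linux host with iptables; ports are placeholders.
import subprocess

OPEN_TCP_PORTS = ["80", "22"]   # placeholder services (web, ssh)

rules = [
    ["iptables", "-P", "INPUT", "DROP"],                       # default-deny inbound
    ["iptables", "-A", "INPUT", "-i", "lo", "-j", "ACCEPT"],   # allow loopback
    ["iptables", "-A", "INPUT", "-m", "state",
     "--state", "ESTABLISHED,RELATED", "-j", "ACCEPT"],        # stateful replies
]
for port in OPEN_TCP_PORTS:
    rules.append(["iptables", "-A", "INPUT", "-p", "tcp",
                  "--dport", port, "-j", "ACCEPT"])            # permitted services
rules.append(["iptables", "-A", "INPUT", "-j", "LOG",
              "--log-prefix", "dropped: "])                    # log before policy drop

for rule in rules:
    subprocess.run(rule, check=True)
```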

Yes, I know about the little radio button in most small routers, and I agree: they provide a false sense of security. As I said, a DMZ is more than software, more than hardware. The DMZ, as I stated in my other post, is there as the place you want the fire directed to if one should break out.

With that I'll close. This thread is getting to be something better suited to a regular forum - and Lord knows there are plenty of them out there dealing with just these types of issues.

StevePimer (Asker)

Thanks for all the technical responses. I didn't mean to start a firestorm with my simple question, but I have enjoyed the show nonetheless. To make things easy at the sites, I am planning on using the Linksys BEFVP41 VPN router/firewall as my first entry point behind the cable modem. That will allow me to set up VPN tunnels to the sites I want to use in the wide area network.

Then I am going to replace the Linksys WRT54G wired/wireless router with a simple Linksys access point to bring back the wireless aspect of the network. I am doing this because the WRT54G has no option to set it up as a simple access point (only as a router or gateway).

I will also set up an 8-port 10/100 switch to accommodate the wired connections I need (5 connections) when I remove the 4-port WRT54G router.

That setup should give me all the functionality I want.

Once again, thanks.