Solved

Web strippers and server security

Posted on 2004-08-22
Medium Priority
263 Views
Last Modified: 2010-08-05
I tested WebStripper (www.webstripper.net) and I was able to download my own web site.
I am using IIS 5.1 on XP Pro. What is the correct way to secure the web site on IIS and prevent web strippers from downloading the entire site? My concern is the ASP sitting on the server and also the Access database files.
Question by:webtrack123
6 Comments
 
LVL 33

Assisted Solution

by:shalomc
shalomc earned 270 total points
ID: 11870880
There are a few rules to follow that will greatly add to the security of your web site.

1: Place all critical back-office files (including the database) in a separate directory tree, or better yet, on a different drive.

2: Configure your IIS carefully. Remove execute and script permissions from directories that do not contain scripts.

3: Rename all your include files to the .asp extension, so that a direct request for them is processed by the ASP engine instead of being served as readable source (see the small example after this list).

4: Use NTFS ACLs. Remove all unnecessary permissions from non-IIS folders.

5: Use IISLOCKDOWN and URLSCAN to secure your IIS.

6: Review all of the HTML comments for sensitive information.


These measures will not stop webstripping tools from downloading your site as HTML, but they will at least stop any classified information from leaking out.
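
To illustrate point 3, here is a minimal sketch (the file names and the Access path are made up for the example): an include kept with an .asp extension is run by the ASP engine if someone requests it directly, so only its output (nothing, in this case) is returned rather than the source.

<%
' settings.asp - hypothetical include holding the connection string.
' Kept with an .asp extension so a direct request is processed by IIS
' rather than served as plain text. Per point 1, the .mdb sits outside
' the web root.
Dim connString
connString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\data\site.mdb"
%>

And a page that uses it:

<!-- #include file="settings.asp" -->
<%
' default.asp - hypothetical page pulling in the include above.
Response.Write "Page rendered; the connection string never leaves the server."
%>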


ShalomC
 
LVL 14

Expert Comment

by:alimu
ID: 11878095
In addition to comments from ShalomC, check out this link for instructions on how to configure a robots.txt file: http://support.microsoft.com/default.aspx?scid=kb;en-us;217103
http://www.webmasterworld.com/forum93/140.htm also has a link to a sample robots.txt file.

This file gives directives to web crawlers about which pages they are allowed or disallowed from crawling.
Be aware that there are "friendly" robots you may want to allow (e.g. Google) so that your site can be indexed by search engines.
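
For reference, a minimal robots.txt along those lines might look like this (a sketch only; the idea is to let a friendly crawler such as Googlebot in while telling everything else to stay out - and see the caution in the next comment before listing real directory names):

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

An empty Disallow line means "nothing is disallowed", so Googlebot may crawl the whole site, while other robots that honour the file are asked to crawl nothing.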
 
LVL 33

Expert Comment

by:shalomc
ID: 11878229
Hey,
On the one hand, the robots.txt file is great for setting rules for friendly crawlers.
Non-friendly crawlers, on the other hand, ignore it altogether.
On the gripping hand, the robots.txt file may disclose to unfriendlies more information than you planned for, like your entire directory structure.

So, from a security point of view, be very careful about what you put in the robots.txt file. For example, if you have a testing directory, do not put it in this file.

ShalomC
 

Author Comment

by:webtrack123
ID: 11898701
Thank you ShalomC,
regarding your reply:

1. Done
2. Done
3. Done
4. Could you send me some links about ACLs? I am not clear on what this is.
5. Also, what are IISLOCKDOWN and URLSCAN?
6. Done

I just need clarification about points 4 and 5 above.
Regarding robots, I will follow your proposals.

Alimu thank you for your input.
 
LVL 14

Accepted Solution

by:alimu
alimu earned 150 total points
ID: 11898879
There is a tool put out by MS called the Microsoft Baseline Security Analyzer: http://www.microsoft.com/technet/security/tools/mbsahome.mspx
It will give you some detailed information on your server's current state and how you can lock it down further.
Suggestions also have links to get to required hotfixes and tools (like the security toolkit that includes IISLockdown and URLScan).

With respect to the comments from ShalomC, I am not sure whether the robots configuration will stop WebStripper (which can masquerade as a browser), but I have noticed there are many robots.txt files out there that contain a line denying access to it, so it's worth a try.
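
For what it's worth, that denial is usually just this pair of lines in robots.txt (WebStripper is the user-agent string the tool reports by default, as far as I know; a copy masquerading as a browser will simply ignore it):

User-agent: WebStripper
Disallow: /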

Bear in mind that anything you put on the internet (unless secured by password or other means) is available to all.  Site crawlers such as webstripper behave - and often look - like offline browsers.  If you are concerned about the security of your information either secure it in some way or think twice about putting it out there.  
AJ.
 
LVL 33

Expert Comment

by:shalomc
ID: 11900011
Hey,
When I said ACLs, I meant the inherent security and permissions system built into the NTFS file system.
Since IIS runs in some security context under some account, you should limit that account to only what it needs to run the web site.
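
As a rough sketch of what that looks like in practice, on XP you can inspect and edit NTFS ACLs from the command line with cacls (the folder path is an example, and the anonymous IIS account is named IUSR_<machine name>, so substitute your own machine name):

rem Show the current ACL on the data folder
cacls D:\data

rem Remove the Everyone group's entry, editing the existing ACL (/E) rather than replacing it
cacls D:\data /E /R Everyone

rem Grant read-only access to the anonymous IIS account (name is machine-specific)
cacls D:\data /E /G IUSR_MYMACHINE:R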
IISLockdown and URLSCAN can be found here
http://www.microsoft.com/windows2000/downloads/recommended/iislockdown/default.asp

Google for a lot of reference information.

ShalomC
