Solved

Protecting a directory; allowing one referrer and a second "side door"

Posted on 2005-03-04
Medium Priority
162 Views
Last Modified: 2010-05-19
Our group maintains its administrative records on its main site; when members log in, they should then have permission to download PDFs from the other web site (on which the members' archives are kept).  The devil is in the details:

1.  Will the .htaccess solution that is applied to the /protected/ directory then automatically apply to subdirectories within, even if created later?

2.  Is it possible to protect a directory so that only:
      referrals from a particular directory on the main site (another domain), and
      referrals from a particular directory on the host (same domain) are allowed?
      In essence, two ways to log in (through the main site or through the document server).

I've not been able to write an .htaccess file that accomplishes this.  I'm assuming that .htaccess is the best approach here, as we'll have a lot of PDFs and other documents to protect.  Authentication actually occurs via ASP and an Access DB, which then issues either an "OK" or "not OK" Response.Redirect - unless a session or cookie variable can do the same thing?  We're an educational research association, so the PDFs don't contain sensitive information; it's the volume of material and the search capabilities that are the benefit for our members - if I can get it to work.
Question by:rickxaver
12 Comments
 
LVL 4

Expert Comment

by:Devastated
ID: 13464580
Maybe it's me, but I can't see a specific language for this question. I don't know about the .htaccess method, I'm afraid, but ...

A solution lies with .NET (I use ASP.NET to accomplish this very nicely)...

The method you can use is called Forms Authentication; alternative methods actually authenticate using Microsoft Passport or Windows Server user details...

An excellent page for you to read to get a basic understanding is http://www.15seconds.com/issue/020220.htm

The way to secure locations is simply a <location> element that must be put inside the web.config file - along with other details, of course.
 
LVL 10

Expert Comment

by:eeBlueShadow
ID: 13465606
Not only is it possible, it's actually quite neat, using mod_rewrite

use the following:

------------------------------
RewriteEngine on
# Stops blank referrer - means people can't just type the address into the address bar
RewriteCond %{HTTP_REFERER} !^$
# Set allowed domains in the following way
# the pattern means 'If the HTTP_REFERER doesn't start with...' - since REFERER gives a page rather than a host
RewriteCond %{HTTP_REFERER} !^http://localhost
RewriteCond %{HTTP_REFERER} !^http://www.allowed.com
# If the above are true (referer doesn't match any of above) redirect to a forbidden page
RewriteRule ^$ / [F]
------------------------------

_Blue
 
LVL 10

Expert Comment

by:eeBlueShadow
ID: 13465616
eek, sorry, ignore that, it's not actually finished, and doesn't take into account your authentication
 
LVL 10

Expert Comment

by:eeBlueShadow
ID: 13465678
OK, one .htaccess method to do this is as follows:

------------------------------
RewriteEngine on
# Stops blank referrer - means people can't just type the address into the address bar
RewriteCond %{HTTP_REFERER} ^$ [OR]
# Set allowed domains in the following way
# the pattern means 'If the HTTP_REFERER doesn't start with...' - since REFERER gives a page rather than a host
RewriteCond %{HTTP_REFERER} !^http://localhost/allowed_dir/
RewriteCond %{HTTP_REFERER} !^http://www.allowed.com
# If the above are true (referer doesn't match any of above) redirect to a forbidden page
RewriteRule ^.*$ / [F]
------------------------------

Using this method, only links from the specified sites/directories will be allowed through. However, really determined people can make their computer fake the HTTP_REFERER header, and therefore get around this protection if they work out what's going on.
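
To map that onto the original question (one allowed referrer directory on the main site's domain, plus a second "side door" directory on the document server itself), the two conditions might look something like the sketch below - www.main-example.org, docs.example.org and the directory names are only hypothetical placeholders, not real values from this thread:

------------------------------
RewriteEngine on
# Block requests that arrive with no referrer at all (typed straight into the address bar)
RewriteCond %{HTTP_REFERER} ^$ [OR]
# Allow links coming from the members' area on the main site (the other domain) ...
RewriteCond %{HTTP_REFERER} !^http://www.main-example.org/members/
# ... and links coming from the login directory on the document server (same domain)
RewriteCond %{HTTP_REFERER} !^http://docs.example.org/login/
# Everything else gets a 403 Forbidden
RewriteRule ^.*$ - [F]
------------------------------

As far as I know, directives placed in the /protected/ .htaccess also apply to its subdirectories (including ones created later), unless a deeper .htaccess overrides them - which would cover question 1.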

_Blue
 

Author Comment

by:rickxaver
ID: 13474411
eeBlueShadow, thanks, I think I had most of the right parts in most of the wrong places.  Now may I throw a curve?  I will need to add an authorized user (a search engine spider, actually) that can peruse this directory.  The search engine just needs to know a username and password to do so, but I'm not sure how to add this single-user authorization to this?
 
LVL 10

Expert Comment

by:eeBlueShadow
ID: 13474905
It's possible - does this spider know how to deal with the 401 HTTP header?
 
LVL 10

Expert Comment

by:eeBlueShadow
ID: 13474924
Or, if this spider needs to authenticate to your ASP/Access system, that's a bit trickier, and likely I won't be able to answer that (but I'll give it a go if you let me know)
 

Author Comment

by:rickxaver
ID: 13475986
Yes, the spider knows how to deal with a 401; no, ASP authentication isn't an issue.  ...thanks!
 
LVL 10

Accepted Solution

by:
eeBlueShadow earned 2000 total points
ID: 13476249
OK, I have to admit I can't see an obvious way to get the 401 authentication to work at the same time as the rewrite, without requiring everyone to see the authentication request. The easiest way I can see to do this is to add another RewriteCond to deal specifically with the spider's IP address:

RewriteCond %{REMOTE_ADDR} !^xxx\.xxx\.xxx\.xxx$

This is definitely not an ideal solution, but I'm afraid I can't think of a better one at the moment - I very rarely use Apache authentication.
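
For clarity, here is a minimal sketch of how that extra condition might sit alongside the earlier referrer checks - the xxx placeholders stand for the spider's real IP address, and the allowed referrer URLs are just the example values used earlier in this thread:

------------------------------
RewriteEngine on
# Requests from the spider's IP are exempted: if REMOTE_ADDR matches,
# this condition fails and the [F] rule below never fires
RewriteCond %{REMOTE_ADDR} !^xxx\.xxx\.xxx\.xxx$
# Block blank referrers (addresses typed straight into the browser) ...
RewriteCond %{HTTP_REFERER} ^$ [OR]
# ... and referrers outside the two allowed locations
RewriteCond %{HTTP_REFERER} !^http://localhost/allowed_dir/
RewriteCond %{HTTP_REFERER} !^http://www.allowed.com
# Everything else is refused with 403 Forbidden
RewriteRule ^.*$ - [F]
------------------------------

The conditions are ANDed together (apart from the explicit [OR] pair), so a request is only forbidden when it does not come from the spider's IP and its referrer is blank or from neither of the two allowed locations.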
 

Author Comment

by:rickxaver
ID: 13476519
"The easiest way I can see to do this is to add another RewriteCond to deal specifically with the spider's IP address"

This might work - if it does not, I'll work around the spider issue by crawling local files instead of what's on the website (not ideal, but OK).
We'll be done, you'll get the points, and 2500 members of an educational research association will have benefited from your advice.  Take a bow!