Solved

htaccess and directory listing

Posted on 2013-12-04
310 Views
Last Modified: 2013-12-06
Hi,

I have a "downloads" directory on my site that contains files.
I need to prevent directory listing directly from the browser,
ex.  http://mysite/downloads/

but I need to enable it when I link to it inside index.html,
ex.  <A href="/downloads/">downloads</A>

How can I do such a thing?

Note: I have succeeded in preventing direct access to files of type jpg like this:

http://localhost/downloads/sample.jpg
(htaccess)
RewriteCond %{HTTP_REFERER} !^http://(www\.)?localhost [NC]
RewriteRule \.(exe|jpg)$ - [F]

But how do I create an expression for the whole folder? I mean, disable access when it is accessed from http://mysite/downloads/ but enable it from <A href="/downloads/">downloads</A>.

Please help.
Thanks a lot.
Question by:weissman
6 Comments
 
LVL 19

Expert Comment

by:xterm
ID: 39696124
When you put a hyperlink to /downloads in the code, and the user clicks it, they're just going to pull up the page and be subject to the same htaccess rules as when they first loaded the page.

If you want to show the files in a directory with browsing disabled by htaccess (or by httpd.conf - you can turn off the Indexes option there too), then you will need to script something to read the directory and dynamically build the page.  It's very easy in PHP if you have that installed on your server.  Here is a sample script:

<?php

// Directory whose contents we want to list
$folder = "/var/www/html/ee";

$handle = opendir($folder);
while (false !== ($filename = readdir($handle))) {
        // Skip the "." and ".." entries
        if ($filename != ".." && $filename != ".") {
                echo '<a href="' . htmlspecialchars($filename) . '">' . htmlspecialchars($filename) . "</a><br>\n";
        }
}
closedir($handle);

?>



Author Comment

by:weissman
ID: 39698494
First, I would like to thank you for your help.

The problem is that my boss prefers not to use PHP, and to use a configuration in httpd.conf instead.

He is convinced that there should be a solution without rewriting the page in PHP.

I have found this link:

http://perishablepress.com/stupid-htaccess-tricks/#sec3

I don't know if I can write a rule for a directory.

What do you think?

Thanks.

Expert Comment

by:xterm
ID: 39698524
It doesn't have to be PHP; it could be CGI, SSI, or Java(Script).  But the point is: if you deny clients the ability to list a directory, then the only way you're going to be able to show them its contents is by having the web server scan that folder and generate a dynamic listing.

You have to understand that <a href=/downloads> doesn't actually do anything until somebody clicks it.  And when they click it, they load the folder just as if they'd gone to it directly from a bookmark or some other method.  In either (and every) case, they will then be subject to whatever rules are in the .htaccess file for that directory (or in httpd.conf if specified there instead, but ultimately it's the same thing).
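For the record, directory listings themselves are governed by the Indexes option of mod_autoindex, not by the referer.  A minimal .htaccess sketch to turn them off (assuming AllowOverride permits Options for that directory) would be:

```apache
# Disable mod_autoindex's automatic listing for this directory.
# A request for /downloads/ then returns 403 Forbidden unless a
# DirectoryIndex file (e.g. index.html) exists in the folder.
Options -Indexes
```

Individual files under /downloads/ remain downloadable by direct URL; only the auto-generated listing goes away.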

Author Comment

by:weissman
ID: 39698669
Thanks again.

I understand from your answer that:

<a href=/downloads/> is equal to <a href=http://www.example.com/downloads/>

Is that right?

I just want to be sure: are you certain there is no solution that gives me access from a relative URL:

Relative: /images/downloads/

but forbids access from an absolute URL:

Absolute: http://www.example.com/downloads/

The issue is that I need directory listing, but not from an absolute URL.

Thanks very much.

Accepted Solution

by:xterm (earned 500 total points)
ID: 39698829
The relative link vs. the absolute link is only a shortcut for referencing files in your code; the web server will see the two requests as the same thing.  The only difference is that one will have a different referer.  See my two Apache logs below:

Direct access to the full http:// URL:
10.0.0.1 - - [05/Dec/2013:10:51:10 -0600] "GET /ee/downloads/ HTTP/1.1" 304 - "-" "Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.66 Safari/537.36"

Access via a relative hyperlink on the page above:
10.0.0.1 - - [05/Dec/2013:10:51:13 -0600] "GET /ee/downloads HTTP/1.1" 301 237 "http://mydomain.net/ee/downloads/" "Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.66 Safari/537.36"

So one option would be: instead of turning off indexes and trying to allow them based on referer (which won't work), you could redirect anybody back to your main site if they weren't referred from there.

So, something like this for .htaccess in the downloads directory:

RewriteEngine on
# Only act when a Referer header is present at all...
RewriteCond %{HTTP_REFERER} .
# ...and it does not point at our own index page
RewriteCond %{HTTP_REFERER} !http://yoursite\.com/index\.html [NC]
RewriteRule ^(.*)$ http://yoursite.com/index.html [R=301,L]



That says: if the referring page was anything other than yoursite.com/index.html, redirect the visitor to yoursite.com/index.html.

Author Comment

by:weissman
ID: 39700617
Thanks very much. I think your suggested solution is perfect.