nigelsponge

asked on

Keeping PDF files secure from Google search

I have created a secure log-in for a client who has a page with links to PDF files. The page is secure and you must log in to see these links. The problem is that a user was searching Google and found a link to a PDF file, which he was able to download. With a previous client where I created a digital download, I prevented users from browsing directly to the files by creating a folder outside the web root and putting all the files in there. With the server I am working with in this case, they don't allow any access to files outside the web root. I know I can create an .htaccess file for the particular folder, but we are trying to avoid having the user log in to the page and then log in again every time they want to look at a PDF file.

Is there another way to prevent a user from searching on google and accessing the pdf file directly?

aleghart

Google cannot search inside database-driven pages; it relies on crawling links. That is why database-driven e-commerce sites generate thousands of crawler-compatible pages.

Have you thought about storing the files in a database? The user's logon and clicks would be passed to a backend SQL database, which fetches the files.
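
For example, a fetch script along these lines could do it (only a rough sketch: the "documents" table, its columns, the PDO connection details, and the $_SESSION['logged_in'] flag are all assumptions standing in for whatever your own login code and schema use):

<?php
// getfile.php - sketch of serving a PDF stored in a SQL table
session_start();
if (empty($_SESSION['logged_in'])) {      // hypothetical flag set by your login page
    header('HTTP/1.1 403 Forbidden');
    exit('Please log in first.');
}

$id  = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$pdo = new PDO('mysql:host=localhost;dbname=clientsite', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT filename, pdf_data FROM documents WHERE id = ?');
$stmt->execute(array($id));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$row) {
    header('HTTP/1.1 404 Not Found');
    exit('No such document.');
}

header('Content-Type: application/pdf');
header('Content-Disposition: inline; filename="' . $row['filename'] . '"');
echo $row['pdf_data'];

Since there is no direct URL to a .pdf file on disk, Google has nothing to crawl except the script, and the script refuses to hand anything out without the login.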
nigelsponge

ASKER

This is a good idea, I will try this and get back.

Thank You, VJC
ASKER CERTIFIED SOLUTION
noci

Other things:

Don't allow directory access; allow access only through an index.html/index.php, etc.
Use robots.txt as a guard (and allow it to be read).
Place the files in a subdirectory.

Set up directory security on the subdirectory to disallow anonymous access and require basic authentication, or edit the server configuration and use a <Directory> block:
<Directory /path/to/locked/dir>
  Deny from all
</Directory>

In the parent directory, create a script that, when run, opens a file in the subdirectory and sends it to the user's browser (bypassing the directory security).

Make your special script validate that the user has authenticated with the secure login.
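
A minimal sketch of such a pass-through script (the protected/ folder name and the $_SESSION['logged_in'] flag are assumptions; adapt them to your own login code):

<?php
// download.php - lives in the parent directory; the PDFs sit in ./protected/,
// which Apache refuses to serve directly (Deny from all).
session_start();
if (empty($_SESSION['logged_in'])) {               // flag your login page would set
    header('HTTP/1.1 403 Forbidden');
    exit('Please log in first.');
}

// basename() strips any path components, so a request cannot escape the folder
$file = basename(isset($_GET['file']) ? $_GET['file'] : '');
$path = dirname(__FILE__) . '/protected/' . $file;

if ($file === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit('File not found.');
}

header('Content-Type: application/pdf');
header('Content-Disposition: inline; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);

Google can still discover download.php, but without the session cookie it only ever gets the 403, never the PDF.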


Either that, or use .htaccess site-wide for all client pages, and place them all as subdirectories of the common directory.

Instead of implementing login/logout yourself, have the webserver and the web browser handle it.
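
For example, an .htaccess (or the matching section in the server config) along these lines does it; the AuthUserFile path is just a placeholder:

AuthType Basic
AuthName "Client documents"
AuthUserFile /path/to/.htpasswd
Require valid-user

The browser prompts once and then resends the credentials automatically on every request, so the user does not have to log in again for each PDF.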

Use the server-side variables within your mod_php scripts to determine which user is currently logged in.

$_SERVER['PHP_AUTH_USER']
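
For instance, once Apache has enforced basic authentication on the directory, a script can simply read the name:

<?php
// Apache has already rejected anyone without valid credentials,
// so the authenticated username is available to the script here:
$user = isset($_SERVER['PHP_AUTH_USER']) ? $_SERVER['PHP_AUTH_USER'] : '';
if ($user === '') {
    // Only happens if the directory is not actually protected
    header('HTTP/1.1 401 Unauthorized');
    exit;
}
echo 'Logged in as ' . htmlspecialchars($user);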




Search engines are supposed to respect the webmaster's wishes, so robots.txt and nofollow/noindex tags should be used, as linked to above:
http://www.google.com/support/webmasters/bin/answer.py?answer=93708&topic=8846
It works for us; Yahoo, Cuil, MSN, Google, and even Archive.org all adhere to our wishes via robots.txt and the nofollow tags. http://www.archive.org/about/exclude.php
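
For example, a robots.txt along these lines (the folder name is just a placeholder) asks crawlers to stay out of the PDF area, and a noindex/nofollow meta tag does the same for individual pages; both are advisory only, so the authentication above is still what actually blocks access:

User-agent: *
Disallow: /client-docs/

<meta name="robots" content="noindex, nofollow">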
-rich
thank you