Solved

robots.txt explained

Posted on 2012-03-12
445 Views
Last Modified: 2012-03-13
1) Does every website have a robots.txt file?
2) Is every website's robots.txt file publicly accessible/downloadable? If so, from where?
3) What is the point of them? For example, if you have an entry in robots.txt for /admin, and the file is publicly accessible, what has it actually solved? I.e. how are you any better off hiding /admin from Google if someone can download your robots.txt and see that you actually have an /admin directory on the server?

I can't see the logic, or what the point of adding entries to it is.
Question by:pma111
2 Comments
 
LVL 53

Accepted Solution

by COBOLdinosaur (earned 250 total points)
ID: 37710248
No, not every site has one.

It needs to be accessible so the 'bots can find it. The purpose of robots.txt is to tell the spiders not to index things; without direction they will index everything they find. If you don't want the directory name exposed, put it inside a higher-level folder and deny at that higher level.
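For example, a minimal robots.txt along those lines might look like this (the directory names are placeholders, not anything from the question):

    # Block crawling of the whole parent folder; /private/admin/ is never named.
    User-agent: *
    Disallow: /private/

A compliant crawler skips everything under /private/, so the robots.txt file never reveals that an admin directory exists inside it.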

You prevent the public from actually accessing the admin area, or anything else, with .htaccess.
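For instance, a minimal .htaccess sketch placed inside the admin directory (Apache 2.2 syntax; the IP address is a placeholder):

    # Deny everyone except one trusted address.
    Order deny,allow
    Deny from all
    Allow from 203.0.113.5

Unlike a robots.txt entry, this is enforced by the server, so it works even against visitors who ignore robots.txt.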

If you have something that is sensitive, it should not be on the web server at all, because a hacker will always find a way to see it.


Cd&
 
LVL 15

Assisted Solution

by Ess Kay (earned 250 total points)
ID: 37710267
1) No.
2) Yes, typically at website.com/robots.txt.
3) The entries here are for web-crawling robots. They allow or disallow crawling of certain sections of your website, and thus whether those sections get indexed into search engines such as Google, Yahoo, and Bing.


If you have certain pages that you don't want added to the search engines, such as your admin login page, you might want to add them here so that common folk will not see them when they search for your site.



As far as true hackers go, a robots.txt file will not stop them.



You don't have to add all of the admin pages, only the first portal, as in the sketch below.

Once the robot stops there, it will not go further through that page's links.
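For instance (a sketch; /admin/login.php is a placeholder for your actual portal page):

    User-agent: *
    Disallow: /admin/login.php

A compliant crawler never fetches the login page, so pages reachable only through its links are never discovered, though anything linked from elsewhere can still be found.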
