Solved

Dynamic robots.txt to block search engines from https versions.

Posted on 2009-04-02
632 Views
Last Modified: 2013-12-08
Hi,
We have a site at http://www.xxx.com and also the secured URL https://secure.xxx.com, which serves the same content as http://www.xxx.com. We already have a robots.txt file in the root directory of http://www.xxx.com, and it also serves https://secure.xxx.com. What we need is to block Google (and other search engines) from crawling the https version, so we are planning to create a new robots.txt with code that prevents crawling of the https version.
          We came across this link, http://www.kleenecode.net/2007/11/17/dynamic-robotstxt-with-aspnet-20/, which is very helpful because it explains how to serve different robots.txt files for http and https, and it includes code for a robots.txt that prevents crawling of the https version. To implement this, we followed the steps from the post:
1) Create a robots.txt with the code given in the post in the root of our web project. After that, as a check, I browsed the robots.txt file and could see the full code exactly as I typed it.
2) Get the path to the ASPX engine:
    a) Open IIS, right-click our website, and bring up the Properties screen.
    b) Go to Home Directory > Configuration, then to the Mappings tab.
    c) Locate the ASPX item and click Edit; copy the path in the Executable field and cancel out of that window.
3) Create the ISAPI entry for .txt:
    a) Still on the Mappings tab,
    b) click Add,
    c) populate the Executable path with the value copied in the last step,
    d) enter GET in the Limit To field,
    e) enter ".txt" in the Extension field (the post does not specify this step),
    f) and press OK to save all the changes.
To check the result, the post says we should get a blank page when we browse the robots.txt file. But we are not getting a blank page; we get a page showing the full code exactly as I typed it. A sketch of the kind of code involved is below.
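For reference, the code in the robots.txt is along these lines (this is a sketch from memory, not the post's exact code; once .txt is mapped to the ASPX engine, the file gets compiled and rendered like a page). As I understand it, if the mapping were working, the browser would show only the rendered output rather than the source, so seeing the raw code suggests the mapping is not actually being applied:

<%@ Page Language="C#" ContentType="text/plain" %>
User-agent: *
<%-- block all crawlers on the https host only --%>
<% if (Request.IsSecureConnection) { %>
Disallow: /
<% } else { %>
Disallow:
<% } %>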
Could anyone help me resolve this? If anyone has tried this dynamic creation of robots.txt, please share your experience as well.
Thanks in advance for any help.
Question by:olmuser
3 Comments
 
Expert Comment by: ahoffmann (LVL 51), ID: 24067222
Are you addicted to IIS?
With Apache you can use mod_rewrite to serve a different robots.txt.
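For example, a minimal mod_rewrite sketch in an .htaccess (untested; the secure-robots.txt file name is just an example):

RewriteEngine On
# requests that arrived over HTTPS get the restrictive file instead
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ secure-robots.txt [L]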
 
Accepted Solution by: Tiggerito (LVL 23, earned 250 total points), ID: 24070089
Your configuration lets you intercept requests in ASP.NET.

So what you need to do now is write the code that intercepts requests to robots.txt and dynamically writes the response.

One way to do this is to write and register an HTTP module (IHttpModule).

Here's some code I quickly whipped up that should do what you want: a secure robots.txt request will return the data from a secure-robots.txt file.

// web.config for registering the module (IIS 6 / classic pipeline; under the
// IIS 7 integrated pipeline it would go in system.webServer/modules instead)

<configuration>
	<system.web>
		<httpModules>
			<add type="FileMapperModule" name="FileMapperModule"/>
		</httpModules>
	</system.web>
</configuration>

// module that intercepts secure robots.txt requests
using System;
using System.Web;

public class FileMapperModule : IHttpModule
{
    public void Init(HttpApplication Appl)
    {
        Appl.BeginRequest += new EventHandler(Rewrite_BeginRequest); // intercept requests
    }

    public void Rewrite_BeginRequest(object sender, EventArgs args)
    {
        HttpApplication Appl = (HttpApplication)sender;

        // if it's a secure request for the robots.txt file
        if (Appl.Request.AppRelativeCurrentExecutionFilePath.StartsWith("~/robots.txt") && Appl.Request.IsSecureConnection)
        {
            Appl.Context.RewritePath("~/secure-robots.txt"); // serve the content of secure-robots.txt instead
        }
    }

    public void Dispose() { } // required by IHttpModule; nothing to clean up here
}
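secure-robots.txt itself (the name is just whatever the module rewrites to) would then hold the directives that keep all crawlers off the https version:

User-agent: *
Disallow: /

Note the module only sees the robots.txt request because of the .txt-to-ASPX mapping you already set up in IIS; without that mapping, IIS serves the file statically and ASP.NET never runs.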


