Solved

robots.txt - Allow only one file to be indexed

Posted on 2011-02-25
7
357 Views
Last Modified: 2012-05-11
I need to allow the bots to index only one file on my site. Will that also hurt me as far as Google AdWords is concerned?
How would I limit it to one file?
I tried this and it didn't work:
Allow: /SearchGBY.php
Disallow: /


Question by:EddieShipman
7 Comments
 
LVL 9

Assisted Solution

by:rawinnlnx9
rawinnlnx9 earned 100 total points
ID: 34981552
This seems to be a fully-fledged discussion of "HOW" to do this:

http://en.wikipedia.org/wiki/Robots_exclusion_standard
 
LVL 3

Accepted Solution

by:sergiobg57
sergiobg57 earned 400 total points
ID: 34981638
Are you sure it didn't work?

User-agent: *
Allow: /SearchGBY.php
Disallow: /


Remember, even if you ask for something not to be indexed after it has already been indexed, you'll need to ask Google to remove the data, if I remember correctly.
So you might still be seeing the indexed content because of Google's cache.
But check whether the cached copy is outdated; if it is, that's proof the rule is working.
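As a sanity check, the rules above can be tested locally with Python's standard-library robots.txt parser. This is just a sketch to confirm the logic; Googlebot itself resolves Allow/Disallow conflicts by longest matching path, so the exact-file Allow wins there too:

```python
from urllib.robotparser import RobotFileParser

# The exact rules from the answer above.
rules = """\
User-agent: *
Allow: /SearchGBY.php
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The one allowed file is fetchable; everything else is blocked.
print(rp.can_fetch("*", "/SearchGBY.php"))  # True
print(rp.can_fetch("*", "/index.php"))      # False
```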
 
LVL 26

Author Comment

by:EddieShipman
ID: 34981855
I'm not 100% sure it didn't work, but with my old robots.txt, the bots were bringing my site to its knees when crawling that file.
I have modified the file to work much better and want to allow only that file. But I think it may also be harming my Google Ads,
as now I only get public service ads.
 
LVL 3

Expert Comment

by:sergiobg57
ID: 34982027
If that's your worry, you should set Crawl-delay to make the crawling less punishing.
The fact is that Google's bots will follow the links inside your PHP file to index pages.
It's very likely that even denying access won't work as expected.

Also, you can use a meta tag in your files to tell the bots to come back only after a few days.
If you're interested, I can get it from my web site for you.
I've also had problems with bots.
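For reference, a Crawl-delay line goes inside the user-agent group, for example combined with the rules discussed above (a sketch; note that Bing and Yahoo honor Crawl-delay, while Googlebot ignores the directive and has its crawl rate adjusted through Google Webmaster Tools instead):

```
User-agent: *
Crawl-delay: 10
Allow: /SearchGBY.php
Disallow: /
```

The value is the number of seconds a compliant crawler should wait between requests.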
 
LVL 26

Author Comment

by:EddieShipman
ID: 34983247
Please. I added the Crawl-delay today but have not seen the bots touch my site yet.
The file SearchGBY.php is actually an indexing script for my whole database of 2.5mil+ records
paginated at 10,000 records per page.

I found that one thing that was causing the slowdown was that the SQL to get those 10,000 records
was slow because the index wasn't optimal. I fixed that and now the script is blazingly fast.
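The fix described above can be illustrated with a small self-contained sketch (SQLite via Python here; the table and column names are made up for illustration, not the asker's actual schema). Paginated queries like `ORDER BY ... LIMIT ... OFFSET ...` force a full scan and sort on every page unless the ORDER BY column is indexed:

```python
import sqlite3

# Toy stand-in for the 2.5M-record table (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO records (name) VALUES (?)",
                 [(f"record-{i:05d}",) for i in range(1000)])

# Without this index, each paginated query re-sorts the whole table.
conn.execute("CREATE INDEX idx_records_name ON records (name)")

# Page 3 of the listing, 10 rows per page.
page = conn.execute(
    "SELECT name FROM records ORDER BY name LIMIT 10 OFFSET 20"
).fetchall()
print(page[0][0])  # record-00020
```

With the index in place, the database walks the index in order and skips straight to the offset instead of sorting all rows per request.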
 
LVL 3

Expert Comment

by:sergiobg57
ID: 34984984
<meta name="revisit-after" content="2 days" />



It can be placed between the head tags of the document.

Now, if you are using a framework, it's likely there's an option that places this meta tag for you, and it would be better to use that option.
If you are, tell me which one it is and I can tell you how to do that.
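For example, a minimal page with the tag in place (a sketch; note that the major search engines largely treat revisit-after as advisory at best, so don't count on it alone to throttle crawling):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>SearchGBY results</title>
    <!-- crawler revisit hint discussed above -->
    <meta name="revisit-after" content="2 days" />
  </head>
  <body>
    ...
  </body>
</html>
```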
 
LVL 26

Author Comment

by:EddieShipman
ID: 34988976
This one file is straight PHP and HTML, no framework involved.

Thanks for the info.
