Solved

robots.txt - Allow only one file to be indexed

Posted on 2011-02-25
356 Views
Last Modified: 2012-05-11
I need to allow the bots to index only one file on my site. Will that also hurt me as far as Google AdWords is concerned?
How would I limit it to one file?
I tried this and it didn't work:
Allow: /SearchGBY.php
Disallow: /


Question by:EddieShipman
7 Comments
 
LVL 9

Assisted Solution

by:rawinnlnx9
rawinnlnx9 earned 100 total points
ID: 34981552
This seems to be a fully fledged discussion of how to do this:

http://en.wikipedia.org/wiki/Robots_exclusion_standard
 
LVL 3

Accepted Solution

by:
sergiobg57 earned 400 total points
ID: 34981638
Are you sure it didn't work?

User-agent: *
Allow: /SearchGBY.php
Disallow: /



Remember, if I remember correctly, even after you ask for something not to be indexed, anything that was already indexed stays in the results until you ask Google to remove the data.
So you might still be seeing the indexed content because of Google's cache.
Check whether that cached copy is outdated; if it is, that's proof the new rules are working.
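One way to sanity-check those rules locally is Python's urllib.robotparser (a sketch; example.com is just a placeholder host):

```python
import urllib.robotparser

# The rules from the robots.txt above; order matters to most parsers,
# so the Allow line comes before the blanket Disallow.
rules = """\
User-agent: *
Allow: /SearchGBY.php
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/SearchGBY.php"))       # True
print(rp.can_fetch("*", "http://example.com/somewhere-else.html")) # False
```

Only /SearchGBY.php should come back fetchable; every other path falls through to the Disallow.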
 
LVL 26

Author Comment

by:EddieShipman
ID: 34981855
I'm not 100% sure it didn't work, but with my old robots.txt, the bots were bringing my site to its knees when crawling that file.
I have modified the file to work much better and want to allow that file only. But I think it may also be harming my Google ads,
as now I only get public service ads.
 
LVL 3

Expert Comment

by:sergiobg57
ID: 34982027
If that's your worry, you should set Crawl-delay to make the crawling load less punishing.
The fact is that Google's bots will follow the links inside your PHP file to index them.
It's very likely that even denying access won't work as expected.
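For reference, the Crawl-delay line goes in the same User-agent block (a sketch; note that Crawl-delay is a nonstandard extension — Bing and Yahoo honored it, but Googlebot ignores it, and Google's crawl rate is set through its webmaster tools instead):

```
User-agent: *
Crawl-delay: 10
Allow: /SearchGBY.php
Disallow: /
```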

Also, you can use a meta tag in your files to tell the bots to come back only after some days.
If you're interested, I can get it from my web site for you.
I've also had problems with bots.
 
LVL 26

Author Comment

by:EddieShipman
ID: 34983247
Please do. I added the Crawl-delay today but haven't seen the bots touch my site yet.
The file SearchGBY.php is actually an indexing script for my whole database of 2.5 million+ records,
paginated at 10,000 records per page.

I found that one thing causing the slowdown was that the SQL to fetch those 10,000 records
was slow because the index wasn't optimal. I fixed that and now the script is blazingly fast.
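For what it's worth, the offset arithmetic behind that kind of pagination is easy to sketch (hypothetical sketch using the numbers from this thread; `page_bounds` and its use are illustrative, not the actual SearchGBY.php code):

```python
# Offset math for a paginated indexing script: 2.5 million records,
# 10,000 per page (numbers taken from the discussion above).
TOTAL_RECORDS = 2_500_000
PAGE_SIZE = 10_000

def page_bounds(page):
    """Return (offset, limit) for a 1-based page number,
    for use as SQL LIMIT/OFFSET."""
    return (page - 1) * PAGE_SIZE, PAGE_SIZE

num_pages = -(-TOTAL_RECORDS // PAGE_SIZE)  # ceiling division
print(num_pages)        # 250
print(page_bounds(1))   # (0, 10000)
print(page_bounds(250)) # (2490000, 10000)
```

Note that with plain LIMIT/OFFSET the database still has to skip past all the earlier rows on deep pages, which is exactly why a good index (or keyset pagination on an indexed column) matters at this scale.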
 
LVL 3

Expert Comment

by:sergiobg57
ID: 34984984
<meta name="revisit-after" content="2 days" />



It can be placed between the <head> tags of the document.

Now, if you're using a framework, it's much more likely that there's an option to place this meta tag for you, and it would be better to use that option.
If you are, tell me which one it is and I can tell you how to do it.
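For placement, something like this (a minimal sketch; the title is a placeholder, and be aware the major search engines have long stated they ignore revisit-after, so treat it as a hint at best):

```
<!DOCTYPE html>
<html>
<head>
  <meta name="revisit-after" content="2 days" />
  <title>SearchGBY</title>
</head>
<body>
  ...
</body>
</html>
```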
 
LVL 26

Author Comment

by:EddieShipman
ID: 34988976
This one file is straight PHP and HTML, no framework involved.

Thanks for the info.