Solved

robots.txt - Allow only one file to be indexed

Posted on 2011-02-25
353 Views
Last Modified: 2012-05-11
I need to allow the bots to index only one file on my site. Will that also hurt me as far as Google AdWords is concerned?
How would I limit it to one file?
I tried this and it didn't work:
Allow: /SearchGBY.php
Disallow: /


Question by:EddieShipman
7 Comments
 
LVL 9

Assisted Solution

by:rawinnlnx9
rawinnlnx9 earned 100 total points
ID: 34981552
This seems to be a fully-fledged discussion of "HOW" to do this:

http://en.wikipedia.org/wiki/Robots_exclusion_standard
 
LVL 3

Accepted Solution

by:sergiobg57
sergiobg57 earned 400 total points
ID: 34981638
Are you sure it didn't work?

User-agent: *
Allow: /SearchGBY.php
Disallow: /



Remember, if you ask for something not to be indexed after it has already been indexed, you'll need to ask Google to remove the data, if I remember correctly.
So you might still be seeing the indexed content because of Google's cache.
Check whether the cached copy is outdated; if it is, that's proof it's working.
 
LVL 26

Author Comment

by:EddieShipman
ID: 34981855
I'm not 100% sure it didn't work, but with my old robots.txt, the bots were bringing my site to its knees when crawling that file.
I have modified the file to work much better and want to allow only that file. But I think it may also be harming my Google Ads,
as now I only get public service ads.
 
LVL 3

Expert Comment

by:sergiobg57
ID: 34982027
If that's your worry, you should set Crawl-delay to make the crawling less punishing.
The fact is that Google's bots will follow the links inside your PHP file when indexing.
It's very likely that even denying access won't work as expected.

Also, you can use a meta tag in your files to tell the bots to come back only after some days.
If you are interested, I can get it from my web site for you.
I also had problems with bots.
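
For example, the robots.txt could look roughly like this (the delay value below is just a placeholder; Crawl-delay is not part of the original standard, support varies by bot, and Googlebot generally ignores it in favor of the crawl rate setting in Webmaster Tools):

User-agent: *
# Ask compliant bots to wait 10 seconds between requests (placeholder value)
Crawl-delay: 10
# Allow only the search script and block everything else
Allow: /SearchGBY.php
Disallow: /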
 
LVL 26

Author Comment

by:EddieShipman
ID: 34983247
Please. I added the Crawl-delay today but have not seen the bots touch my site yet.
The file SearchGBY.php is actually an indexing script for my whole database of 2.5 million+ records,
paginated at 10,000 records per page.

I found that one thing causing the slowdown was that the SQL to get those 10,000 records
was slow because the index wasn't optimal. I fixed that, and now the script is blazingly fast.
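
To give an idea of the kind of change, something like this (table and column names here are simplified, hypothetical placeholders, and MySQL-style syntax is assumed; the real schema isn't shown in this thread):

-- Hypothetical sketch: make sure the column the paginated query sorts on is indexed,
-- so each 10,000-record page can be fetched without scanning the whole table.
ALTER TABLE records ADD INDEX idx_record_id (record_id);

-- Example: fetch the third page of 10,000 records
SELECT record_id, title
FROM records
ORDER BY record_id
LIMIT 10000 OFFSET 20000;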
 
LVL 3

Expert Comment

by:sergiobg57
ID: 34984984
<meta name="revisit-after" content="2 days" />



It can be placed between the head tags of the document.

Now, if you are using a framework, it's likely there's an option that places this meta tag for you, and it would be better to use that option.
If you are, tell me which one it is and I can tell you how to do that.
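
In a plain PHP/HTML page it would sit roughly like this (the title and the rest of the head are just placeholders):

<head>
  <title>SearchGBY</title>
  <!-- Hint for well-behaved bots to come back after 2 days; most major crawlers treat revisit-after as a suggestion at best -->
  <meta name="revisit-after" content="2 days" />
</head>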
 
LVL 26

Author Comment

by:EddieShipman
ID: 34988976
This one file is straight PHP and HTML, no framework involved.

Thanks for the info.