Solved

Virtual Hosting and robots.txt

Posted on 2006-06-15
250 Views
Last Modified: 2010-03-04
I have 2 websites running on my web server.   I am running Linux and Apache.

My question is:  How do I get my robots.txt files working for both domains?

Do I just put a robots.txt in each website's root directory?  If so, I have done that, but my second website is still not coming up.  My first website is still showing content from 6 months ago when I search for it on Google, even though I have made major updates to the site since then.

But the main one I am concerned about is the second website.   In a week it will be fully functional, and I want it to come up on Google and other search engines.

Thanks...



Question by:strongd
6 Comments
 
LVL 9

Expert Comment

by:smidgie82
ID: 16913458
As for your initial question: yes, you need a robots.txt in the root of each website.  However, robots.txt will in no way help you get indexed faster.  Go to Google Sitemaps and register your sites, then download a utility to create sitemaps and submit them.  That WILL help you get indexed faster.
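To illustrate what "the root of each website" means with name-based virtual hosting (the domain names and paths below are placeholders, not from the asker's config): each VirtualHost has its own DocumentRoot, and robots.txt sits at the top of each one, so it gets served at http://www.site-one.example/robots.txt and so on.

```apache
# httpd.conf sketch -- domains and paths are examples only
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.site-one.example
    DocumentRoot /var/www/site-one
    # put this site's rules in /var/www/site-one/robots.txt
</VirtualHost>

<VirtualHost *:80>
    ServerName www.site-two.example
    DocumentRoot /var/www/site-two
    # put this site's rules in /var/www/site-two/robots.txt
</VirtualHost>
```

A robots.txt that allows all crawlers everywhere is just:

```
User-agent: *
Disallow:
```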
 

Author Comment

by:strongd
ID: 16915953
How does the spider know that there are 2 websites, or which directory to go to for each one?
 
LVL 9

Accepted Solution

by:
smidgie82 earned 125 total points
ID: 16916141
I think we have a difference of definition here.  By two different websites, I assume you mean two VirtualHosts running on the same machine.  In that case, in order to reach either website in the first place, the spider already needs to know the domain name of the site it wants.  It won't know or care that the two are on the same machine, and more than likely it won't even be the same spider traversing both sites.  By developing a Google sitemap for each site and submitting it to Google, you'll be giving Google's bots a starting address and a list of the files you consider important to your site.  That's how they'll find your site, and they'll handle the rest from there.

If the above does not describe your situation (i.e., your two websites share a common name), then please describe your configuration in more detail so I can give appropriate advice.
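For reference, a minimal sitemap file looks like the sketch below (the URL and dates are placeholders).  You would generate one per site, put it in that site's document root, and submit each one to Google Sitemaps separately:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.site-two.example/</loc>
    <lastmod>2006-06-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The sitemap-generator utilities mentioned above will crawl your DocumentRoot and produce a file in this format for you.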

 

Author Comment

by:strongd
ID: 16916299
Thanks a lot for your help.   Yes, I have 2 websites running on the same machine using virtual hosting.   So I guess I need to go to Google and do the sitemap thing.

Thanks again for your help.  I think you answered my question.

 
LVL 9

Expert Comment

by:smidgie82
ID: 16916301
My pleasure.  Good luck!
 

Author Comment

by:strongd
ID: 16916319
Thanks.  Hopefully I can get this thing going so I can get people to my second site.   The first site was more or less a play toy / learning thing...

Thanks again.

Question has a verified solution.
