Solved

Virtual Hosting and robots.txt

Posted on 2006-06-15
249 Views
Last Modified: 2010-03-04
I have 2 websites running on my web server.   I am running Linux and Apache.

My question is:  How do I get my robots.txt files working for both domains?

Do I just put a robots.txt in each of the website root directories?  If so, I have done that.  But my second website is still not coming up.  My first website is still showing old content from 6 months ago when I search for it in Google.  I have updated my site since then with major changes.

But the main one I am concerned about is the second website.  In a week it will be fully functional, and I want it to come up in Google and other search engines.

Thanks...



Question by:strongd
6 Comments
 
LVL 9

Expert Comment

by:smidgie82
ID: 16913458
As for your initial question, you need to have robots.txt in the root of each website.  However, robots.txt will in no way help you get indexed faster.  Go to Google Sitemaps and register your sites.  Then download a utility to create sitemaps and submit them.  That WILL help you get indexed faster.
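For reference, a bare-bones robots.txt that allows crawling of everything looks something like this (place one copy in the DocumentRoot of each site; the exact rules are up to you):

    # /robots.txt -- served from the root of each virtual host
    User-agent: *
    Disallow:

An empty Disallow value tells well-behaved crawlers they may fetch anything; to keep them out of a directory you would list it instead, e.g. Disallow: /private/.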
 

Author Comment

by:strongd
ID: 16915953
How does the spider know that there are 2 websites, or which directory to go to?
 
LVL 9

Accepted Solution

by:
smidgie82 earned 125 total points
ID: 16916141
I think we have a difference of definition here.  By two different websites, I assume you mean two VirtualHosts running on the same (or different) machines.  In that case, in order to get to either website in the first place, the spider already needs to know the domain name of the site it wants.  It won't know or care that the two are on the same machine, and more than likely it won't even be the same spider traversing both sites.  By developing a Google sitemap and submitting it to Google, you'll be giving Google's bots a starting address and a list of the files you consider important on your site.  That's how they'll find your site, and they'll handle the rest from there.

If the above does not describe your situation (i.e., your two websites share a common name), then please describe your configuration in more detail so I can give appropriate advice.
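To illustrate, here is a rough sketch of what the Apache side of that setup might look like, assuming name-based virtual hosting; the domain names and paths below are made up, so adjust them to your actual configuration:

    # httpd.conf -- two name-based virtual hosts on one machine
    NameVirtualHost *:80

    <VirtualHost *:80>
        ServerName   www.first-example.com
        DocumentRoot /var/www/first-example
        # this site's robots.txt and sitemap.xml live in /var/www/first-example/
    </VirtualHost>

    <VirtualHost *:80>
        ServerName   www.second-example.com
        DocumentRoot /var/www/second-example
        # this site's robots.txt and sitemap.xml live in /var/www/second-example/
    </VirtualHost>

Because the crawler requests each site by its domain name (via the Host header), Apache routes it to the matching DocumentRoot, so each site serves its own robots.txt even though both share one machine.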

Author Comment

by:strongd
ID: 16916299
Thanks a lot for your help.  Yes, I have 2 websites running on the same machine using virtual hosting.  So I guess I need to go to Google and do the sitemap thing.

Thanks again for your help.  I think you answered my question.

 
LVL 9

Expert Comment

by:smidgie82
ID: 16916301
My pleasure.  Good luck!
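
In case it helps once the second site goes live: a bare-bones sitemap.xml looks roughly like the following (the URL and date are placeholders; a sitemap generator will produce the same structure for every page it finds):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.second-example.com/</loc>
        <lastmod>2006-06-15</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>

Save it as sitemap.xml in that site's DocumentRoot and submit its URL through Google Sitemaps.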
 

Author Comment

by:strongd
ID: 16916319
Thanks, hopefully I can get this thing going so I can get people to my second site.  The first site was more or less a toy / learning thing...

Thanks again.
