I have a hosted site with two domain names pointing to it; the only difference between them is that one is a ".com" and the other a ".ie".
I'm trying to find a way to create a sitemap for each domain and have robots.txt point to the correct sitemap depending on which domain was used to access the site:
mysite.ie -> robots.txt -> sitemap: http://www.mysite.ie/iesitemap.xml
mysite.com -> robots.txt -> sitemap: http://www.mysite.com/comsitemap.xml
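Under that scheme each domain would serve its own static robots.txt, something like this (the sitemap filenames here are just the hypothetical ones from above):

```
# robots.txt served on www.mysite.ie
User-agent: *
Sitemap: http://www.mysite.ie/iesitemap.xml

# robots.txt served on www.mysite.com
User-agent: *
Sitemap: http://www.mysite.com/comsitemap.xml
```

Since both domains point at the same hosting space, I can't simply drop two different robots.txt files in the web root, which is what leads me to the dynamic approach below.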
My current thinking is something like creating the sitemap on the fly when robots.txt directs the crawler:
mysite.ie -> robots.txt -> sitemap: http://www.mysite.ie/sitemapbuilder.php
mysite.com -> robots.txt -> sitemap: http://www.mysite.com/sitemapbuilder.php
Here sitemapbuilder.php would build the sitemap on demand, inserting the appropriate domain suffix (".ie" or ".com"). Its output would be XML identical to that of a static sitemap.xml file.
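For what it's worth, here's a minimal sketch of what I have in mind for sitemapbuilder.php. It assumes the script can read the requested host from `$_SERVER['HTTP_HOST']` and that the URL paths come from a hard-coded array (in reality they'd come from a database or similar):

```php
<?php
// sitemapbuilder.php - sketch only. Emits a sitemap whose <loc> URLs
// use whichever host the crawler actually requested.
header('Content-Type: application/xml; charset=utf-8');

// HTTP_HOST is the domain the request came in on,
// e.g. "www.mysite.ie" or "www.mysite.com".
$host = $_SERVER['HTTP_HOST'];

// Hypothetical list of paths to include; replace with a real source.
$paths = array('/', '/about', '/contact');

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($paths as $path) {
    echo "  <url><loc>http://{$host}{$path}</loc></url>\n";
}
echo '</urlset>' . "\n";
```

The idea is that the same script, untouched, produces ".ie" URLs when fetched via the .ie domain and ".com" URLs when fetched via the .com one.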
Would that work? I honestly have no idea. And if it does work, a follow-up question: in Google Webmaster Tools, can I submit the sitemapbuilder.php URL directly, or does the sitemap strictly have to be a *.xml file?