SEO with a form 301 redirect or a subdomain

hillelben asked:
We have a store running on Volusion, a shopping cart system. The problem is that its search is not very good, so we want to build our own search page on a different server. The question is: what is the best way to do this, SEO-wise? E.g. make the search page server a subdomain, or set a 301 redirect for the form, etc.

Julian Matz, Technical Support (Top Expert 2005):

A subdomain sounds like a good idea. 301 redirects don't always work as you might expect with HTML forms.
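One way to sidestep the redirect issue entirely is to point the form's `action` directly at the external search server instead of relying on a 301. A minimal sketch (the subdomain name and field names here are hypothetical, not Volusion-specific):

```html
<!-- Assumes a custom search app is hosted at search.example.com. -->
<!-- Using method="get" keeps the query in the URL, so there is no
     POST body to lose across a redirect. -->
<form action="https://search.example.com/results" method="get">
  <input type="text" name="q" placeholder="Search products">
  <input type="submit" value="Search">
</form>
```

Because the form submits straight to the other server, no redirect is involved at all.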


People told me that Google doesn't count subdomains toward the main domain, and that it considers them separate domains. Is this true?

It's true. That's how a single site can end up with 5 listings for the phrase "wordpress blog".

You haven't got much choice with Volusion, as it is a pretty closed platform where you don't have enough control to set up a reverse proxy for a folder.

That being said, since these are search results, you don't necessarily need them indexed, and you could possibly use iframes, JavaScript, or maybe a dynamic Flash embed on the store pages themselves.

I would avoid putting the content on a subdomain, as Volusion stores already tend to have lots of indexation and duplicate-content problems. Most of that is fixable, but I haven't seen a store that did it effectively.


Google looks at iframes, Flash, and AJAX? Which is best to use? AJAX would be a problem, I guess, because it has to fetch from a different server, which I don't think browsers allow with AJAX.
You can block Google from accessing the content whichever way you implement it, whether that is denying all bots, 301 redirecting them somewhere, using URLs with "#" fragments they can't navigate, or using robots.txt.
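Of the options above, robots.txt is the simplest to illustrate. For example, if the external search results lived under a path like `/search/` (the path is hypothetical), the rule on that server could read:

```
User-agent: *
Disallow: /search/
```

This tells all compliant crawlers not to fetch anything under that path on that host.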

There are all kinds of "search" functions possible, with custom sliders for price-range selection, visual color-based search, etc. Which method you use doesn't really matter; it is down to your developers. But in general, search results are something that doesn't offer you any benefit by being indexed in Google.

P.S. Pages blocked with robots.txt can still accumulate PageRank, but if all you block is a single externally loaded JavaScript file, there shouldn't be a problem.


We want to build the search because the Volusion search isn't very accurate. We have 400,000 products, and the Volusion search combines terms with OR instead of AND, which usually doesn't bring up what you want. The thing is, now that we're making a separate search, how can we do it and still get indexed?
You are talking about two separate roles.

Your search engine, with improved accuracy and functionality, is for humans. It doesn't have to be indexed; in fact, having it indexed potentially adds duplicate-content complications, especially if you will have a diverse taxonomy.
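The difference between OR and AND matching that prompted this whole question can be sketched in a few lines. The product data and field name below are made up purely for illustration:

```python
# Sketch: AND-match every search term against a product's text,
# instead of OR behaviour (any single term matches).
products = [
    {"name": "red leather wallet"},
    {"name": "red cotton scarf"},
    {"name": "leather belt"},
]

def or_search(query, items):
    terms = query.lower().split()
    # Keep items containing ANY term: the imprecise behaviour complained about.
    return [p for p in items if any(t in p["name"].lower() for t in terms)]

def and_search(query, items):
    terms = query.lower().split()
    # Keep only items containing ALL terms: much closer to what users expect.
    return [p for p in items if all(t in p["name"].lower() for t in terms)]

print(len(or_search("red leather", products)))   # 3: every product matches some term
print(len(and_search("red leather", products)))  # 1: only the wallet matches both
```

At 400,000 products you would use a proper index rather than a linear scan, but the AND/OR distinction is the same.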

Indexation by Google is something totally different, and it can be aided by the following:

1. Eliminate duplicate content. One Volusion store I am looking at has what I would class as 400 unique pages of content, yet a crawler like Xenu Link Sleuth sees over 30,000. In your case, with 400,000 products, you might be giving Google a hernia. Google likes to claim it can handle much of this itself, but eliminating the guesswork often brings significant rewards in indexation and search-traffic volumes.

2. Flatten the internal linking structure. With 400,000 products I would consider 10 category-based sitemaps linked sitewide, then either 40 links on each pointing to pages of 100 links, or 100 links pointing to pages of 40, depending on how logical it is to do with your product categories.

3. Lots of link building. Indexing 400,000 pages takes a fair chunk of juice.

4. Lots of unique descriptions. If you are using manufacturer descriptions, there is a high chance Google will filter a lot of the pages unless they have some original content, or a lot more juice than other sites using the same content.

If the sitemaps need to be dynamic, I can certainly see a situation where they might be housed on a subdomain and generated by a dedicated application (and yes, indexed), but the primary purpose would still be to get the existing pages indexed.
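A tiered sitemap setup of the kind described in point 2 can be generated mechanically. A minimal sketch, where the URL pattern and chunk size are illustrative rather than Volusion-specific:

```python
# Sketch: split a large product URL list into child sitemaps and
# emit a sitemap index file that points at each child.
def build_sitemap_index(urls, per_sitemap=100, base="https://example.com"):
    children = []
    for i in range(0, len(urls), per_sitemap):
        chunk = urls[i:i + per_sitemap]
        body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
        children.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>"
        )
    index_entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>"
        for n in range(len(children))
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n</sitemapindex>"
    )
    return index, children

urls = [f"https://example.com/product/{i}" for i in range(250)]
index, children = build_sitemap_index(urls)
print(len(children))  # 3 child sitemaps of up to 100 URLs each
```

In production you would raise the chunk size (the sitemap protocol allows up to 50,000 URLs per file) and regenerate the files as the catalog changes.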


Thanks for the advice.
