Improving SEO for similar sites

I have 4 sites with near-identical content; however, each site serves a different purpose/audience. I would like to make sure I am not being penalised for duplicate content, and to optimise each site for its specific purpose.

The sites are:

www.111translations.com - the "live testing site": after internal testing, updates are deployed and tested here online. It is also used for suppliers and recruiting, so SEO matters least here.

www.araxi.fr - for our French clients, who need to be pointed to the French site (.fr).

www.araxi.ca - for our North American clients, who need to be pointed to the Canadian site (.ca).

www.araxi.co.uk - for our UK clients, who need to be pointed to the UK site (.co.uk).

We are not a huge organisation but we do need to target these specific markets.

Any advice on how to avoid penalties while maximising exposure, especially Google SEO in the respective countries?
Asked by Shawn.

freshcontent commented:
My understanding is that if these are localized sites, they will NOT be penalized by Google/Bing for identical content. Having the different country-code top-level domains (ccTLDs) helps a lot.
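As an aside the thread itself doesn't cover: geographic targeting can also be declared explicitly with rel="alternate" hreflang annotations in each page's head. The domains below are the asker's; the exact language/region codes and paths are assumptions for illustration, not the sites' actual markup.

```html
<!-- Hypothetical hreflang links for the three client-facing home pages;
     the full set would go in the <head> of each of the three pages. -->
<link rel="alternate" hreflang="fr-FR" href="http://www.araxi.fr/" />
<link rel="alternate" hreflang="en-CA" href="http://www.araxi.ca/" />
<link rel="alternate" hreflang="en-GB" href="http://www.araxi.co.uk/" />
```

Each page points at all the alternates (including itself), which tells the engines the sites are deliberate regional variants rather than copied content.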


Are you translating the content for the French version of the site?

I would suggest registering them in the Google & Bing webmaster tools so that the search engines are aware that the same person runs all four sites.

www.bing.com/webmasters

www.google.com/webmasters

One caveat: noindex does not go in robots.txt. For your "live testing site", I would either add a "Disallow: /" rule to its robots.txt or put a noindex robots meta tag on each page; that way you won't have to worry about Google/Bing having issues with it.
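To be precise, these are two different mechanisms (and noindex cannot live in robots.txt): a Disallow rule stops crawling, while a robots meta tag asks for the page to be dropped from the index. For a testing site that should stay out of search entirely, either of these sketches would do:

```
# robots.txt at the root of www.111translations.com — blocks all crawling
User-agent: *
Disallow: /
```

```html
<!-- alternatively, in the <head> of every page on the testing site -->
<meta name="robots" content="noindex, nofollow" />
```

Note that a page blocked by Disallow can never be crawled, so the engines would not see a meta tag on it; pick one mechanism rather than combining them.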

 
kuzmanovicb commented:
Make sure to point out the different purpose/audience in the title, H1, meta description and content.
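For illustration, a hypothetical head and main heading for the UK site; the titles and descriptions are invented placeholders, not the real site copy.

```html
<!-- Hypothetical markup for www.araxi.co.uk -->
<head>
  <title>Translation Services for UK Businesses | Araxi</title>
  <meta name="description"
        content="Professional English and French translation services for UK companies." />
</head>

<!-- and, in the body, a market-specific main heading -->
<h1>Translation Services for UK Businesses</h1>
```

The French and Canadian sites would get their own title/description/H1 variants, so each page differs in exactly the elements the engines weigh most.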
 
GuitarFingers commented:
As mentioned above, I would suggest changing those attributes. I would also make changes in the content itself to reflect the audience you are serving. For example, on the French site, angle the content more towards "French-language translations" alongside the general business-translation copy. That makes your content more focused on the audience you are serving, and less generic overall.
Shawn (author) commented:
freshcontent
>>Are you translating the content for the French version of the site?

All of the sites are in both English and French. The default home pages are in English, except for the French site, which defaults to French.

I have registered them in Google but not Bing yet. The noindex is a good idea for the one domain, thanks.

kuzmanovicb:
Agreed. I need to go over titles, H1s, etc.

GuitarFingers:
Personalized content is in the works.


I understand that adapting as much as possible will help get me noticed. My main concern in this question is being penalized for identical content; if having ccTLDs avoids that, then I'm more than relieved.
 
DotNetChano commented:
http://www.copyscape.com/ is a great site to check for duplicate content, but it will only check the URL you give it against what is already indexed in the search engines.

Similar to copyscape is http://www.plagium.com/

http://www.dupecop.com/compare-spun-articles.php will compare blocks of text (i.e. two different blocks of text from your sites that have not been indexed yet).

Good luck!
 
freshcontent commented:
I haven't actually implemented it myself, but from what I've read from Matt Cutts and the Google webmaster videos/forums, you will not be penalized for duplicate content on different ccTLDs, especially if you translate where appropriate.

 
Shawn (author) commented:
DotNetChano:
Thanks for the link, but I already know I have duplicate content.

freshcontent:
This is the conclusion I am veering towards. I'll leave the question open for a little while, just in case someone has a different point of view.
 
DotNetChano commented:
Google's main purpose in penalizing duplicate content is not to hurt people who repeat their own content; it is to keep people from stealing the original author's content. Unfortunately, it is difficult for Google to tell the difference. If I were you, I would try hard to rewrite the content so it says the same thing while keeping the important keywords.

Could you not have all of the domains point to the .com site and use your code-behind to serve the appropriate content based on which TLD the user came from?
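For what it's worth (the asker notes below this isn't an option for them), the single-domain idea could be sketched like this: pick the default content language from the host the request came in on. The hostnames are the sites from the thread; the routing logic itself is an illustration, not the site's actual code-behind.

```python
# Hypothetical sketch of host-based content routing for a single codebase
# serving multiple ccTLDs. Language defaults follow the thread: all sites
# are bilingual with an English default, except the .fr site.

DEFAULT_LANGUAGE = "en"

LANGUAGE_BY_HOST = {
    "www.111translations.com": "en",
    "www.araxi.fr": "fr",     # French clients default to French
    "www.araxi.ca": "en",     # bilingual, English default
    "www.araxi.co.uk": "en",
}

def language_for(host: str) -> str:
    """Return the default content language for a request host."""
    return LANGUAGE_BY_HOST.get(host.lower(), DEFAULT_LANGUAGE)
```

A real implementation would read the host from the incoming request and pick templates/copy accordingly; unknown hosts fall back to English here.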
 
Shawn (author) commented:
DotNetChano:
>> Google's main purpose in penalizing duplicate content
Are you saying I am, or will be, penalised for my duplicate content? This goes against what freshcontent has said.

>>Could you not have all of the domains point to the .com and use your codebehind to translate the content based on which tld the user came from?
Not an option. It used to be like this, but we split it all up for various reasons.
 
freshcontent commented:
Here is an SEOmoz article (from 2009) indicating that you would not be penalized for duplicate content.

SEOmoz and Rand Fishkin are very reputable sources of SEO information.

http://www.seomoz.org/blog/new-info-from-google-and-yahoo-tilts-the-geotargeting-balance 

 
freshcontent commented:
One more interesting article (from 2011) on this subject:

http://seminsights.com/opinions/duplicate-content-international-seo 
 
Shawn (author) commented:
Very nice articles. I now feel reassured I won't be penalized. Thanks! :-D
 
Shawn (author) commented:
Thanks, everyone! More questions to come :-)
Question has a verified solution.
