Solved

sitemaps

Posted on 2013-06-18
262 Views
Last Modified: 2013-11-19
I have a website with about 50,000 pages. Each is in the process of being fully optimized.
I am researching the best way to produce a good sitemap and keep it updated.
attracta.com is one of the biggest in the field and seems to offer a good selection of packages.
Any thoughts and/or suggestions, please?
Also, how important is it to have an HTML sitemap as well as an XML one?
Question by:digisel
3 Comments
 
LVL 58

Accepted Solution

by:
Gary earned 500 total points
ID: 39257065
There are a number of free tools online/downloadable to produce a sitemap - search Google
There is a three-click rule:
http://en.wikipedia.org/wiki/Three-click_rule
Your users should be able to get to any content quickly and easily, i.e. they shouldn't have to click through ten pages to reach the one they want.
HTML sitemaps are supposed to make this easy because the visitor can instantly see all areas of your site and reach any of them in one click.
It also helps search engines map your site better, regardless of the sitemap.xml, and it helps SEO if pages are accessible no more than three levels deep.
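
To make the HTML sitemap idea concrete, here is a minimal sketch (purely illustrative, not any particular tool): it renders a single HTML page listing each area of the site with one-click links to the pages under it. The section names, paths and base domain are hypothetical stand-ins for whatever your CMS actually holds.

# Minimal sketch: render a one-page HTML sitemap from a dict of
# site sections -> page paths.  Sections, paths and the base domain
# below are hypothetical; in practice they would come from your CMS.
from html import escape

sections = {
    "Products": ["/products/widgets", "/products/gadgets"],
    "Support":  ["/support/faq", "/support/contact"],
    "Blog":     ["/blog/latest"],
}

def build_html_sitemap(sections, base="http://www.example.com"):
    parts = ["<ul>"]
    for name, paths in sections.items():
        parts.append(f"  <li>{escape(name)}<ul>")
        for path in paths:
            url = base + path
            parts.append(f'    <li><a href="{escape(url)}">{escape(path)}</a></li>')
        parts.append("  </ul></li>")
    parts.append("</ul>")
    return "\n".join(parts)

with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write(build_html_sitemap(sections))

Every page listed is one click away from the sitemap page itself, which keeps you comfortably inside the three-click guideline.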

As for attracta.com - if the field you are in is highly competitive then it won't matter what this company does. It's easy enough to get a local company listed locally, as competition is likely not high; if you are targeting globally then you may be throwing money down the drain expecting another company to get you onto the first page.
There is nothing they can do that you cannot do yourself by having a good, well organised site with lots of content and good on-page SEO practices.
 
LVL 4

Expert Comment

by:nfaria
ID: 39257089
Are you sure all of your 50,000 pages are crawlable?

In my opinion, building a sitemap shouldn't rely on crawlability, since the sitemap itself is how the bots will get to your content.

If you have the technical skills, try to develop or install software that dumps all your page links directly from your CMS database, if possible with rules for automatic categorization and prioritization based on page URL patterns. Does attracta.com ease that process?
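
Not what attracta.com does, just a minimal sketch of the "dump straight from the database" approach, assuming a SQLite CMS database with a pages table holding URL paths (the database file, table, column, priority rules and domain are all hypothetical). It writes standard sitemap files of up to 50,000 URLs each plus an index file:

import sqlite3
from xml.sax.saxutils import escape

BASE = "http://www.example.com"   # assumed site root
PER_FILE = 50000                  # sitemap protocol limit per file

def priority_for(path):
    # Example prioritization by URL pattern - tune to your own site.
    if path == "/":
        return "1.0"
    if path.startswith("/products/"):
        return "0.8"
    if path.startswith("/blog/"):
        return "0.5"
    return "0.3"

def fetch_paths(db_file="cms.db"):
    # Assumes a 'pages' table with a 'path' column of URL paths.
    con = sqlite3.connect(db_file)
    try:
        return [row[0] for row in con.execute("SELECT path FROM pages")]
    finally:
        con.close()

def write_sitemaps(paths):
    names = []
    for i in range(0, len(paths), PER_FILE):
        name = f"sitemap-{i // PER_FILE + 1}.xml"
        names.append(name)
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for path in paths[i:i + PER_FILE]:
                f.write("  <url>\n")
                f.write(f"    <loc>{escape(BASE + path)}</loc>\n")
                f.write(f"    <priority>{priority_for(path)}</priority>\n")
                f.write("  </url>\n")
            f.write("</urlset>\n")
    # Index file pointing search engines at the individual sitemap files.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in names:
            f.write(f"  <sitemap><loc>{escape(BASE + '/' + name)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

write_sitemaps(fetch_paths())

Because it reads the database rather than crawling, pages that are hard to reach by links still end up in the sitemap, which is the point of not relying on crawlability.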

I only have XML files and I don't see the point of the redundancy.
 

Author Closing Comment

by:digisel
ID: 39257147
Hi Gary
Thanks. As usual a pertinent, concise and authoritative reply.
Regards
