Solved

sitemaps

Posted on 2013-06-18
258 Views
Last Modified: 2013-11-19
I have a website with about 50,000 pages. Each is in the process of being fully optimized.
I am researching the best way to produce a good sitemap and keep it up to date.
attracta.com is one of the biggest in the field and seems to offer a good selection of packages.
Any thoughts and/or suggestions, please?
Also, how important is it to have an HTML sitemap as well as an XML one?
Question by: digisel
3 Comments
 
LVL 58

Accepted Solution

by: Gary (earned 500 total points)
ID: 39257065
There are a number of free tools, online and downloadable, to produce a sitemap - search Google.
There is a three-click rule:
http://en.wikipedia.org/wiki/Three-click_rule
Your users should be able to get to any content quickly and easily, i.e. they shouldn't have to click through 10 pages to get to the page they want.
HTML sitemaps are supposed to make this easy because the visitor can instantly see all areas of your site and reach any of them in one click.
They also help search engines map your site better, regardless of the sitemap.xml, and it helps SEO if pages are accessible no more than 3 levels deep.
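
To illustrate, here is a minimal sketch of generating such a one-page HTML sitemap in Python. The (section, title, URL) list is a hypothetical stand-in for data you would pull from your own CMS:

# Minimal sketch: render a one-page HTML sitemap grouped by site section.
# The `pages` list is a hypothetical stand-in for data pulled from a CMS.
from collections import defaultdict
from html import escape

pages = [
    ("Products", "Widgets", "/products/widgets/"),
    ("Products", "Gadgets", "/products/gadgets/"),
    ("Support", "Contact us", "/support/contact/"),
]

# Group pages by section so every area of the site is visible at a glance.
sections = defaultdict(list)
for section, title, url in pages:
    sections[section].append((title, url))

# Emit a heading per section and a flat list of links under it,
# so any page is reachable in one click from this page.
parts = ["<h1>Sitemap</h1>"]
for section, links in sorted(sections.items()):
    parts.append(f"<h2>{escape(section)}</h2>")
    parts.append("<ul>")
    for title, url in sorted(links):
        parts.append(f'  <li><a href="{escape(url)}">{escape(title)}</a></li>')
    parts.append("</ul>")

with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write("\n".join(parts))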

As for attracta.com - if the field you are in is highly competitive, then it won't matter what this company does. It's easy enough to get a local company listed locally, as competition is likely not high, but if you are targeting globally then you may be throwing money down the drain expecting another company to get you onto the first page.
There is nothing they can do that you cannot do yourself by having a good, well-organised site with lots of content and good on-page SEO practices.
 
LVL 4

Expert Comment

by: nfaria
ID: 39257089
Are you sure all of your 50,000 pages are crawlable?

In my opinion, building a sitemap shouldn't rely on crawlability, since the sitemap is the way the bots will find your content in the first place.

If you have the technical skills, try to develop or install a tool that dumps all your page links directly from your CMS database - if possible, with rules for automatic categorization and prioritization by page URL patterns. Does attracta.com ease that process?
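
For anyone with the skills to go that route, here is a rough sketch of the idea in Python. It assumes a hypothetical SQLite-backed CMS with a pages(url, last_modified) table; the pattern rules, priorities and the example.com domain are placeholders, not recommendations:

# Sketch: dump page URLs straight from the CMS database into sitemap.xml,
# assigning <priority> and <changefreq> by URL pattern.
# Table/column names (pages, url, last_modified) are assumptions.
import re
import sqlite3
from xml.sax.saxutils import escape

# Pattern rules checked in order; first match wins.
RULES = [
    (re.compile(r"^/$"),         ("1.0", "daily")),
    (re.compile(r"^/products/"), ("0.8", "weekly")),
    (re.compile(r"^/blog/"),     ("0.6", "weekly")),
]
DEFAULT = ("0.5", "monthly")

def classify(path):
    for pattern, meta in RULES:
        if pattern.search(path):
            return meta
    return DEFAULT

conn = sqlite3.connect("cms.db")  # hypothetical CMS database
rows = conn.execute("SELECT url, last_modified FROM pages")

entries = []
for url, lastmod in rows:
    priority, changefreq = classify(url)
    entries.append(
        "  <url>\n"
        f"    <loc>https://www.example.com{escape(url)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        f"    <priority>{priority}</priority>\n"
        "  </url>"
    )

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    f.write("\n".join(entries))
    f.write("\n</urlset>\n")

Note that the sitemap protocol caps a single file at 50,000 URLs, so a 50,000-page site is right at the limit; the usual answer is to split the URLs across several files tied together by a sitemap index.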

I only have XML files and I don't see a point in the redundancy.
 

Author Closing Comment

by: digisel
ID: 39257147
Hi Gary,
Thanks. As usual, a pertinent, concise and authoritative reply.
Regards
