Solved

A tale of two websites - part two

Posted on 2015-01-03
137 Views
Last Modified: 2015-01-04
I received some great help from Jason Levine and IT Saige yesterday and would like a little more info.

I started to implement their points at 10.30am GMT and will need to do a little more reading up on SEO.

Firstly, I am going through Google Webmaster Tools and checking the site.

Secondly, is there any advantage in updating several times a day versus daily versus weekly versus monthly? Is there an optimum updating frequency? And how would a one-word change compare to a whole page of new content?

And finally, another probably unanswerable question: I assume Google, MSN and Yahoo apply the most weight to the volume of visits, which I cannot do much about, but how much weight do you think they apply to the things within my control?
Question by:bill2013
6 Comments
 

Author Comment

by:bill2013
ID: 40529097
I've checked the home page on the W3C validator and it has 181 errors and 1 warning. My friend's home page (which gets great rankings) has 600+ errors and 7 warnings.

The majority of the errors are "an attribute value specification must be an attribute value literal unless SHORTTAG YES is specified."

This accounts for 85% of the errors, but I understand this is something the validator picks up that does not require attention.
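
If I've understood the message correctly, it is triggered by attribute values that aren't wrapped in quotes. A made-up illustration (not a line from my actual page):

    <td align=center width=100>            <!-- unquoted values: flagged by the validator -->
    <td align="center" width="100">        <!-- quoted values: accepted -->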

I also have several "element "o:p" undefined" errors, and several "end tag for "meta" omitted, but OMITTAG NO was specified" errors - is this just a case of adding / to <meta http-equiv=Content-Type content="text/html; charset=windows-1252">, i.e. ending it with charset=windows-1252"/>?
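
In other words, I am assuming the corrected tag would need quoted attribute values as well as the closing slash, something like:

    <meta http-equiv="Content-Type" content="text/html; charset=windows-1252" />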

Others are unrecognized attributes, e.g. "there is no attribute "name"" for name= "place/". Should I delete these lines?

The other message I get is:

Notes and Potential Issues

The following notes and warnings highlight missing or conflicting information which caused the validator to perform some guesswork prior to validation, or other things affecting the output below. If the guess or fallback is incorrect, it could make validation results entirely incoherent. It is highly recommended to check these potential issues, and, if necessary, fix them and re-validate the document.

    Warning No DOCTYPE found! Checking with default XHTML 1.0 Transitional Document Type.

    No DOCTYPE Declaration could be found or recognized in this document. This generally means that the document is not declaring its Document Type at the top. It can also mean that the DOCTYPE declaration contains a spelling error, or that it is not using the correct syntax.

    The document was checked using a default "fallback" Document Type Definition that closely resembles “XHTML 1.0 Transitional”.

    Learn how to add a doctype to your document from our FAQ.
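
So, if I have read that correctly, I just need to add a doctype as the very first line of the page. Assuming I stick with the XHTML 1.0 Transitional rules the validator fell back to (rather than, say, moving to HTML5), the declaration would be:

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">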
 
LVL 53

Accepted Solution

by:COBOLdinosaur (earned 500 total points)
ID: 40529285
The crawlers used by Google use parsing that is very similar to what the validator uses.  Warnings are fine, and many "errors" are really just informational.  However, ignoring validation errors is like skating on a pond with a thin-ice warning: you might never have a problem, but if the crawler hits something it can't process it may stop indexing the page.  Even without a serious problem, the errors will be used with a small weighting to produce the value for the quality index, which has been gaining weight with Google for the past 2 years.

The number one weighting factor is content. Unique, quality content will win out over other factors. However, you need to avoid negatives like a high bounce rate being applied to the content, because Google is placing more and more weight on relevance as measured by user response.  That also helps to weed out those trying to game Google by stuffing in popular search terms where they do not belong.

The number of visits to your site has no real bearing on the page ranking, except as it produces input to the bounce rate.  If you have 1,000 visitors and 500 of them are engaged and stay on your site for several minutes, that is better for your site than 10,000 visitors where only 1,000 stay.  The first has a 50% bounce rate (GOOD) and the second has a 90% bounce rate (VERY BAD).  If you are considering trying to buy traffic or use some random traffic generation with something like StumbleUpon, don't do it: it does exactly zero for your SEO and just wastes your bandwidth.

As for updating frequency: you update when you have something new.

Finally, do not expect to see an immediate result from your efforts. For new sites it can take more than 6 months for Google to include them in results, and existing sites that are already indexed will see increases from SEO efforts only gradually, except when Google does the dance (re-indexes), which happens 3-6 times a year and makes us dance figuring out what changes they have made to the rules.

Cd&
 

Author Comment

by:bill2013
ID: 40529318
Thanks Cd, it was good to read that good content beats visit volume, as I can deal with the former and not do much about the latter.

I guess if someone is on the site for less than a minute it's a bounce, so let's try to come up with some unique content relevant to the headings/keywords, so that once visitors are on they stay for a while.

I notice that if I change content, Google appears to update it quite quickly. From what you are saying, they don't do any full reindexing at this stage, but sometime in the future they reindex everything and not only move the goalposts but blindfold all the players?

As you suggest I should try to get the validation errors down to low single figures.

Any suggestions for some reading on how to correct these errors? As I mentioned, the majority are the same 3 errors.
 
LVL 53

Expert Comment

by:COBOLdinosaur
ID: 40529323
Just follow the standards as published by W3C even if they are obtuse at times.

Cd&
 
LVL 82

Expert Comment

by:Dave Baldwin
ID: 40529422
Just being around a long time helps.  My home page, though very simple, has been on the web for over 10 years and has no errors.  I always try to minimize the errors on my clients' pages.  It's just one less set of problems to be concerned about.
 
LVL 53

Expert Comment

by:COBOLdinosaur
ID: 40530319
>>>Just being around a long time helps.

Yes old is good. :^)

Cd&
