Cloaking

Cloaking is a search engine optimization (SEO) technique in which the content presented to the search engine spider is different from what is presented to the user's browser. This is done by delivering content based on the IP address or the User-Agent HTTP header of the user requesting the page.
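To make that mechanism concrete, here is a minimal sketch of User-Agent-based cloaking, written in TypeScript with Express (my own choice of stack; the route and markup are hypothetical, not from the thread):

```typescript
// Minimal sketch of User-Agent-based cloaking (TypeScript + Express).
// Hypothetical route and markup; real cloakers often also check known
// crawler IP ranges, since User-Agent strings are easy to spoof.
import express, { Request, Response } from "express";

const app = express();

// Crude check: does the User-Agent header look like a known crawler?
function isCrawler(userAgent: string | undefined): boolean {
  return !!userAgent && /googlebot|bingbot|duckduckbot/i.test(userAgent);
}

app.get("/article", (req: Request, res: Response) => {
  if (isCrawler(req.header("user-agent"))) {
    // Version served to search engine spiders: full, indexable text.
    res.send("<h1>Full keyword-rich article text...</h1>");
  } else {
    // Version served to human visitors: e.g. a paywall prompt.
    res.send("<h1>Subscribe to read this article</h1>");
  }
});

app.listen(3000);
```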


Matt Cutts of Google states that websites that serve two different versions of their pages (one for Google to see) are penalized in the search results.


forbes.com does not show Googlebot a paywall or ask Googlebot to sign up for the website,

so forbes.com is cloaking.

Is this term still used, or does it only apply to Flash-era websites (with no readable text), where a totally different text page was served for bots to see?
rgb192 asked:

Dave Baldwin (Fixer of Problems) commented:
"so forbes.com is cloaking"
I don't think so.  It's only considered cloaking if you are trying to fool the search engine.  If you look at http://www.forbes.com/robots.txt you will see that Forbes has blocked the search engines from crawling much of their website.
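For reference, robots.txt blocking is a different mechanism from cloaking: the crawler is told not to fetch certain paths at all, rather than being fed different content. An illustrative file (with hypothetical paths, not Forbes' actual rules) might read:

```
# Hypothetical robots.txt (not Forbes' actual rules)
User-agent: *
Disallow: /account/
Disallow: /admin/
Disallow: /search
```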
Aaron Tomosky (Director of Solutions Consulting) commented:
Another form of cloaking is serving HTML for Googlebot but using JavaScript to replace or hide div sections, as in the sketch below.
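A minimal sketch of that client-side trick, in TypeScript for the browser (illustrative only; the "seo-only" class and "#main" id are invented, and this is exactly the deceptive pattern search engines penalize):

```typescript
// The crawler indexes the keyword-stuffed HTML, while a human's
// browser runs this script and never sees it.
function looksLikeBot(): boolean {
  return /googlebot|bingbot/i.test(navigator.userAgent);
}

document.addEventListener("DOMContentLoaded", () => {
  if (looksLikeBot()) return; // leave the bot-facing markup in place

  // Hide the sections that exist only for the crawler...
  document.querySelectorAll<HTMLElement>("div.seo-only").forEach((el) => {
    el.style.display = "none";
  });
  // ...and swap in the content meant for human visitors.
  const main = document.querySelector<HTMLElement>("#main");
  if (main) main.innerHTML = "<p>Visitor-facing content</p>";
});
```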
Jackie Man (IT Manager) commented:
http://www.freemarketingzone.com/search-engine-optimization/cloaking.html

Cloaking is used as follows.

Flash Technology Based Websites
There are lots of websites that are built in Flash only. As you may know, Flash is not recommended for SEO purposes. However, companies don't want to rebuild their websites and rewrite everything in plain HTML. So they create content-rich web pages and serve them to search engines, and Flash pages to visitors. (Note that Google has already started indexing plain Flash sites, but many other search engines still don't.)

Image Gallery Websites
Image gallery websites obviously have more images than actual text content on their pages. So webmasters think that cloaking could help them get top placement for keywords like "image gallery".

HTML Rich Websites
Search engine optimization experts suggest having less HTML markup and more real text on your pages. Sometimes, markup makes up 95% or 98% of a whole web page. To avoid redesigning every web page, webmasters discovered a much easier solution: cloaking.

Ray Paseur commented:
I think you've got pretty good answers here, but I'll add a bit of a question of my own to the mix.  Consider pages that are Single-Page Applications (SPAs), such as those built with AngularJS.  Examples abound: Twitter, Facebook, eBay, parts of Amazon, many news sites, etc.  The goal of this design is a website that more closely mimics a native phone app (knowing that there is more mobile traffic on the internet every day).  These sites typically do not load their content until you scroll it into the viewport, as in the sketch below.
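A minimal sketch of that viewport-deferred loading pattern, in TypeScript (the ".lazy-section" selector and data-src attribute are placeholders of mine, not anything AngularJS-specific):

```typescript
// Each placeholder is filled in only when it scrolls into the viewport.
const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const el = entry.target as HTMLElement;
    const url = el.dataset.src; // endpoint that returns this section's HTML
    if (url) {
      fetch(url)
        .then((resp) => resp.text())
        .then((html) => { el.innerHTML = html; });
    }
    observer.unobserve(el); // each section loads once
  }
});

document.querySelectorAll<HTMLElement>(".lazy-section")
  .forEach((el) => observer.observe(el));
```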

The AngularJS framework is a Google invention, and I doubt they would penalize sites for using it, right?  I think the answer is that they would not penalize sites for using AngularJS, but it's worth considering the question.

There are thoroughly legitimate reasons to load an HTML document that is all or partly stubs, then flesh out the visual material with JavaScript.  It frustrates screen-scraper applications, which is something you want with CAPTCHA (see using AJAX to hide the CAPTCHA string).  It's also something you might want to do if part of your content is licensed, and it's a technique that is useful for certain security issues.

I normally browse with cookies and local storage turned off, selectively turning them on for sites I choose.  An interesting effect I've seen recently is a blank screen from CNN.  They have since walked that "feature" back.  And you may have read of the recent video "click fraud" cases, where simulated browsers were inflating view counts on YouTube (a Google property).  Clearly, browser behavior is an important issue.  For Google to have business integrity, it needs to know that its bots are seeing the same things that humans are seeing.

I understand Matt Cutts's position.  He was especially concerned about sites that showed Google something materially different from what was presented to the client via the browser.  And much of Google's popularity algorithm (in the early 2000s) was based on links: how many other sites linked to your site*.  But I think there are some soft areas today, and the Google penalty is largely understood by webmasters, who now know well enough to follow the guidelines.  Cloaking of the deceptive kind described in the old FreeMarketingZone article is mostly an ancient artifact.  Nobody would do that anymore.

* Today, Google has the money and resources to run bots that precisely mimic a client browser, and the machine-learning technology to "see" what you see.  But in those early days, Google was a screen scraper, reading the HTML and building its indexes from clear text.  I exploited this by creating common page footers across all the sites I supported.  The page footers were essentially invisible to the human visitor, but the screen scraper saw HTML loaded with keywords and links that looked just like all the other HTML on the page.  To say this was successful would be to wildly understate it.  At one time, I had the top search result for "National," and the client was a private elementary school.  That's what Matt Cutts was responding to: a deliberate attempt to manipulate Google's results.

Brandon Lyon (Senior Developer) commented:
There is a policy called "First Click Free," under which Google allows you to show a paywall after a certain number of free visits from search results.
rgb192 (Author) commented:
Thanks. The point about browsing habits was the most helpful.