allow 8 pages and deny the rest

1 - (optional and often overlooked) one pointer to your sitemap (or to a single sitemap index if you want to point to several sitemap files).
The sitemap is just a set of suggestions to help the spiders crawl your site and to speed their discovery of it. It brings NO guarantee that the pages you list will eventually be indexed, or even that they will be spidered in the next hours. And of course the sitemap is not a restrictive list of what should be indexed, nor a guarantee that nothing else will be.
So, what's wrong with this pointer in robots.txt not being used in the next 24h?
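For reference, the pointer being discussed is a single `Sitemap:` line in robots.txt; the URL below is a placeholder, assuming the sitemap lives at the site root:

```text
# Point crawlers at the sitemap (or at a sitemap index that lists several sitemap files)
Sitemap: https://www.example.com/sitemap.xml
```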

Maybe my robots.txt can just say "look at sitemap.xml" and nothing else.

I only want 8 pages in Google; the rest of the pages on the website I would like hidden.

allow: page1
allow: page2
allow: page3

deny the rest
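A minimal sketch of what is being described, using real robots.txt syntax (`Allow`/`Disallow`, not `allow:`/`deny`) and placeholder paths. Note that this only asks well-behaved crawlers not to fetch the other pages; it does not hide them:

```text
User-agent: *
# Placeholder paths for the 8 pages to keep crawlable
Allow: /page1.html
Allow: /page2.html
Allow: /page3.html
# ...remaining allowed pages...
# Deny everything else (Googlebot applies the most specific matching rule,
# so the Allow lines above take precedence over this blanket Disallow)
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```

`Allow` is a widely supported extension (now part of RFC 9309) but not every crawler honors it, and a disallowed page can still appear in search results if other sites link to it.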
Dave Baldwin (Fixer of Problems) commented:
That won't happen.  Google also picks up links to your content pages on other sites.  And as long as those links are up, it is nearly impossible to get them taken down.

Also note that only the 'good' robots pay attention to 'robots.txt'.  The others don't even read it.  You cannot use 'robots.txt' or your sitemap to actually hide anything.  Bots from spammers and hackers read everything that has a link, in their search for compromising information.
Ray Paseur commented:
Bots from spammers and hackers read everything...
As does the NSA.

If you put anything online you are saying that you want to give it away freely, that you expect it to be taken, copied, altered, repurposed, republished, mocked, stolen and sold.  If you do not want these things to happen, do not put it online.  It's really that simple.
rgb192 (Author) commented:
Question has a verified solution.
