We run an inventory listing service with 2.1 million unique items. The script we originally had, which had worked fine for a long time, queried the database and built a table with "pages" of the data, 10,000 records each.
Something changed in the last couple of months: bots began hitting that script far more often, and the database requests started bringing our server to its knees.
We finally had to disable the script and find another way. We now generate daily static HTML pages containing the same data the old script produced, and these are stored in a single directory that I point the bots to via robots.txt.
The static pages list one column from each product in our inventory listing service, approx 2.1 million items.
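For anyone curious what the daily job looks like, here is a minimal sketch of the pagination step. The function name, file naming (`page1.html`, `page2.html`, ...), and output directory are all my own placeholders, not our actual production code; the 10,000-records-per-page split matches what the old script did.

```python
import math
from pathlib import Path

PAGE_SIZE = 10_000  # records per static page, same as the old dynamic script


def write_static_pages(items, out_dir):
    """Split `items` into fixed-size pages and write one HTML file per page.

    `items` is an iterable of strings (the single column we list per product).
    Returns the number of pages written.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    items = list(items)
    pages = math.ceil(len(items) / PAGE_SIZE) if items else 0
    for n in range(pages):
        chunk = items[n * PAGE_SIZE:(n + 1) * PAGE_SIZE]
        rows = "\n".join(f"<tr><td>{name}</td></tr>" for name in chunk)
        html = f"<html><body><table>\n{rows}\n</table></body></html>"
        (out / f"page{n + 1}.html").write_text(html, encoding="utf-8")
    return pages
```

Run nightly from cron, this touches the database once a day instead of on every bot request.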
I do not want the bots to access anything else on our site because of the database interaction.
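To show what I mean, here is roughly the robots.txt approach, assuming a hypothetical `/botpages/` directory for the static files (not our real path). Note that `Allow` is not in the original robots.txt spec but is honored by the major crawlers:

```
# Block crawlers from everything except the static listing directory
User-agent: *
Disallow: /
Allow: /botpages/
```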
Human users can't see the "pages" we created in the bots directory, but the same data is viewable by human users in other areas of the site.
The script we formerly used did exactly the same thing but was dynamic in nature, and that dynamic behavior is what made it unusable because of the load it put on the DB.
A human user simply cannot see the data presented in that form.
So: could that be deemed cloaking?
If you want to check it out...
The site is http://www.listinventory.com
The first static page would be :