A different client (not the one I mentioned here earlier) suddenly had his site stop being spidered/cached by Google (and apparently by Yahoo as well).
Here's a snippet from his page HTML source: <META NAME="robots" CONTENT="index,follow">
He ran a validator test and got this result for the robots.txt file:
Syntax check robots.txt on robots.txt (30 bytes)

Line  Severity  Code
4     warning   An empty user agent field was detected. Each User-Agent record should have atleast one disallow line per record. This error may have also been generated due to bad line enders.
4     ERROR     There should be atleast 1 disallow line in any Robots.txt.

We're sorry, this robots.txt does NOT validate.
Warnings Detected: 1
Errors Detected: 1

robots.txt source code for robots.txt:
1 # Robots.txt
---------------------------- end of text result ---
Is the above file so flawed that it could stop spiders altogether? I think so, but I wanted feedback on this from the experts here.
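For what it's worth, a quick sketch of how a standards-following parser treats that comment-only file, using Python's standard urllib.robotparser (the URL is just a placeholder; big search engines may of course behave differently from the stdlib parser):

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the client's comment-only robots.txt,
# reconstructed from the validator's source listing.
rp = RobotFileParser()
rp.parse(["# Robots.txt"])

# With no User-agent/Disallow records at all, the parser has no rule
# that matches any bot, so by default everything is allowed.
print(rp.can_fetch("Googlebot", "http://example.com/somepage.html"))  # -> True
```

So at least by the letter of the robots.txt convention, an empty (comment-only) file should mean "no restrictions," not "go away" -- which makes me suspect the de-indexing has another cause, even though the file clearly fails validation.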
Any comments/suggestions/solutions appreciated!
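In case it's useful to others, my understanding is that the usual fix for that validator error is to give the file at least one complete record. A fully permissive version (an empty Disallow line means "disallow nothing") would be:

```
User-agent: *
Disallow:
```

Saved with standard line endings, that should satisfy the validator while still letting all spiders in.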