Verify FTP site on Google

Hello
I am trying to verify one of our FTP sites by adding the Google HTML verification file, but I am getting a 500 access denied error when I try to open the file from a browser.
Can someone please advise how I can serve the HTML page from the FTP site?
Thanks
arthur112 Asked:
DrDamnit Commented:
You can't. Google indexes FTP sites when a page links to them; once the crawler is there, it recurses through the directory structure.

The verification HTML file cannot be loaded the way you're attempting because Google expects an HTTP resource, not an FTP one.

If you want to make sure your FTP site gets indexed, refer to it multiple times on your website.

This is a bad idea, however, since FTP poses a security risk unless you've taken the proper precautions to secure it as read-only.
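As a rough sketch of what "read-only" means in practice, and assuming the server runs vsftpd (the thread never says which FTP daemon is in use), the relevant switches in vsftpd.conf would look something like this:

# /etc/vsftpd.conf -- hypothetical read-only anonymous setup
anonymous_enable=YES          # allow anonymous downloads
local_enable=NO               # no local user logins
write_enable=NO               # disable all write commands (STOR, DELE, MKD, ...)
anon_upload_enable=NO         # no anonymous uploads
anon_mkdir_write_enable=NO    # no anonymous directory creation

Other daemons (ProFTPD, IIS FTP, etc.) have equivalent settings; the point is simply that a crawler, like any other client, should only ever be able to read.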

Experts Exchange Solution brought to you by

Your issues matter to us.

Facing a tech roadblock? Get the help and guidance you need from experienced professionals who care. Ask your question anytime, anywhere, with no hassle.

Start your 7-day free trial
arthur112 (Author) Commented:
Hello
Thanks for your input.
Unfortunately, a lot of the data sitting on the FTP site has been indexed by Google and is visible in the Google cache result pages. I am trying to remove the URLs, but there are more than 2000 indexed pages.
I was trying to remove the whole site instead of individual pages, but that does not look possible either.
Any suggestion please?
Thanks
DrDamnit Commented:
You'll need to place a robots.txt file at the root of your FTP site. The next time Google's crawler comes by, it will read the robots.txt file, see that the content shouldn't be indexed, and stop indexing it.

Google may or may not de-index the content immediately; eventually, however, the cached content will expire and fall out of Google's index.

Make sure you put as many directives as necessary in the robots.txt file to tell Google: "this content is old, and you shouldn't index it anymore."

https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt

Focus on the Disallow directive.
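For instance, a minimal robots.txt that tells every crawler to stay out of the entire site is just:

User-agent: *
Disallow: /

If only part of the site needs blocking, the rule can be scoped to Google's crawler and to specific directories (the directory name below is a placeholder, not a path from this thread):

User-agent: Googlebot
Disallow: /archive/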