I need a program that crawls through all the links on my website (ideally following subdirectories automatically) and checks whether they still work.
However, this program must have one very important feature. The website I want to check uses PHP. It contains many pages, but all of them use index.php as a "base file": the content changes via include(), based on a URL parameter passed via GET. For example, "www.site.com?blah=first" opens index.php, which then includes the contents of the file referred to by "first".
Link checkers seem to grab the target pages straight from the server they are hosted on. That poses a problem when those pages are built with PHP: all links are constructed by a PHP function, so there is no literal <A HREF="blah.html"> tag to be found in the source. Once the PHP is parsed and sent to the browser, those calls do translate into ordinary hyperlinks, but a checker that reads the raw files rather than acting like a browser sees the code before it is parsed.
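To clarify the kind of tool I mean: a checker that fetches pages over HTTP (the way a browser does) would receive the server's rendered HTML, so the PHP-generated links would be visible to it. A minimal sketch in Python of that idea, with the page content here being a hypothetical example rather than my actual site:

```python
import urllib.request
from urllib.parse import urljoin
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href attributes from <a> tags in rendered HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(base_url, html):
    """Return (absolute_url, HTTP status) for every link on the page.

    A status of None marks a link that could not be fetched at all.
    """
    parser = LinkExtractor()
    parser.feed(html)
    results = []
    for href in parser.links:
        url = urljoin(base_url, href)
        try:
            status = urllib.request.urlopen(url, timeout=10).status
        except Exception:
            status = None  # broken or unreachable
        results.append((url, status))
    return results

# Because the checker only ever sees the HTML the server produced,
# links built by PHP look like any other links. Hypothetical output:
page = '<a href="?blah=first">First</a> <a href="?blah=second">Second</a>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # → ['?blah=first', '?blah=second']
```

So what I am really after is an existing program that works at this HTTP level, following links recursively, rather than one that scans the source files on the server.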
Any idea if a program exists that can circumvent this problem?