At our company, we have a Wi-Fi network that staff/guests have to register on before
they can use it. After registration, each time a staff member or guest connects,
they have to click "Let's Surf" on the landing/captive portal page.
I need to monitor this landing page for possible defacement/changes, as it is known
to have been hacked before: the landing page was replaced with one that redirects
to an external malicious Internet site.
I thought of writing a script to dump this landing page every hour using wget or
curl, but somehow wget and curl (Windows versions) can never connect to it to dump
the HTML page. I have tried both with and without specifying a proxy.
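For reference, this is roughly what I have been running (the proxy host/port below
is only a placeholder for our real proxy, and I have left out the query string on
the login URL):

    # without a proxy
    wget -O landing.html "http://www.wiresuss.com/login.php"
    curl -o landing.html "http://www.wiresuss.com/login.php"

    # with the corporate proxy (placeholder host/port)
    curl -x http://proxy.example.local:8080 -o landing.html "http://www.wiresuss.com/login.php"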
The purpose of dumping this landing page is so that I can run "comp" (the Windows
counterpart of Unix "diff") against a baseline copy to see if the page has changed.
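Roughly, the hourly check I have in mind is this sketch (file names are just
examples, and it assumes the fetch above actually works), run from the bash-like
shell:

    # fetch the current landing page and compare it to a saved baseline
    curl -s -o current.html "http://www.wiresuss.com/login.php"
    if ! diff -q baseline.html current.html > /dev/null; then
        echo "Landing page differs from baseline - possible defacement"
    fi

On the plain Windows command prompt, comp or fc would play the same role as diff here.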
What am I missing that prevents wget/curl from dumping this page? When the very
same laptop is connected to my home Internet or my phone's 4G, it can dump, say,
google.com without any problem.
I ran wget from the Windows command prompt and curl from a "bash-like" shell that
runs on my Windows machine: both work on my home Internet but neither works
against the corporate captive page.
Each time I connect to this 'secured' Wi-Fi, it redirects to a URL that looks
like
http://www.wiresuss.com/login.php? . . .
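If verbose output would help pin down where the connection fails, I can capture a
trace with something like this (again with the query string left out):

    # verbose curl trace of the connection attempt, saved so I can share it
    curl -v "http://www.wiresuss.com/login.php" > landing.html 2> curl_trace.txt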