Monitor entry points of websites

How can I regularly monitor whether the entry points of different websites have changed?

The Boundary Conditions are:
(1) The websites themselves cannot be modified; meaning I test solely from outside
(2) The websites are dynamic; meaning the website content changes
(3) The websites use different technologies; meaning they could consist of: PHP, JS, .NET, Java, AJAX, Python, Perl, etc.
(4) I've no access to the source code

The background is: I want to be notified if functional changes to a site lead to changed entry points, and thereby to a greater attack surface.
E.g., the process is as follows:
(1) An agency develops the website
(2) I test the site before it goes live -> I do this with black-box penetration testing from outside
(3) The agency fixes the vulnerabilities found, until I'm satisfied
(4) I give the permission to go live
(5) EVERY FUNCTIONAL CHANGE MADE TO THE WEBSITE BY THE AGENCY WILL BE DETECTED BY ME!

OK, I guess this is not completely/efficiently possible, but at the very least I want to detect changes to entry points!

Best Regards
Air
capfw Asked:

nap0leon Commented:
Your web analytics engine should provide this information.
All major analytics providers (Google Analytics, Adobe SiteCatalyst, etc.) provide out-of-the-box tracking of both referrers and site entry pages.

If adding or using web analytics is not an option, you can try a "backlink" service, which uses search engine results and other sources to detect links to your site (e.g., here's a random one: http://www.backlinkwatch.com/)
capfw (Author) Commented:
Hi nap0leon,
First of all, thanks for your comment!

As I said, unfortunately the website itself cannot be modified, so this is not an option.
But I will give another, hopefully more pragmatic, example:

For example, I want to be able to detect if a form within the site has been changed from 4 input fields to 5 input fields, or has gained an additional option to upload files.

Or another example: I want to be able to detect if a site has changed from not having a search field to having one.

Either change adds a new entry point for potential Cross-Site Scripting (XSS) or SQL Injection (SQLi). So I want to be able to detect all changes to the site that increase the potential attack surface.
The reason is: if I detect such a change -> I will run a new penetration test of the site.
I do not want to monitor solely the entry link itself.

Until now, my approach is like this:
(1) I look at the site and find all potential entry points, like: HTTP header input, form field inputs, upload options, referrer, etc.
(2) I generate a hash/digest for these found entry points and store it with the related links.
(3) I visit the site and the related links (stored earlier) regularly with a script and check whether the hash/digest has changed. If it has changed, I look at the site manually and decide whether to penetration test the site again.
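The steps above can be sketched in Python. This is a minimal illustration, not a complete tool: it extracts form-related elements from a page's HTML and hashes them, so the digest changes only when an entry point changes, not when other dynamic content does. The tag list and attribute selection are assumptions you would tune for your sites.

```python
# Sketch: fingerprint a page's entry points (forms, inputs, upload fields)
# so the digest changes when the attack surface changes, not when ordinary
# dynamic content changes.
import hashlib
from html.parser import HTMLParser

class EntryPointParser(HTMLParser):
    """Collects tags that accept outside input."""
    WATCHED = {"form", "input", "textarea", "select", "button"}

    def __init__(self):
        super().__init__()
        self.entry_points = []

    def handle_starttag(self, tag, attrs):
        if tag in self.WATCHED:
            # Keep only attributes that define the entry point, sorted for stability.
            relevant = sorted((k, v) for k, v in attrs
                              if k in ("name", "type", "method", "action", "enctype"))
            self.entry_points.append((tag, tuple(relevant)))

def entry_point_digest(html: str) -> str:
    """Return a stable hash of all entry points found in the HTML."""
    parser = EntryPointParser()
    parser.feed(html)
    canonical = repr(sorted(parser.entry_points))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Example: the digest changes when a form gains an upload field.
before = '<form method="post"><input name="user"><input name="pass"></form>'
after = before.replace("</form>", '<input type="file" name="upload"></form>')
print(entry_point_digest(before) != entry_point_digest(after))  # True
```

Fetching the stored links (e.g. with urllib) and comparing each page's digest against the stored one would complete step (3).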


Be aware that "entry points" for me means potential input to the website which comes from outside and has to be validated, sanitized, escaped, etc. before being processed!


Best Regards
Air
btan (Exec Consultant) Commented:
The most direct approach is a proxy tier that intercepts all requests and responses, combined with a web-application-aware firewall (WAF) to inspect each of these exchanges. GA fits too, but from the performance perspective; the other aspect is more of a defacement-detection service, which will drill into the various sub-pages on top of the main site to detect tampering.

A defacement-detection service and a proxy with a WAF would be a good baseline to explore.
Examples include:
a) defacement svc - BanffCyber, ChangeDetect (http://www.changedetect.com/about/overview/)
b) WAF proxy - ModSecurity (https://www.modsecurity.org/)
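As an illustration of the kind of inspection a WAF adds in front of a site, a minimal ModSecurity rule might look like the following. The rule id, status, and message are arbitrary examples for illustration, not a recommended production rule:

```apacheconf
# Illustrative only: deny any request whose parameters contain a script tag.
SecRuleEngine On
SecRule ARGS "@rx <script" \
    "id:100001,phase:2,deny,status:403,log,msg:'Possible XSS in request parameters'"
```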

Dan McFadden (Systems Engineer) Commented:
So in practical terms, you would like to have an inventory of the files that make up the structure of your website(s) and to monitor this inventory for changes.

Your existing approach is right. Use a script or utility that goes through the file system and creates some sort of hash for each file that exists. Store that information somewhere, i.e. a text file with file paths and hashes. On the next run of the hash process, compare the previous hashes with the current ones and report differences.

It is important that the hashing/monitoring be done in the file system, not via an HTTP request, since the content is dynamically generated.

I would recommend that you shift some of the burden to the 3rd-party developer to document changes; this way you can match changes up with a release document. If you then find a difference between the actual changes and the documented changes, you can verify whether a specific file changed or not. If it has changed, push back on the developer to update their release notes. If a change was not submitted by the developer, you have an item to investigate.

You could wrap the utility in a shell script that calls the utility, checks for differences, and emails a diff report.
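A sketch of that file-system approach (which, per the boundary conditions, assumes server access the asker does not have): walk the web root, hash every file, and diff against the manifest from the previous run. The web-root path and manifest file name are illustrative.

```python
# Sketch: build a path -> SHA-256 manifest of a directory tree and
# diff it against the manifest from the previous run.
import hashlib
import os

def build_manifest(root: str) -> dict:
    """Map each file path under `root` to its SHA-256 digest."""
    manifest = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as fh:
                digest = hashlib.sha256(fh.read()).hexdigest()
            manifest[os.path.relpath(path, root)] = digest
    return manifest

def diff_manifests(old: dict, new: dict) -> dict:
    """Report added, removed, and changed files between two runs."""
    return {
        "added":   sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
        "changed": sorted(p for p in set(old) & set(new) if old[p] != new[p]),
    }

# Usage: persist the manifest (e.g. as JSON) between runs, then
# compare and report, e.g.:
#   current = build_manifest("/var/www/site")
#   print(diff_manifests(previous, current))
```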

Dan
capfw (Author) Commented:
Many thanks for your response!
But please see my "Boundary Conditions". Meaning, I have no option to intercept requests, and also no access to the source code, nor any possibility of directory traversal or directory access.
I can only monitor from outside, like any other normal user (normal internet access; NO server access at all).

Best Regards
btan (Exec Consultant) Commented:
Which is why a defacement-detection service is so-called passive monitoring: initially it baselines the site as-is, and then any changes can be flagged. There is no need to get the site's code or to put in any physical server or system.

Another option is the DDoS/CDN (content delivery network) approach. These are also online services, though in this case slightly "invasive", since they require a DNS change so that the origin site CNAMEs to the CDN provider, which then proxies all calls before forwarding them to your origin site. This option involves no physical server, but it does sit between the traffic and the site. Note the rules can be simply transparent, passively observing the traffic and doing nothing but watching the exchanges. Some examples are Cloudflare, DOSarrest, etc.

Others are crawler-based, like GA. You can explore WebPagetest (http://www.webpagetest.org/) and SEO SiteCheckup (http://seositecheckup.com/), which can "scan" the site, though not really inspect it the way WAF security does. These are still performance-driven, but they do sniff out areas to improve, which optimizes the exchanges and in a way helps avoid being overwhelmed by public traffic in peak periods, or even DoS attempts.

In fact, I think an online security check may serve you better, as the whole idea is to get a clean bill of health for the site before it can be commissioned. In that case, I suggest looking at the following:
a) Qualys SSL Server Test (https://www.ssllabs.com/ssltest/) - checks the cipher schemes for compliance
b) Sucuri SiteCheck (https://sitecheck.sucuri.net//) - checks for weak CMS used in site management
c) OWASP URL Checker (https://www.owasp.org/index.php/OWASP_URL_Checker) - checks for leakage
samri Commented:
Hi Air,

Interesting topic. And the feedback is very useful as well :)

>> How can I regularly monitor whether the entry points of different websites have changed?

My understanding of this is: knowing where visitors are coming from? If that is the case, then it is not possible unless you have access to the website itself (the code) and can refer to the HTTP_REFERER header. Another option is to look at the access log of the web server.

However,

>> The background is: I want to be notified if functional changes to a site lead to changed entry points, and thereby to a greater attack surface.

This could be achieved by using "urlwatch" (https://thp.io/2008/urlwatch/).
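For illustration, a urlwatch job could be filtered down to just the form elements of a page, so that only entry-point changes trigger a notification. This is a hedged sketch: the URL is an example, and the exact job and filter syntax depends on the urlwatch version, so check its documentation before relying on it:

```yaml
# urls.yaml - illustrative urlwatch job definition
name: "contact form entry points"
url: "https://example.com/contact"
filter:
  - css: "form, input, textarea, select"
  - html2text
```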

I hope this helps (at least for some portion of the need).
