Criteria for selecting web pages to protect against Defacement

We have hundreds of public-facing web pages: do we protect all of them, or only those

a) that would damage our corporate image if defaced?
b) that have injection or other specific vulnerabilities? If so, which vulnerabilities should we look for?
c) that are frequently used?
d) What about an announcement page that publishes our service downtime windows: is that a
    good candidate web page to protect?
sunhux asked:
 
btan (Exec Consultant) commented:
Since the pages carry your company's trademarks and logo, they represent the company itself. Whatever is published there, whether an advisory, company news, or anything else, is a statement to the public and part of the assurance that sustains their confidence and trust in the company. Once that trust is breached through misrepresentation on the website, it is very hard, if not impossible, to recover.

Imagine the website being defaced (even if it is only a static site), being placed under DDoS, or hosting malvertisements: each of these ultimately leaves the company answering to the public and, more importantly, to the authorities over the legal implications.

As a whole, if a website is to be on the internet and accessible by the public, due diligence and due care require keeping it as intended and not leaving it with weaker protection than any of your other websites. It is hard to say outright which site is more critical than another, though it is possible to categorise them by the criticality of the services they host; even the domain matters, as .edu or .gov carries greater significance to the public than .net, .com or .org (this is debatable and varies with perspective).

The total effect is tremendous once you actually take the hit, so do not even allow that opportunity to arise when you clearly know you can minimise the exposure. If budgeting is a challenge, the intangible risk should also be counted in the total damages of a compromised website; it can far exceed the cost of maintaining the protection. If the cost of exposure, or of having to tolerate a compromise, is unacceptable, then consider whether the content really needs to be published as a website or reachable via the internet at all. A risk assessment probably has to take place, since every website that is put up serves a purpose in running the business.

It is not a question of "if" but "when", so do not treat protection as an afterthought; by then the media coverage will already have brought the company or its stakeholders down. Also do not neglect that if these websites are targeted for their connectivity into the internal systems, the damage to those internal systems can be bigger and even more enticing to the attacker.
 
Terry Woods (IT Guru) commented:
Things to consider:

1. If an attacker gets access to a server, then they may be able to send email, launch attacks on other devices on the internet, or publish/serve advertising, malware, or scams. Once compromised, backdoors may be put in place that make it vulnerable to further attacks.
2. If an attacker gets access to publish content, then they may be able to serve advertising, malware, or scams.
3. IP addresses of compromised systems can be blacklisted, demoting search rankings and potentially causing staff web traffic to be blocked or emails to be rejected.
4. There are legal responsibilities to protect private data.

Vulnerabilities differ by system, so there's little point going into detail that may not apply to you. In general, having someone knowledgeable enough to maintain the systems is important, and so is avoiding unnecessary complexity. When either of those is lacking, software tends to miss important security updates.

Thinking about security on the scale of individual pages is the wrong level of thinking; thinking about systems as a whole is more important. If a system as a whole allows amateur users to introduce security holes, that's a failure of the system and its management, rather than something to try to blame the user for.
 
sunhux (Author) commented:
I understand from our cyber defacement protection vendor that monitoring
   http://a.b.c/
does not automatically cover monitoring of
   http://a.b.c/1.aspx
so we need to monitor both. Is this true?


One colleague suggested the following criteria: web pages that are main pages or are
easily visible ought to be given priority for defacement protection, while pages that
have to be navigated to several levels down are less likely to be defaced and can
therefore be given lower monitoring priority. Is this true?
 
sunhux (Author) commented:
He further adds: web pages that have to be navigated down to but contain only 'view-only'
information, with no critical transactions taking place on them, can be given the lowest
priority for defacement monitoring.

So if a page that is several navigations away involves entering crucial data or
performing transactions, it ought to be given priority for monitoring.
 
sunhux (Author) commented:
Defacement causes loss of reputation, but he further adds that injected links leading
to malicious sites are of even greater concern.
 
Terry Woods (IT Guru) commented:
Yes, it's true that less-visited pages are less likely to be targeted; however, some attacks are quite subtle and may intentionally avoid being obvious. I came across one case where a site appeared pristine when visited, but malicious data had been added to the page's metadata in such a way that the preview snippet in Google's search results had been modified.

It is inefficient, though, to manually monitor individual pages, unless they are so valuable that dedicated labour is justified. It is a better use of time to set up automatic monitoring tools, or to add a feature that lets users report something that's not right. The larger the scale of the system, the more effort should go into automation and efficiency.
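
As a rough illustration only (the URLs, file name, and hashing approach below are assumptions for the example, not any vendor's actual product), an automated monitor can be as simple as fetching each explicitly listed page, hashing it, and comparing the hash against a stored baseline. Note that every page has to appear in the list; watching http://a.b.c/ does not implicitly watch http://a.b.c/1.aspx.

    # Minimal sketch of hash-based defacement monitoring (hypothetical URLs).
    import hashlib
    import json
    import urllib.request

    WATCHED_URLS = [
        "http://a.b.c/",        # listing the home page...
        "http://a.b.c/1.aspx",  # ...does not cover this page; it must be listed too
    ]
    BASELINE_FILE = "baseline.json"

    def fetch_hash(url):
        """Download the page body and return its SHA-256 digest."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            return hashlib.sha256(resp.read()).hexdigest()

    def load_baseline():
        """Load previously stored hashes, or start empty if none exist yet."""
        try:
            with open(BASELINE_FILE) as f:
                return json.load(f)
        except FileNotFoundError:
            return {}

    def main():
        baseline = load_baseline()
        for url in WATCHED_URLS:
            current = fetch_hash(url)
            known = baseline.get(url)
            if known is None:
                print("[new] no baseline yet for " + url)
            elif current != known:
                print("[ALERT] content changed on " + url + " - possible defacement")
            else:
                print("[ok] " + url)

    if __name__ == "__main__":
        main()

A real service would of course add scheduling, alerting, and smarter comparison (dynamic content changes the hash on every fetch), but the per-page listing is the point here.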
 
btan (Exec Consultant) commented:
For defacement protection, the provider will generally ask for the customer's domain as well as the list of pages to protect. They go by the number of pages covered by the licence procured.

The defacement monitoring learns a baseline of the pages during the tuning process. So whenever changes are made to the pages, it needs to relearn and establish a new baseline, otherwise the legitimate change will be reported as a false positive.

Besides defacement controls, consider a WAF, which blocks web attack attempts.
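
To make the false-positive point concrete, here is a small, hypothetical re-baselining step (the URL list and file name are assumptions carried over from the sketch above, not any vendor's tooling): after a legitimate content update, the stored hashes are rewritten so the monitor stops flagging the change.

    # Hypothetical re-baselining after a legitimate content change.
    import hashlib
    import json
    import urllib.request

    WATCHED_URLS = ["http://a.b.c/", "http://a.b.c/1.aspx"]  # hypothetical pages
    BASELINE_FILE = "baseline.json"

    def fetch_hash(url):
        with urllib.request.urlopen(url, timeout=10) as resp:
            return hashlib.sha256(resp.read()).hexdigest()

    # Recompute and store the current hashes as the new "known good" baseline.
    baseline = {url: fetch_hash(url) for url in WATCHED_URLS}
    with open(BASELINE_FILE, "w") as f:
        json.dump(baseline, f, indent=2)
    print("Baseline rewritten for %d pages." % len(baseline))

Commercial tools handle this through their own tuning or relearning workflow; the sketch only shows why the step is needed.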