Solved

How to prevent hacker attacks?

Posted on 2002-03-09
Medium Priority
305 Views
Last Modified: 2010-03-04
Hi!

I have Apache 1.3.20 running on W2K Professional (with the latest updates and patches). There is also a firewall with only port 80 open.

For some days now I have been noticing hacking attempts in the Apache log files. They look like this:

[07/Mar/2002:05:42:56 +0100] "GET /scripts/..%c0%2f../winnt/system32/cmd.exe?/c+dir HTTP/1.0"
[07/Mar/2002:05:43:00 +0100] "GET /scripts/..%c0%af../winnt/system32/cmd.exe?/c+dir HTTP/1.0"
[07/Mar/2002:05:43:06 +0100] "GET /scripts/..%c1%9c../winnt/system32/cmd.exe?/c+dir HTTP/1.0"
[07/Mar/2002:05:43:10 +0100] "GET /scripts/..%%35%63../winnt/system32/cmd.exe?/c+dir HTTP/1.0"

or like this:

[09/Mar/2002:00:50:31 +0100] "HEAD%00 /%20HTTP/1.0%0D%0A%0D%0AAccept%3A%20iwtrdhdadvfywmscucw/../../index.html%3fmrsxhilfr...(very long url)
[09/Mar/2002:00:50:31 +0100] "HEAD%00 /%20HTTP/1.0%0D%0A%0D%0AAccept%3A%20xphpmlcsnnuwdujjuu/../../index.html%3fehejlygttu...(very long url)
[09/Mar/2002:00:50:32 +0100] "HEAD%00 /%20HTTP/1.0%0D%0A%0D%0AAccept%3A%20goybttoeilhqsmcd/../../index.html%3fsnvllphwrlwi...(very long url)
[09/Mar/2002:00:50:34 +0100] "HEAD%00 /%20HTTP/1.0%0D%0A%0D%0AAccept%3A%20xeifqclittjisuajdfg/../../index.html%3ftppyjwryf...(very long url)
[09/Mar/2002:00:50:35 +0100] "HEAD%00 /%20HTTP/1.0%0D%0A%0D%0AAccept%3A%20pwvrmurzqtf/../../index.html%3fvvjpeixwvq=/../is...(very long url)


It seems to be an automated tool (judging by the short time intervals between requests).
I know that Apache 1.3.20 is resistant to these kinds of "hacking", but it is really annoying (it floods my logs)!

What could I do?

Is it useful to redirect such requests? Could I block a user making such requests for, let's say, 1 hour?

I'm thankful for any advice!

Thanks in advance, Panther
Question by:pantherchen
4 Comments
 
LVL 15

Accepted Solution

by:
samri earned 900 total points
ID: 6854242
One possibility to block such requests is to use the <Location> directive.  Decide where to apply the directive (globally or per virtual host), and add the following code:

# deny every request whose URL ends in "exe"
<Location /*exe>
  Deny from all
</Location>

In fact, if you look in httpd.conf, you might find this config segment.  You can tailor it to your needs.

# There have been reports of people trying to abuse an old bug from pre-1.1
# days.  This bug involved a CGI script distributed as a part of Apache.
# By uncommenting these lines you can redirect these attacks to a logging
# script on phf.apache.org.  Or, you can record them yourself, using the script
# support/phf_abuse_log.cgi.
#
#<Location /cgi-bin/phf*>
#    Deny from all
#    ErrorDocument 403 http://phf.apache.org/phf_abuse_log.cgi
#</Location>
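
Along the same lines, here is a minimal sketch for the cmd.exe probes. It assumes <LocationMatch>, the regular-expression form of the directive (part of core Apache 1.3), rather than the wildcard form:

# Deny any request whose URL-path contains "cmd.exe", no matter
# which %-encoded traversal sequence precedes it
<LocationMatch "cmd\.exe">
    Deny from all
</LocationMatch>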

 
LVL 1

Author Comment

by:pantherchen
ID: 6854569
I think that's what I need; I'll try it out (could take some days :)!

Could I use the % in this directive too?
Or combine them like this:

<Location /*(%|exe)*>
 Deny from all
</Location>


THX
 
LVL 15

Expert Comment

by:samri
ID: 6854626
I could not confirm that; I think /*exe should be sufficient, since the * already covers the %-encoded part of the path.  I think the directive should also be able to accept a regex (regular expression) format.

Test it out first, and you can further refine the regex.
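
If you want to match on the raw %-sequences themselves (and drop such requests outright rather than redirecting them), here is a hedged sketch using mod_rewrite, assuming it is compiled into your Apache. THE_REQUEST holds the request line before URL-decoding, so the encoded sequences from your log samples are still visible to the pattern:

RewriteEngine On
# The raw request line (THE_REQUEST) still shows the %-encoded bytes,
# so the traversal variants can be matched before decoding
RewriteCond %{THE_REQUEST} (cmd\.exe|%c0%2f|%c0%af|%c1%9c|%%35%63) [NC]
# Answer 403 Forbidden instead of serving anything
RewriteRule .* - [F]

This only covers the variants posted above; check it against your own logs and extend the alternation as needed.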
 
LVL 1

Author Comment

by:pantherchen
ID: 6860038
OK, I played around with it a bit, and it works :)
Thanks!


