• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 330

Error: Not Acceptable! An appropriate representation of the requested resource could not be found on this server. This error was generated by Mod_Security

Hi, I'm getting the following error on my Bluehost server when I use a certain script.

Not Acceptable! An appropriate representation of the requested resource could not be found on this server. This error was generated by Mod_Security.

I think it has something to do with the .htaccess file; if so, how can I edit it to fix this issue?

Here is my htaccess file from the public_html folder (some URLs replaced with 'mydomain'):

RewriteEngine on
# Use PHP5.6 as default
# AddHandler application/x-httpd-php56 .php
RewriteCond %{HTTP_HOST} ^mydomain\.net$ [OR]
RewriteCond %{HTTP_HOST} ^www\.mydomain\.net$
RewriteCond %{REQUEST_URI} !^/[0-9]+\..+\.cpaneldcv$
RewriteCond %{REQUEST_URI} !^/\.well-known/pki-validation/[A-F0-9]{32}\.txt(?:\ Comodo\ DCV)?$
RewriteRule ^/?$ "http\:\/\/www\mydomain\.com\/" [R=302,L]


# php -- BEGIN cPanel-generated handler, do not edit
# Set the “ea-php56” package as the default “PHP” programming language.
<IfModule mime_module>
  AddType application/x-httpd-ea-php56 .php .php5 .phtml
</IfModule>
# php -- END cPanel-generated handler, do not edit



Thanks a lot
Asked by xenium
3 Solutions
 
gr8gonzo (Consultant) commented:
Per this page:
https://www.tipsandtricks-hq.com/apache-mod-security-update-how-to-fix-error-406-or-not-acceptable-issue-259

Backup your .htaccess file if you have one in the public_html directory.

Open the .htaccess file with any text editor and observe the lines between the “# BEGIN WordPress” and “# END WordPress” tags. Make sure the lines look somewhat like the following. If not then update the file with the following content and upload it to the ‘public_html’ directory.

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress


 
xenium (Author) commented:
The htaccess file in the question does not have those tags. If I should add them, where should they be inserted?

Thanks a lot
 
gr8gonzo (Consultant) commented:
Sorry, I assumed you were using WordPress - that's typically where this error shows up.

Basically, that error is a "file not found" type of error. So you're requesting something that doesn't exist.

I don't quite understand the purpose of the htaccess rules you have in place right now. It looks like it reads:

1. When the HTTP host is either mydomain.net or www.mydomain.net
2. And when the URI being requested does NOT look like /12345.abcdef.cpaneldcv
3. And when the URI being requested does NOT look like /.well-known/pki-validation/ABCDEF0123456789ABCDEF0123456789.txt (optionally ending with " Comodo DCV")

...then redirect any request for the site root (e.g. "http://www.mydomain.net/") to http://www.mydomain.com/.

The last RewriteRule seems wrong - note the missing dot between "www" and "mydomain", and the substitution URL doesn't need all that backslash-escaping:
RewriteRule ^/?$ "http\:\/\/www\mydomain\.com\/" [R=302,L]

If all domains mentioned are actually the same domain, then this seems like it would create an infinite loop in some circumstances, so I'm assuming you're redirecting from one domain to another.
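For reference, if the goal is simply to send the bare .net domain to the .com one, a cleaner version of that rule might look like this (a sketch based on the rules quoted above, keeping the same 302 behavior):

```apache
RewriteEngine On
# Match mydomain.net with or without the www prefix
RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.net$ [NC]
# Leave cPanel/Comodo domain-validation URLs alone
RewriteCond %{REQUEST_URI} !^/[0-9]+\..+\.cpaneldcv$
RewriteCond %{REQUEST_URI} !^/\.well-known/pki-validation/[A-F0-9]{32}\.txt(?:\ Comodo\ DCV)?$
# Redirect only the site root, temporarily (302), to the .com domain
RewriteRule ^/?$ http://www.mydomain.com/ [R=302,L]
```

The substitution URL is a plain string, so it needs no escaping; only the patterns in RewriteCond/RewriteRule are regular expressions.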

Can you provide an example of a URL that generates the error?

 
xenium (Author) commented:
Here's the original htaccess file:

RewriteEngine on
# Use PHP5.6 as default
# AddHandler application/x-httpd-php56 .php
RewriteCond %{HTTP_HOST} ^enex\.net$ [OR]
RewriteCond %{HTTP_HOST} ^www\.enex\.net$
RewriteCond %{REQUEST_URI} !^/[0-9]+\..+\.cpaneldcv$
RewriteCond %{REQUEST_URI} !^/\.well-known/pki-validation/[A-F0-9]{32}\.txt(?:\ Comodo\ DCV)?$
RewriteRule ^/?$ "http\:\/\/www\.autoreconuk\.com\/" [R=302,L]


# php -- BEGIN cPanel-generated handler, do not edit
# Set the “ea-php56” package as the default “PHP” programming language.
<IfModule mime_module>
  AddType application/x-httpd-ea-php56 .php .php5 .phtml
</IfModule>
# php -- END cPanel-generated handler, do not edit


 
gr8gonzo (Consultant) commented:
Okay, wow. I hadn't seen your last post until now, and it changes a LOT of things. First, let's tackle the minor stuff.

In the response headers, I see:
HTTP/1.1 406 Not Acceptable
Server: nginx/1.12.2
Date: Thu, 01 Mar 2018 17:22:18 GMT
Content-Type: text/html; charset=iso-8859-1
Content-Length: 226
Connection: keep-alive

You're using nginx for your web server. nginx doesn't support .htaccess files, which means your .htaccess file isn't being used at all - .htaccess files are used exclusively by Apache.

Second, if I try to access the URL without any parameters, I don't get the error, which means that the issue isn't a missing file but rather that mod_security is detecting something that it deems unsafe and is blocking the request.

If I start re-building the content piece-by-piece and start playing with the parameters, I find that it gives the error after sufficient encoded HTML that includes hrefs, which almost certainly means that mod_security is simply trying to protect you from injection attacks.

Third and MOST importantly, what you're doing is a REALLY bad idea, and I'm not sure why nobody brought that up in your last post. You really DO NOT want to be able to pass in arbitrary content that is added to the page output - that's an absolute dream gift to hackers and social engineers. You're theoretically letting ANYONE use your domain.

I could use this page to pretend I had control over your domain by crafting a URL that displayed a message or injected Javascript, and then share those crafted URLs with people. The functionality of this page may seem flexible to you in a good way, but it's a double-edged sword - it's EXTREMELY flexible to people who know how to abuse it. That's why mod_security blocked it in the first place - it looks just like a malicious injection attack on your site.

You should always be the one in full control of your pages / domain. If visitors put in their own content, it should be very narrowly-defined content (e.g. "I will allow visitors to put a first name into this first name textbox"), and should be sanitized prior to ANY use.
 
xenium (Author) commented:
Thanks a lot for the info and feedback, very interesting. I'd like to understand these risks better. Is it simply a question of deception (a hacker can present a webpage from my domain as if I produced it), or is there a deeper risk of allowing the hacker to make changes on my server? If just the former, I'd still like to answer the question as is, then raise a follow-up on managing such risks.
 
gr8gonzo (Consultant) commented:
The risk depends on what you do with the input, and whether or not you sanitize the data in any way.

For example, let's say your script took the input and executed it with PHP to get its output (so that things like variables would be dynamically executed). In that case, a hacker could feasibly gain a degree of control over your server by injecting malicious PHP code into the parameter.

Other scenarios involve people who craft parameters linking back to various sites to gain free SEO backlinks, then submit those crafted URLs to search engines - which could get your domain blacklisted for abuse.

It's hard to say for sure what the risks are without actually seeing the code or actively pen testing your site. Regardless, it's always an extremely risky practice to give that degree of control to the public.

That said, to resolve this, you're probably running the default ruleset for modsecurity, so you'd likely have to disable that ruleset and build your own that excludes the inspection of this particular URL. But again, that's just "unlocking" one more layer of security (you should never think of security as an on-or-off thing like a light switch, but rather something that has multiple layers like an onion, where you make the reward worth less than the effort).
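As a sketch of what a targeted exclusion can look like under Apache with ModSecurity 2.x (the rule ID and path here are placeholders - the actual ID that fired will be in the ModSecurity audit/error log):

```apache
# Hypothetical example: switch off one specific ModSecurity rule
# for a single URL instead of disabling the whole engine.
<LocationMatch "^/autorecon/MagicURL\.htm$">
    SecRuleRemoveById 950901
</LocationMatch>
```

This keeps the rest of the ruleset enforced everywhere else, which is much safer than turning ModSecurity off globally.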

If you just want to validate that it's mod_security, you can pop into your nginx.conf file, find the ModSecurityEnabled line, turn it off, restart nginx, and rerun your test - but don't just leave it off!
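For the older nginx ModSecurity module, that test looks roughly like this inside nginx.conf (a sketch only - directive names vary with the ModSecurity connector your host compiled in):

```nginx
location / {
    ModSecurityEnabled off;                # temporarily disable for the test only
    ModSecurityConfig modsecurity.conf;    # ruleset stays configured, just not enforced
}
```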
 
xenium (Author) commented:
Thanks again. I'll see if I can do the nginx.conf test to validate that.

Is there a way to ensure only HTML can be processed, with no PHP or server-side processing?

Or..

The page is not intended for the public, so if there's a way to validate the source of the request, would that be one safeguard option?

What other options might there be, apart from not allowing the script?

Thanks
 
gr8gonzo (Consultant) commented:
As far as other options go, that depends on what you're trying to do. If there's some consistent behavior you're trying to create, you could pass in parameters that are used to build the HTML instead of passing in the HTML itself. For example, say you wanted a page that builds search-engine shortcut links for given keywords. One way to do this would be to pass in your parameters as array values, like this:
https://www.enex.net/autorecon/MagicURL.htm?google[]=apples&bing[]=oranges&google[]=watermelons



PHP will automatically process those values so that your $_GET array looks like this:
Array
(
    [google] => Array
        (
            [0] => apples
            [1] => watermelons
        )

    [bing] => Array
        (
            [0] => oranges
        )

)



From there, you can loop through and build your final HTML:
<?php

// Define the URLs for each search engine (you can add as many of these as you want)
$searchEngineURLs = array(
  "bing" => "https://www.bing.com/search?&q=KEYWORD",
  "google" => "https://www.google.com/search?&q=KEYWORD"
);

$generatedHTMLPieces = array();
foreach($_GET as $searchEngine => $keywords)
{
  // Do a couple of checks to make sure we're only processing correct input (if not, use "continue" to skip over the value)
  if(!isset($searchEngineURLs[$searchEngine])) { continue; }
  if(!is_array($keywords)) { continue; }

  // Now generate and append HTML
  foreach($keywords as $keyword)
  {
    // Get the search engine URL template
    $searchEngineURL = $searchEngineURLs[$searchEngine];
    
    // Encode the keyword for URL and for HTML (for the display of the link text)
    $url_keyword = urlencode($keyword);
    $html_keyword = htmlentities($keyword);
    
    
    // Copy the URL template we defined earlier for the search engine and replace the "KEYWORD" text with the actual search keyword
    $generatedHTMLPieces[] = "<a href=\"" . str_replace("KEYWORD", $url_keyword, $searchEngineURL) . "\">Search {$searchEngine} for {$html_keyword}</a>";
  }
}

// At this point, $generatedHTMLPieces should be an array of HTML links, like this:
// $generatedHTMLPieces[0] => <a href="https://www.google.com/search?&q=apples">Search google for apples</a>
// $generatedHTMLPieces[1] => <a href="https://www.google.com/search?&q=watermelons">Search google for watermelons</a>
// $generatedHTMLPieces[2] => <a href="https://www.bing.com/search?&q=oranges">Search bing for oranges</a>

// Now just join it together by <br /> tags and display the result
echo implode("<br />", $generatedHTMLPieces);



So not only is your URL much shorter, but more importantly, the HTML is controlled completely by you. By encoding the keyword parameters before using them in the generated HTML, you avoid injection attacks. For example, if a hacker tried to do this:

https://www.enex.net/autorecon/MagicURL.htm?google[]=<script>some XSS attack here</script><?php some evil php here ?>

...it would end up generating the proper Google search for that text rather than the text being interpreted as HTML or Javascript or PHP.

One more step you could take would be to use POST data instead of GET data. This would require a small change in how you're sending the data across (which could be pretty simple depending on how you're generating the URL today). So every URL would simply look like:

https://www.enex.net/autorecon/MagicURL.htm

...and all the input would be in the POST array instead. There are two advantages to this:
1. Logging - Your web server's access log typically records the full URL that was requested. If a lot of data is sent over in the query string, your logs fill up with all of that data, which leads to large access logs and makes log analysis harder. In other words, it becomes difficult to see how many times MagicURL.htm was hit, because a log analyzer will treat each unique query string as a separate URL. Additionally, if any sensitive data is ever passed through the URL (even accidentally), it gets copied into the log files, which can be a security problem.

If you use POST, then the log files will be lighter, cleaner, and will not pose security problems by containing potentially-sensitive data from the URLs.

2. Slightly Harder to Hack - Again, security is about layers, and not all layers are meant to be foolproof - sometimes it's just about making things 1% harder. Manipulating a query string is as easy as changing the URL and hitting enter.

If someone's ever poking around your system and looking for ways to mess with your application, then using POST data instead will add a thin layer of difficulty, since it would require the use of debugging tools or custom-built scripts, which means extra time that this person has to spend on your application. It's still easy and free to bypass, but sometimes just this can be enough to dissuade casual, drive-by attempts.
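To illustrate the POST approach, here's a minimal HTML form that sends the same hypothetical google[]/bing[] parameters from the earlier example as POST data instead of a query string:

```html
<!-- The URL stays clean; the keyword data travels in the request body -->
<form action="https://www.enex.net/autorecon/MagicURL.htm" method="post">
  <input type="hidden" name="google[]" value="apples">
  <input type="hidden" name="bing[]" value="oranges">
  <button type="submit">Open searches</button>
</form>
```

On the PHP side, the earlier loop works unchanged if you read from $_POST instead of $_GET.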

Finally, never assume that just because a page isn't intended for the public, it will never be exposed to the public by accident, or attacked by someone internally. Developers sometimes make the mistake of assuming that an employee would never want to harm their employer or mess with internally-facing applications. Small, medium, or large - the size of the business doesn't matter. If some young hacker wannabe gets hired as a sales guy and figures it's safe to practice on your scripts, or wants to play a prank on someone or impress a coworker he's attracted to... there are a thousand reasons someone might do something stupid.

I've also seen internal pages get accidentally exposed by a configuration change that pushes some internal web server to the DMZ, and I've seen people who want to access an internal resource from their home without going on a VPN, so they build a public-facing version of that script that routes parameters to the internal script.

Bottom line: secure your internal scripts and resources just like you would public ones. It's good practice, and you can incrementally allow special permissions for internal users as necessary.
 
xenium (Author) commented:
Thanks a lot for all the great info and advice. As there's quite a lot to it, I'll close off this question and link to any branching follow-ups as and when needed.

Thanks again.
Question has a verified solution.
