Using regex to match specific URL

veni...@gmail.com

May 4, 2018, 3:34:53 PM
to ossec-list
Hello!

In web_rules.xml, there is a rule to ignore 4XX errors on pictures / CSS / JS files, to limit the number of 4XX false positives. The rule is this one:

<rule id="31102" level="0">
    <if_sid>31101</if_sid>
    <url>.jpg$|.gif$|favicon.ico$|.png$|robots.txt$|.css$|.js$|.jpeg$</url>
    <compiled_rule>is_simple_http_request</compiled_rule>
    <description>Ignored extensions on 400 error codes.</description>
  </rule>


The issue here is that it considers the file extension to be the last element of the URL. But I have websites on my server that add a version number after the URL, and a "/" at the end for 404 errors... so I get many false positives...

I would like to modify this rule to be more "flexible" (using the overwrite system). I am trying the version number case first.

Example:
XXX.XXX.XXX.XXX - - [04/May/2018:14:14:18 +0200] "GET /files/pictures/brands/logo/40/40-mini.cc3b.jpg?78 HTTP/1.1" 401 381

This one is not matched by rule 31102 because of the "?78". The url tag only supports OS_Match/sregex syntax, so I cannot change the rule by adding, for example, ".jpg?(\d)*". I thought I would use "regex" instead, but it does not work either:

<group name="web,accesslog" >
<rule id="31102" level="0" overwrite="yes">
    <if_sid>31101</if_sid>
    <regex>.jpg?(\d)*</regex>
    <compiled_rule>is_simple_http_request</compiled_rule>
    <description>Ignored extensions on 400 error codes.</description>
  </rule>
</group>
 
Of course, once it works I will re-add the other file extensions. But for the moment it does not, and I do not understand why :( What did I miss?


Thx in advance!


alberto....@wazuh.com

May 5, 2018, 6:03:07 AM
to ossec-list
Hello

  Did you try using the regex like this?

<group name="web,accesslog" >
<rule id="31102" level="0" overwrite="yes">
    <if_sid>31101</if_sid>
    <regex>.jpg?\d+</regex>
    <compiled_rule>is_simple_http_request</compiled_rule>
    <description>Ignored extensions on 400 error codes.</description>
  </rule>
</group>

Documentation:

Hope it helps.
Best regards, 

Alberto R

veni...@gmail.com

May 5, 2018, 6:20:43 AM
to ossec-list
Yes, and it did not match.

What I do not understand is that the ossec-regex tool shows it matched, if I am not wrong:

# /var/ossec/bin/ossec-regex '.jpg?\d+'
XXX.XXX.XXX.XXX - - [04/May/2018:14:14:18 +0200] "GET /files/pictures/brands/logo/40/40-mini.cc3b.jpg?78 HTTP/1.1" 401 381
+OSRegex_Execute: XXX.XXX.XXX.XXX - - [04/May/2018:14:14:18 +0200] "GET /files/pictures/brands/logo/40/40-mini.cc3b.jpg?78 HTTP/1.1" 401 381
+OS_Regex       : XXX.XXX.XXX.XXX - - [04/May/2018:14:14:18 +0200] "GET /files/pictures/brands/logo/40/40-mini.cc3b.jpg?78 HTTP/1.1" 401 381

 Thx!

alberto....@wazuh.com

May 5, 2018, 7:07:27 AM
to ossec-list
Hello

  Using the following rule: 

<group name="web,accesslog" >
<rule id="31102" level="0" overwrite="yes">
    <if_sid>31101</if_sid>
    <regex>.jpg?\d+</regex>
    <!--<compiled_rule>is_simple_http_request</compiled_rule>-->
    <description>Ignored extensions on 400 error codes.</description>
  </rule>
</group>


it works for me, so I think you need to review the compiled rule if you still want to use it.
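
You can also double-check which rule fires by pasting the log line into ossec-logtest (the standard OSSEC test tool); with the overwrite above in place, the rule id reported in the last phase of the output should be 31102:

# /var/ossec/bin/ossec-logtest
XXX.XXX.XXX.XXX - - [04/May/2018:14:14:18 +0200] "GET /files/pictures/brands/logo/40/40-mini.cc3b.jpg?78 HTTP/1.1" 401 381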

Hope it helps.
Best regards, 
Alberto R. 

veni...@gmail.com

May 7, 2018, 4:55:41 AM
to ossec-list
Indeed, the problem is the compiled rule: since my URL contains a "?" (because of the appended version number), is_simple_http_request returns NULL, so the rule never matches even though the regex does:

/* Example 4: Checking if a HTTP request is a simple GET/POST without a query
 * This avoid that we call the attack rules for no reason. */
void *is_simple_http_request(Eventinfo *lf)
{
    if (!lf->url) {
        return (NULL);
    }

    /* Simple GET / request */
    if (strcmp(lf->url, "/") == 0) {
        return (lf);
    }

    /* Simple request, no query */
    if (!strchr(lf->url, '?')) {
        return (lf);
    }

    /* In here, we have an additional query to be checked */
    return (NULL);
}

I will remove it. Thx!
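
For reference, this is the kind of final overwrite I have in mind (an untested sketch for now): one alternative per extension for the "?version" form, since as far as I can tell the OS_Regex syntax only has the "+" and "*" modifiers and no optional quantifier, and the exact extension list will depend on the site.

<group name="web,accesslog" >
<rule id="31102" level="0" overwrite="yes">
    <if_sid>31101</if_sid>
    <regex>.jpg?\d+|.jpeg?\d+|.gif?\d+|.png?\d+|.css?\d+|.js?\d+</regex>
    <description>Ignored extensions on 400 error codes (versioned static files).</description>
  </rule>
</group>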

