Hi,
I'm trying to integrate CloudWatch with Wazuh. The logs are fetched from the log streams in the specified log groups, but they are not being decoded and don't match any rules.
I have looked through a few discussions and couldn't find a proper solution. These logs are not fetched from an S3 bucket but directly from the log streams. I was able to successfully decode logs from a CloudTrail S3 bucket, but not from the log streams in this case. The only configuration parameters I've set for this are the following:
<wodle name="aws-s3">
  <disabled>no</disabled>
  <interval>5m</interval>
  <run_on_start>yes</run_on_start>
  <service type="cloudwatchlogs">
    <aws_profile>default</aws_profile>
    <aws_log_groups>example_log_group</aws_log_groups>
    <regions>us-east-1</regions>
  </service>
</wodle>
The JSON events are decoded by the default JSON decoder, but the resulting fields are named slightly differently (awsRegion instead of data.aws.awsRegion, or sourceIPAddress instead of data.aws.sourceIPAddress, for example), so they don't match the AWS rules.
VPC Flow Logs aren't decoded at all, since they're plain text like the following:
2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK
and are sent to Elasticsearch as:
{
  "agent": {
    "name": "wazuh-manager",
    "id": "000"
  },
  "manager": {
    "name": "wazuh-manager"
  },
  "decoder": {},
  "full_log": "2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK",
  "input": {
    "type": "log"
  },
  "@timestamp": "2021-05-08T10:20:15.927Z",
  "location": "Wazuh-AWS",
  "id": "1223456789.6190035",
  "timestamp": "2021-05-08T10:20:15.927+0000"
}
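For reference, the record in full_log above follows the default (version 2) VPC Flow Logs format, whose fourteen space-separated fields are documented by AWS. A minimal Python sketch of how those fields map to names (the field names come from the AWS Flow Logs documentation; this is just an illustration of what a decoder would need to extract, not Wazuh code):

```python
# Field names of the default (version 2) VPC Flow Logs format,
# per the AWS Flow Logs documentation.
FIELDS = [
    "version", "account-id", "interface-id", "srcaddr", "dstaddr",
    "srcport", "dstport", "protocol", "packets", "bytes",
    "start", "end", "action", "log-status",
]

# The full_log value from the event above.
record = ("2 123456789010 eni-1235b8ca123456789 172.31.16.139 "
          "172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK")

# Split the space-separated record and pair each value with its field name.
parsed = dict(zip(FIELDS, record.split()))
print(parsed["srcaddr"], parsed["action"])  # → 172.31.16.139 ACCEPT
```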
Where did I go wrong in this case? Is there any way to properly decode the logs from the log streams without writing custom decoders?
Thanks in advance!