FortiOS 5.6 GrokParseFailure
Posted: Sun Aug 06, 2017 10:44 pm
I have been working on this issue for some time and just spent another day searching, researching, trying something, failing, trying something else, and failing again.
I am stopping and asking for assistance.
I have a FortiWiFi 60E running FortiOS 5.6. I found this forum entry https://support.nagios.com/forum/viewto ... 7&start=10 where a customer built a custom patterns file for each log message he wanted to parse. I identified three main FortiOS log messages and spent several hours running each one through the Grok Debugger (https://grokdebug.herokuapp.com/) to ensure each field would parse correctly.
I then added these entries to a pattern file named "fortinet" in the /tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/ directory.
The contents of this file are:
Code: Select all
####################################
###Fortinet Syslog Pattern Types:###
####################################
###Date###
FORTIDATE %{YEAR:year}\-%{MONTHNUM:month}\-%{MONTHDAY:day}
####Traffic####
FORTITRAFFIC devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} srcip=%{IPV4:srcip} srcport=%{HOST:srcport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstip=%{IPV4:dstip} dstport=%{HOST:dstport} dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" sessionid=%{INT:sessionid} proto=%{INT:proto} action=\"%{WORD:action}\" policyid=%{INT:policyid} policytype=\"%{DATA:policytype}\" service=\"%{WORD:service}\" dstcountry=\"%{WORD:dstcountry}\" srccountry=\"%{WORD:srccountry}\" trandisp=\"%{WORD:transdisp}\" app=\"%{WORD:app}\" duration=%{INT:duration} sentbyte=%{INT:sentbyte} rcvdbyte=%{INT:rcvbyte} sentpkt=%{INT:sentpkt} rcvdpkt=%{INT:rcvdpkt} appcat=\"%{WORD:appcat}\" devtype=\"%{DATA:devtype}\" mastersrcmac=\"%{MAC:masterscrmac}\" srcmac=\"%{MAC:srcmac}\" srcserver=%{INT:srcserver}
###Event###
FORTIEVENT devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} logdesc=\"%{DATA:eventdescription}\" sn=\"%{WORD:serialnumber}\" ap=\"%{WORD:ap}\" ip=\"%{IPV4:ap_ip}\" radioid=%{INT:radioid} radioband=\"%{DATA:radioband}\" bandwidth=\"%{WORD:bandwidth}\" configcountry=\"%{DATA:configcountry}\" opercountry=\"%{DATA:opercountry}\" cfgtxpower=%{INT:cfgtxpower} opertxpower=%{INT:opertxpower} action=\"%{DATA:action}\" msg=\"%{DATA:msg}\"
###FORTIUTM###
FORTIUTM devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" eventtype=\"%{WORD:eventtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} msg=\"%{DATA:msg}\" action=\"%{WORD:action}\" service=\"%{WORD:service}\" sessionid=%{INT:sessionid} srcip=%{IPV4:srcip} dstip=%{IPV4:dstip} srcport=%{HOST:srcport} dstport=%{HOST:dstport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" policyid=%{INT:policyid} proto=%{INT:proto} direction=\"%{WORD:direction}\" url=%{DATA:url} profile=\"%{WORD:profile}\" agent=\"%{DATA:agent}\" analyticscksum=\"%{DATA:analyticscksum}\" analyticssubmit=\"%{WORD:analyticssubmit}\"
I know there are many more pattern types in FortiOS (I have the "FortiOS 5.6 Log Reference Guide" PDF), but these are where I wanted to start.
Personally, it is painful to think one would need to build custom patterns for all 30 to 40 message types/subtypes, but I have not been able to figure out an alternative through all the research and testing I have done.
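One alternative I have been wondering about, since FortiOS messages are consistently key=value formatted, is Logstash's kv filter, which splits key=value pairs generically instead of needing one grok pattern per message type/subtype. This is just an untested sketch of what I mean, not something I have running:
Code: Select all
filter {
  if [type] == 'FortiLog' {
    kv {
      source => "message"   # field containing the raw FortiOS log
      value_split => "="    # key=value
      field_split => " "    # pairs separated by spaces
    }
  }
}
I suspect quoted values containing spaces (e.g. msg="...") could confuse a naive field_split, so it may still need tuning.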
My input entry is:
Fortinet Logs Input (I have it coming in only on UDP port 5566)
Code: Select all
syslog {
  type => 'FortiLog'
  tags => ['FortiLog']
  port => 5566
}
My input filter is:
Fortinet Log Filter
Code: Select all
if [type] == 'FortiLog' {
  grok {
    patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
    match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTITRAFFIC}' ]
    add_tag => "FortiOS_Traffic"
  }
}
if [type] == 'FortiLog' {
  grok {
    patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
    match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTIEVENT}' ]
    add_tag => "FortiOS_Event"
  }
}
if [type] == 'FortiLog' {
  grok {
    patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
    match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTIUTM}' ]
    add_tag => "FortiOS_UTM"
  }
}
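For what it is worth, my understanding is that grok accepts multiple patterns for the same field (stopping at the first match, since break_on_match defaults to true) and supports a tag_on_failure option, so the three blocks above could probably be collapsed into one. This is only a sketch of what I have in mind, reusing the same patterns_dir, and I have not tested it on this box:
Code: Select all
if [type] == 'FortiLog' {
  grok {
    patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
    # First matching pattern wins; the rest are skipped
    match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTITRAFFIC}',
               'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTIEVENT}',
               'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTIUTM}' ]
    # Tag events that match none of the patterns so failures are searchable
    tag_on_failure => ["FortiOS_grokparsefailure"]
  }
}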
I would like to use "tag_on_failure" to export or capture what is going on with my filters, but I have not been able to figure out how.
My /var/log/logstash/logstash.log file has only the following error messages. I do not know whether they reference the Fortinet timestamp, the Apache timestamp, or the IIS timestamp, and I do not know how to correlate these messages with the offending filter.
Code: Select all
{:timestamp=>"2017-08-06T22:29:27.895000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:29:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:29:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:29:37.905000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:29:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:29:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:29:37.906000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:29:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:29:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:27.956000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:30:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:30:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:27.957000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:30:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:30:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:37.966000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:30:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:30:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:37.967000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:30:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:30:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:27.993000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:31:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:31:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:27.993000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:31:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:31:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:38.002000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:31:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:31:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:38.003000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:31:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:31:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:28.043000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:32:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:32:28", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:28.044000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:32:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:32:28", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:38.050000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:32:38", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:32:38", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:38.051000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:32:38", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:32:38", :level=>:warn}
{:timestamp=>"2017-08-06T22:33:28.095000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:33:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:33:28", :level=>:warn}
{:timestamp=>"2017-08-06T22:33:28.096000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug 6 22:33:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug 6 22:33:28", :level=>:warn}
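My guess is that "Failed parsing date from field" comes from a date filter whose match pattern does not cover the syslog-style value "Aug 6 22:29:27" (note the single-digit day). Something like the following is what I imagine would accept it, though I have not confirmed which of my filters produces these warnings, so this is purely illustrative:
Code: Select all
date {
  # Joda-Time format strings: "MMM d" covers single-digit days,
  # "MMM dd" covers two-digit days
  match => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
}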
Since fixing the filters, the Elasticsearch log file looks clean:
Code: Select all
[2017-08-06 22:15:41,192][INFO ][node ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] version[1.6.0], pid[1203], build[cdd3ac4/2015-06-09T13:36:34Z]
[2017-08-06 22:15:41,192][INFO ][node ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] initializing ...
[2017-08-06 22:15:41,213][INFO ][plugins ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] loaded [knapsack-1.5.2.0-f340ad1], sites []
[2017-08-06 22:15:41,284][INFO ][env ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] using [1] data paths, mounts [[/ (rootfs)]], net usable_space [22.4gb], net total_space [25.6gb], types [rootfs]
[2017-08-06 22:15:45,659][INFO ][node ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] initialized
[2017-08-06 22:15:45,660][INFO ][node ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] starting ...
[2017-08-06 22:15:45,796][INFO ][transport ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] bound_address {inet[/0:0:0:0:0:0:0:0:9300]}, publish_address {inet[/10.50.50.160:9300]}
[2017-08-06 22:15:45,811][INFO ][discovery ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] 8f049eab-93dd-4609-9daf-d7e05f448a63/rJb_XqjeRjWlFbMr6vFyng
[2017-08-06 22:15:48,917][INFO ][cluster.service ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] new_master [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5][rJb_XqjeRjWlFbMr6vFyng][lola.bloomcounty.tech][inet[/10.50.50.160:9300]]{max_local_storage_nodes=1}, reason: zen-disco-join (elected_as_master)
[2017-08-06 22:15:49,151][INFO ][http ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] bound_address {inet[/127.0.0.1:9200]}, publish_address {inet[localhost/127.0.0.1:9200]}
[2017-08-06 22:15:49,152][INFO ][node ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] started
[2017-08-06 22:15:49,295][INFO ][gateway ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] recovered [5] indices into cluster_state
[2017-08-06 22:21:52,359][INFO ][cluster.metadata ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] [logstash-2017.08.07] update_mapping [eventlog] (dynamic)
This is a lab environment. There is nothing critical or important going on with this instance other than me trying to learn how Nagios Log Server parses log files and how to resolve issues. If I can figure these things out, we may use it for a production environment we are designing.
Please let me know what other information you might need from me. I will gladly provide anything you might desire or require.
Thank you.