FortiOS 5.6 GrokParseFailure

This support forum board is for support questions relating to Nagios Log Server, our solution for managing and monitoring critical log data.
oliverwjones
Posts: 5
Joined: Fri Jul 21, 2017 2:02 pm

FortiOS 5.6 GrokParseFailure

Post by oliverwjones »

I have been working on this issue for some time and just spent another day searching, researching, trying something, failing, trying something else, failing, repeat.

I am stopping and asking for assistance.

I have a FortiWiFi 60E running FortiOS 5.6. I found this forum entry https://support.nagios.com/forum/viewto ... 7&start=10 where a customer built a custom patterns file for each log message he wanted to parse. I identified three main FortiOS log messages and spent several hours running each one through the Grok Debugger (https://grokdebug.herokuapp.com/) to ensure each field would parse correctly.

I then added these entries into a pattern file in the /tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/ directory. The file is called "fortinet".

The contents of this file are:

Code: Select all

####################################
###Fortinet Syslog Pattern Types:###
####################################

###Date###
FORTIDATE %{YEAR:year}\-%{MONTHNUM:month}\-%{MONTHDAY:day}

####Traffic####
FORTITRAFFIC devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} srcip=%{IPV4:srcip} srcport=%{HOST:srcport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstip=%{IPV4:dstip} dstport=%{HOST:dstport} dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" sessionid=%{INT:sessionid} proto=%{INT:proto} action=\"%{WORD:action}\" policyid=%{INT:policyid} policytype=\"%{DATA:policytype}\" service=\"%{WORD:service}\" dstcountry=\"%{WORD:dstcountry}\" srccountry=\"%{WORD:srccountry}\" trandisp=\"%{WORD:transdisp}\" app=\"%{WORD:app}\" duration=%{INT:duration} sentbyte=%{INT:sentbyte} rcvdbyte=%{INT:rcvbyte} sentpkt=%{INT:sentpkt} rcvdpkt=%{INT:rcvdpkt} appcat=\"%{WORD:appcat}\" devtype=\"%{DATA:devtype}\" mastersrcmac=\"%{MAC:masterscrmac}\" srcmac=\"%{MAC:srcmac}\" srcserver=%{INT:srcserver}

###Event###
FORTIEVENT devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} logdesc=\"%{DATA:eventdescription}\" sn=\"%{WORD:serialnumber}\" ap=\"%{WORD:ap}\" ip=\"%{IPV4:ap_ip}\" radioid=%{INT:radioid} radioband=\"%{DATA:radioband}\" bandwidth=\"%{WORD:bandwidth}\" configcountry=\"%{DATA:configcountry}\" opercountry=\"%{DATA:opercountry}\" cfgtxpower=%{INT:cfgtxpower} opertxpower=%{INT:opertxpower} action=\"%{DATA:action}\" msg=\"%{DATA:msg}\"


###FORTIUTM###
FORTIUTM devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" eventtype=\"%{WORD:eventtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} msg=\"%{DATA:msg}\" action=\"%{WORD:action}\" service=\"%{WORD:service}\" sessionid=%{INT:sessionid} srcip=%{IPV4:srcip} dstip=%{IPV4:dstip} srcport=%{HOST:srcport} dstport=%{HOST:dstport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" policyid=%{INT:policyid} proto=%{INT:proto} direction=\"%{WORD:direction}\" url=%{DATA:url} profile=\"%{WORD:profile}\" agent=\"%{DATA:agent}\" analyticscksum=\"%{DATA:analyticscksum}\" analyticssubmit=\"%{WORD:analyticssubmit}\"
I know there are many more pattern types in FortiOS (I have the "FortiOS 5.6 Log Reference Guide" PDF). But these are where I wanted to start.

It is painful to think that one would need to build custom patterns for all 30 to 40 message types/subtypes, but despite all my research and testing I have not found a better approach.

My input entry is:
Fortinet Logs Input (I have it coming in only on UDP port 5566)

Code: Select all

syslog {
    type => 'FortiLog'
    tags => 'FortiLog'
    port => 5566
}
My input filter is:
Fortinet Log Filter

Code: Select all

if [type] == 'FortiLog' {
  grok {
     patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
      match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTITRAFFIC}' ]
      add_tag => "FortiOS_Traffic"
  }
}
if [type] == 'FortiLog' {
  grok {
     patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
      match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTIEVENT}' ]
       add_tag => "FortiOS_Event"
  }
}
if [type] == 'FortiLog' {
  grok {
       patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
       match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTIUTM}' ]
       add_tag => "FortiOS_UTM"
  }
}
I would like to add a "tag_on_failure" option to capture what is going wrong with my filters, but I have not been able to get that working either.
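My best guess from the documentation would be something like the following (untested; the "FortiOS_Traffic_failure" tag name is just one I made up), so that each grok block tags its own failures instead of the generic _grokparsefailure:

Code: Select all

if [type] == 'FortiLog' {
  grok {
      patterns_dir => ["/tmp/nagioslogserver/subcomponents/logstash/logstash-1.5.1/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-0.1.10/patterns/."]
      match => [ 'message', '%{SYSLOG5424PRI}%{FORTIDATE} %{FORTITRAFFIC}' ]
      add_tag => "FortiOS_Traffic"
      # overrides the default ["_grokparsefailure"] so I can tell which block failed
      tag_on_failure => [ "FortiOS_Traffic_failure" ]
  }
}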

My /var/log/logstash/logstash.log file only has the following error messages. I do not know whether they reference the Fortinet timestamp, the Apache timestamp, or the IIS timestamp, and I do not know how to correlate these messages with the offending filter.

Code: Select all

{:timestamp=>"2017-08-06T22:29:27.895000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:29:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:29:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:29:37.905000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:29:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:29:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:29:37.906000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:29:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:29:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:27.956000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:30:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:30:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:27.957000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:30:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:30:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:37.966000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:30:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:30:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:30:37.967000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:30:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:30:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:27.993000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:31:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:31:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:27.993000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:31:27", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:31:27", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:38.002000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:31:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:31:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:31:38.003000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:31:37", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:31:37", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:28.043000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:32:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:32:28", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:28.044000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:32:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:32:28", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:38.050000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:32:38", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:32:38", :level=>:warn}
{:timestamp=>"2017-08-06T22:32:38.051000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:32:38", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:32:38", :level=>:warn}
{:timestamp=>"2017-08-06T22:33:28.095000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:33:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:33:28", :level=>:warn}
{:timestamp=>"2017-08-06T22:33:28.096000-0500", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"Aug  6 22:33:28", :exception=>java.lang.IllegalArgumentException: Invalid format: "Aug  6 22:33:28", :level=>:warn}
Since fixing the filters, the Elasticsearch log file looks clean:

Code: Select all

[2017-08-06 22:15:41,192][INFO ][node                     ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] version[1.6.0], pid[1203], build[cdd3ac4/2015-06-09T13:36:34Z]
[2017-08-06 22:15:41,192][INFO ][node                     ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] initializing ...
[2017-08-06 22:15:41,213][INFO ][plugins                  ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] loaded [knapsack-1.5.2.0-f340ad1], sites []
[2017-08-06 22:15:41,284][INFO ][env                      ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] using [1] data paths, mounts [[/ (rootfs)]], net usable_space [22.4gb], net total_space [25.6gb], types [rootfs]
[2017-08-06 22:15:45,659][INFO ][node                     ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] initialized
[2017-08-06 22:15:45,660][INFO ][node                     ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] starting ...
[2017-08-06 22:15:45,796][INFO ][transport                ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] bound_address {inet[/0:0:0:0:0:0:0:0:9300]}, publish_address {inet[/10.50.50.160:9300]}
[2017-08-06 22:15:45,811][INFO ][discovery                ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] 8f049eab-93dd-4609-9daf-d7e05f448a63/rJb_XqjeRjWlFbMr6vFyng
[2017-08-06 22:15:48,917][INFO ][cluster.service          ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] new_master [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5][rJb_XqjeRjWlFbMr6vFyng][lola.bloomcounty.tech][inet[/10.50.50.160:9300]]{max_local_storage_nodes=1}, reason: zen-disco-join (elected_as_master)
[2017-08-06 22:15:49,151][INFO ][http                     ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] bound_address {inet[/127.0.0.1:9200]}, publish_address {inet[localhost/127.0.0.1:9200]}
[2017-08-06 22:15:49,152][INFO ][node                     ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] started
[2017-08-06 22:15:49,295][INFO ][gateway                  ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] recovered [5] indices into cluster_state
[2017-08-06 22:21:52,359][INFO ][cluster.metadata         ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] [logstash-2017.08.07] update_mapping [eventlog] (dynamic)
This is a lab environment. There is nothing critical or important going on with this instance other than me trying to learn how Nagios Log Server parses log files and how to resolve issues. If I can figure these things out, we may use it for a production environment we are designing.

Please let me know what other information you might need from me. I will gladly provide anything you might desire or require.

Thank you.
User avatar
mcapra
Posts: 3739
Joined: Thu May 05, 2016 3:54 pm

Re: FortiOS 5.6 GrokParseFailure

Post by mcapra »

Can you share some samples of the raw log files being shipped from the FortiOS machine to the FortiLog input? Also the rsyslog configuration you are using to ship those logs?

Have you also tried changing the input plugin used from syslog to tcp? Like so:

Code: Select all

tcp {
    type => 'FortiLog'
    tags => 'FortiLog'
    port => 5566
}
Typically, date-related failures can be attributed to the syslog input plugin being finicky. Depending on how your FortiOS system presents its timestamps, a date filter may need to be applied in addition to the grok filter, so that the timestamp is set from the fields created by the FORTIDATE pattern rather than left to the syslog input plugin's own evaluation.
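As a rough sketch (untested, and assuming the FORTIDATE pattern is adjusted to capture the date and time into fields named "fortidate" and "fortitime" — those names are my own invention):

Code: Select all

# combine the device's own date= and time= fields into one string
mutate {
    add_field => { "forti_timestamp" => "%{fortidate} %{fortitime}" }
}
# parse that string and set @timestamp from it
date {
    match => [ "forti_timestamp", "yyyy-MM-dd HH:mm:ss" ]
}

The date filter would then set @timestamp from the device's clock instead of whatever the input plugin guesses.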
Former Nagios employee
https://www.mcapra.com/
dwhitfield
Former Nagios Staff
Posts: 4583
Joined: Wed Sep 21, 2016 10:29 am
Location: NoLo, Minneapolis, MN
Contact:

Re: FortiOS 5.6 GrokParseFailure

Post by dwhitfield »

Thanks @mcapra!

OP, please let us know if you have any additional questions.
oliverwjones
Posts: 5
Joined: Fri Jul 21, 2017 2:02 pm

Re: FortiOS 5.6 GrokParseFailure

Post by oliverwjones »

I updated my FORTIDATE pattern. I am now getting tons of output into the Logstash.log file.

This is my input script:

Code: Select all

tcp {
    type => 'FortiLog'
    tags => 'FortiLog'
    port => 5566
}
udp {
    type => 'FortiLog'
    tags => 'FortiLog'
    port => 5566
}
Moving from "syslog" to "tcp"/"udp" fixed one grok error, but I am still getting _grokparsefailure in my output.


Here are the three messages I built patterns for:

Code: Select all

##Traffic Local Subtype##
<189>date=2017-08-06 time=19:08:42 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid="0001000014" type="traffic" subtype="local" level="notice" vd="root" logtime=1502064365 srcip=10.0.0.165 srcport=59728 srcintf="lan" srcintfrole="lan" dstip=10.0.0.1 dstport=161 dstintf="root" dstintfrole="undefined" sessionid=2367579 proto=17 action="accept" policyid=0 policytype="local-in-policy" service="SNMP" dstcountry="Reserved" srccountry="Reserved" trandisp="noop" app="SNMP" duration=180 sentbyte=0 rcvdbyte=341 sentpkt=0 rcvdpkt=1 appcat="unscanned" devtype="Windows PC" mastersrcmac="00:AA:BB:CC:DD:EE" srcmac="00:AA:BB:CC:DD:EE" srcserver=0

Input Filter
FORTITRAFFIC devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} srcip=%{IPV4:srcip} srcport=%{HOST:srcport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstip=%{IPV4:dstip} dstport=%{HOST:dstport} dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" sessionid=%{INT:sessionid} proto=%{INT:proto} action=\"%{WORD:action}\" policyid=%{INT:policyid} policytype=\"%{DATA:policytype}\" service=\"%{WORD:service}\" dstcountry=\"%{WORD:dstcountry}\" srccountry=\"%{WORD:srccountry}\" trandisp=\"%{WORD:transdisp}\" app=\"%{WORD:app}\" duration=%{INT:duration} sentbyte=%{INT:sentbyte} rcvdbyte=%{INT:rcvbyte} sentpkt=%{INT:sentpkt} rcvdpkt=%{INT:rcvdpkt} appcat=\"%{WORD:appcat}\" devtype=\"%{DATA:devtype}\" mastersrcmac=\"%{MAC:masterscrmac}\" srcmac=\"%{MAC:srcmac}\" srcserver=%{INT:srcserver}

##Event Wireless Subtype##
<189>date=2017-08-06 time=19:08:42 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid="0104043594" type="event" subtype="wireless" level="notice" vd="root" logtime=1502064522 logdesc="Physical AP radio oper TX power" sn="FP221C3X12345678" ap="AP221a" ip="10.0.0.110" radioid=1 radioband="802.11n" bandwidth="20MHz" configcountry="US " opercountry="US " cfgtxpower=13 opertxpower=13 action="oper-txpower" msg="AP AP221a radio 1 oper txpower is changed to 13 dBm."

Input Filter
FORTIEVENT devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} logdesc=\"%{DATA:eventdescription}\" sn=\"%{WORD:serialnumber}\" ap=\"%{WORD:ap}\" ip=\"%{IPV4:ap_ip}\" radioid=%{INT:radioid} radioband=\"%{DATA:radioband}\" bandwidth=\"%{WORD:bandwidth}\" configcountry=\"%{DATA:configcountry}\" opercountry=\"%{DATA:opercountry}\" cfgtxpower=%{INT:cfgtxpower} opertxpower=%{INT:opertxpower} action=\"%{DATA:action}\" msg=\"%{DATA:msg}\"


##UTM/Virus Subtype##
<189>date=2017-08-06 time=19:08:42 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid="0201009233" type="utm" subtype="virus" eventtype="analytics" level="information" vd="root" logtime=1502066231 msg="File submitted to Sandbox." action="analytics" service="HTTP" sessionid=2379438 srcip=10.0.0.165 dstip=172.217.9.132 srcport=4926 dstport=80 srcintf="lan" srcintfrole="lan" dstintf="wan1" dstintfrole="wan" policyid=1 proto=6 direction="incoming" url="http://www.google.com/"; profile="default" agent="Mozilla/5.0" analyticscksum="260d4eb38d9aadb1d8cd3f556447d83c9032b96baa607758accbbb3c9dcdaa6f" analyticssubmit="true"

Input Filter Pattern
FORTIUTM devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" eventtype=\"%{WORD:eventtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} msg=\"%{DATA:msg}\" action=\"%{WORD:action}\" service=\"%{WORD:service}\" sessionid=%{INT:sessionid} srcip=%{IPV4:srcip} dstip=%{IPV4:dstip} srcport=%{HOST:srcport} dstport=%{HOST:dstport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" policyid=%{INT:policyid} proto=%{INT:proto} direction=\"%{WORD:direction}\" url=%{DATA:url} profile=\"%{WORD:profile}\" agent=\"%{DATA:agent}\" analyticscksum=\"%{DATA:analyticscksum}\" analyticssubmit=\"%{WORD:analyticssubmit}\"
Here are the patterns. I worked on the "time" variable today.

Code: Select all

####################################
###Fortinet Syslog Pattern Types:###
####################################

###Date###
FORTIDATE date=%{YEAR}[-]%{MONTHNUM}[-]%{MONTHDAY} time=%{TIME}

####Traffic####
FORTITRAFFIC devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} srcip=%{IPV4:srcip} srcport=%{HOST:srcport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstip=%{IPV4:dstip} dstport=%{HOST:dstport} dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" sessionid=%{INT:sessionid} proto=%{INT:proto} action=\"%{WORD:action}\" policyid=%{INT:policyid} policytype=\"%{DATA:policytype}\" service=\"%{WORD:service}\" dstcountry=\"%{WORD:dstcountry}\" srccountry=\"%{WORD:srccountry}\" trandisp=\"%{WORD:transdisp}\" app=\"%{WORD:app}\" duration=%{INT:duration} sentbyte=%{INT:sentbyte} rcvdbyte=%{INT:rcvbyte} sentpkt=%{INT:sentpkt} rcvdpkt=%{INT:rcvdpkt} appcat=\"%{WORD:appcat}\" devtype=\"%{DATA:devtype}\" mastersrcmac=\"%{MAC:masterscrmac}\" srcmac=\"%{MAC:srcmac}\" srcserver=%{INT:srcserver}

###Event###
FORTIEVENT devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} logdesc=\"%{DATA:eventdescription}\" sn=\"%{WORD:serialnumber}\" ap=\"%{WORD:ap}\" ip=\"%{IPV4:ap_ip}\" radioid=%{INT:radioid} radioband=\"%{DATA:radioband}\" bandwidth=\"%{WORD:bandwidth}\" configcountry=\"%{DATA:configcountry}\" opercountry=\"%{DATA:opercountry}\" cfgtxpower=%{INT:cfgtxpower} opertxpower=%{INT:opertxpower} action=\"%{DATA:action}\" msg=\"%{DATA:msg}\"


###FORTIUTM###
FORTIUTM devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=\"%{INT:logid}\" type=\"%{WORD:type}\" subtype=\"%{WORD:subtype}\" eventtype=\"%{WORD:eventtype}\" level=\"%{WORD:level}\" vd=\"%{WORD:vdom}\" logtime=%{INT:logtime} msg=\"%{DATA:msg}\" action=\"%{WORD:action}\" service=\"%{WORD:service}\" sessionid=%{INT:sessionid} srcip=%{IPV4:srcip} dstip=%{IPV4:dstip} srcport=%{HOST:srcport} dstport=%{HOST:dstport} srcintf=\"%{WORD:srcinf}\" srcintfrole=\"%{WORD:srcintfrole}\" dstintf=\"%{DATA:dstintf}\" dstintfrole=\"%{WORD:dstinfrole}\" policyid=%{INT:policyid} proto=%{INT:proto} direction=\"%{WORD:direction}\" url=%{DATA:url} profile=\"%{WORD:profile}\" agent=\"%{DATA:agent}\" analyticscksum=\"%{DATA:analyticscksum}\" analyticssubmit=\"%{WORD:analyticssubmit}\"
Here are a couple of screenshots of the NLS webGUI:
https://mega.nz/#!PtJA2KKY!mPiFHw9fGE_a ... d18We8kWuc

https://mega.nz/#!255mkCxY!40Jts1I3cmku ... jukfF9D0H4


Here is some output from the Logstash.log file:

Code: Select all

 \"@version\"=>\"1\", \"@timestamp\"=>\"2017-08-07T20:10:42.653Z\", \"type\"=>[\"FortiLog\", \"event\"], \"tags\"=>[\"FortiLog\", \"_grokparsefailure\", \"FortiOS_Event\"], \"host\"=>\"10.50.50.1\", \"syslog5424_pri\"=>\"189\", \"devname\"=>\"FWF60E1A12345678\", \"devid\"=>\"FWF60E1A12345678\", \"logid\"=>\"0104043594\", \"subtype\"=>\"wireless\", \"level\"=>\"notice\", \"vdom\"=>\"root\", \"logtime\"=>\"1502136642\", \"eventdescription\"=>\"Physical AP radio oper TX power\", \"serialnumber\"=>\"FP221C3X12345678\", \"ap\"=>\"AP221U\", \"ap_ip\"=>\"10.50.50.111\", \"radioid\"=>\"1\", \"radioband\"=>\"802.11n\", \"bandwidth\"=>\"20MHz\", \"configcountry\"=>\"US \", \"opercountry\"=>\"US \", \"cfgtxpower\"=>\"8\", \"opertxpower\"=>\"8\", \"action\"=>\"oper-txpower\", \"msg\"=>\"AP AP221U radio 1 oper txpower is changed to 8 dBm.\"}, \"type\"], \"tags\"=>[{\"message\"=>\"<189>date=2017-08-07 time=15:10:42 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid=\\\"0104043594\\\" type=\\\"event\\\" subtype=\\\"wireless\\\" level=\\\"notice\\\" vd=\\\"root\\\" logtime=1502136642 logdesc=\\\"Physical AP radio oper TX power\\\" sn=\\\"FP221C3X12345678\\\" ap=\\\"AP221U\\\" ip=\\\"10.50.50.111\\\" radioid=1 radioband=\\\"802.11n\\\" bandwidth=\\\"20MHz\\\" configcountry=\\\"US \\\" opercountry=\\\"US \\\" cfgtxpower=8 opertxpower=8 action=\\\"oper-txpower\\\" msg=\\\"AP AP221U radio 1 oper txpower is changed to 8 dBm.\\\"\", \"@version\"=>\"1\", \"@timestamp\"=>\"2017-08-07T20:10:42.653Z\", \"type\"=>[\"FortiLog\", \"event\"], \"tags\"=>[\"FortiLog\", \"_grokparsefailure\", \"FortiOS_Event\"], \"host\"=>\"10.50.50.1\", \"syslog5424_pri\"=>\"189\", \"devname\"=>\"FWF60E1A12345678\", \"devid\"=>\"FWF60E1A12345678\", \"logid\"=>\"0104043594\", \"subtype\"=>\"wireless\", \"level\"=>\"notice\", \"vdom\"=>\"root\", \"logtime\"=>\"1502136642\", \"eventdescription\"=>\"Physical AP radio oper TX power\", \"serialnumber\"=>\"FP221C3X12345678\", \"ap\"=>\"AP221U\", 
\"ap_ip\"=>\"10.50.50.111\", \"radioid\"=>\"1\", \"radioband\"=>\"802.11n\", \"bandwidth\"=>\"20MHz\", \"configcountry\"=>\"US \", \"opercountry\"=>\"US \", \"cfgtxpower\"=>\"8\", \"opertxpower\"=>\"8\", \"action\"=>\"oper-txpower\", \"msg\"=>\"AP AP221U radio 1 oper txpower is changed to 8 dBm.\"}, \"tags\"], \"host\"=>[{\"message\"=>\"<189>date=2017-08-07 time=15:10:42 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid=\\\"0104043594\\\" type=\\\"event\\\" subtype=\\\"wireless\\\" level=\\\"notice\\\" vd=\\\"root\\\" logtime=1502136642 logdesc=\\\"Physical AP radio oper TX power\\\" sn=\\\"FP221C3X12345678\\\" ap=\\\"AP221U\\\" ip=\\\"10.50.50.111\\\" radioid=1 radioband=\\\"802.11n\\\" bandwidth=\\\"20MHz\\\" configcountry=\\\"US \\\" opercountry=\\\"US \\\" cfgtxpower=8 opertxpower=8 action=\\\"oper-txpower\\\" msg=\\\"AP AP221U radio 1 oper txpower is changed to 8 dBm.\\\"\", \"@version\"=>\"1\", \"@timestamp\"=>\"2017-08-07T20:10:42.653Z\", \"type\"=>[\"FortiLog\", \"event\"], \"tags\"=>[\"FortiLog\", \"_grokparsefailure\", \"FortiOS_Event\"], \"host\"=>\"10.50.50.1\", \"syslog5424_pri\"=>\"189\", \"devname\"=>\"FWF60E1A12345678\", \"devid\"=>\"FWF60E1A12345678\", \"logid\"=>\"0104043594\", \"subtype\"=>\"wireless\", \"level\"=>\"notice\", \"vdom\"=>\"root\", \"logtime\"=>\"1502136642\", \"eventdescription\"=>\"Physical AP radio oper TX power\", \"serialnumber\"=>\"FP221C3X12345678\", \"ap\"=>\"AP221U\", \"ap_ip\"=>\"10.50.50.111\", \"radioid\"=>\"1\", \"radioband\"=>\"802.11n\", \"bandwidth\"=>\"20MHz\", \"configcountry\"=>\"US \", \"opercountry\"=>\"US \", \"cfgtxpower\"=>\"8\", \"opertxpower\"=>\"8\", \"action\"=>\"oper-txpower\", \"msg\"=>\"AP AP221U radio 1 oper txpower is changed to 8 dBm.\"}, \"host\"], \"[program]\"=>[{\"message\"=>\"<189>date=2017-08-07 time=15:10:42 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid=\\\"0104043594\\\" type=\\\"event\\\" subtype=\\\"wireless\\\" level=\\\"notice\\\" vd=\\\"root\\\" logtime=1502136642 
logdesc=\\\"Physical AP radio oper TX power\\\" sn=\\\"FP221C3X12345678\\\" ap=\\\"AP221U\\\" ip=\\\"10.50.50.111\\\" radioid=1 radioband=\\\"802.11n\\\" bandwidth=\\\"20MHz\\\" configcountry=\\\"US \\\" opercountry=\\\"US \\\" cfgtxpower=8 opertxpower=8 action=\\\"oper-txpower\\\" msg=\\\"AP AP221U radio 1 oper txpower is changed to 8 dBm.\\\"\", \"@version\"=>\"1\", 
Here is output from the Elasticsearch log file:

Code: Select all

[2017-08-07 15:15:09,279][DEBUG][action.bulk              ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] [logstash-2017.08.07][3] failed to execute bulk item (index) index {[logstash-2017.08.07][FortiLog,utm][AV2-Vj3cLQ6B3RquaXHj], source[{"message":"<190>date=2017-08-07 time=15:15:08 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid=\"0201009233\" type=\"utm\" subtype=\"virus\" eventtype=\"analytics\" level=\"information\" vd=\"root\" logtime=1502136908 msg=\"File submitted to Sandbox.\" action=\"analytics\" service=\"HTTP\" sessionid=2859933 srcip=10.50.50.165 dstip=172.217.1.228 srcport=43464 dstport=80 srcintf=\"lan\" srcintfrole=\"lan\" dstintf=\"wan1\" dstintfrole=\"wan\" policyid=1 proto=6 direction=\"incoming\" url=\"http://www.google.com/\" profile=\"default\" agent=\"Mozilla/5.0\" analyticscksum=\"622681d954e52ac387405274f3dfaf699b58cd07a9f2434eed2384e007ef76e1\" analyticssubmit=\"true\"","@version":"1","@timestamp":"2017-08-07T20:15:09.062Z","type":["FortiLog","utm"],"tags":["FortiLog","_grokparsefailure","FortiOS_UTM"],"host":"10.50.50.1","syslog5424_pri":"190","devname":"FWF60E1A12345678","devid":"FWF60E1A12345678","logid":"0201009233","subtype":"virus","eventtype":"analytics","level":"information","vdom":"root","logtime":"1502136908","msg":"File submitted to Sandbox.","action":"analytics","service":"HTTP","sessionid":"2859933","srcip":"10.50.50.165","dstip":"172.217.1.228","srcport":"43464","dstport":"80","srcinf":"lan","srcintfrole":"lan","dstintf":"wan1","dstinfrole":"wan","policyid":"1","proto":"6","direction":"incoming","url":"\"http://www.google.com/\"","profile":"default","agent":"Mozilla/5.0","analyticscksum":"622681d954e52ac387405274f3dfaf699b58cd07a9f2434eed2384e007ef76e1","analyticssubmit":"true"}]}
org.elasticsearch.indices.InvalidTypeNameException: mapping type name [FortiLog,utm] should not include ',' in it
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:325)
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:307)
        at org.elasticsearch.index.mapper.MapperService.documentMapperWithAutoCreate(MapperService.java:482)
        at org.elasticsearch.index.shard.IndexShard.prepareCreate(IndexShard.java:464)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:418)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:148)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase.performOnPrimary(TransportShardReplicationOperationAction.java:574)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase$1.doRun(TransportShardReplicationOperationAction.java:440)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:748)
[root@lola elasticsearch]# tail -n 40 8f049eab-93dd-4609-9daf-d7e05f448a63.log
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:325)
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:307)
        at org.elasticsearch.index.mapper.MapperService.documentMapperWithAutoCreate(MapperService.java:482)
        at org.elasticsearch.index.shard.IndexShard.prepareCreate(IndexShard.java:464)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:418)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:148)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase.performOnPrimary(TransportShardReplicationOperationAction.java:574)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase$1.doRun(TransportShardReplicationOperationAction.java:440)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:748)
[2017-08-07 15:15:07,241][DEBUG][action.bulk              ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] [logstash-2017.08.07][1] failed to execute bulk item (index) index {[logstash-2017.08.07][FortiLog,traffic][AV2-VjXlLQ6B3RquaXHV], source[{"message":"<189>date=2017-08-07 time=15:15:06 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid=\"0001000014\" type=\"traffic\" subtype=\"local\" level=\"notice\" vd=\"root\" logtime=1502136906 srcip=10.50.50.165 srcport=50562 srcintf=\"lan\" srcintfrole=\"lan\" dstip=10.50.50.1 dstport=161 dstintf=\"root\" dstintfrole=\"undefined\" sessionid=2857578 proto=17 action=\"accept\" policyid=0 policytype=\"local-in-policy\" service=\"SNMP\" dstcountry=\"Reserved\" srccountry=\"Reserved\" trandisp=\"noop\" app=\"SNMP\" duration=334 sentbyte=3003 rcvdbyte=3525 sentpkt=11 rcvdpkt=12 appcat=\"unscanned\" devtype=\"Windows PC\" mastersrcmac=\"00:03:e1:91:03:63\" srcmac=\"00:03:e1:91:03:63\" srcserver=0","@version":"1","@timestamp":"2017-08-07T20:15:06.758Z","type":["FortiLog","traffic"],"tags":["FortiLog","FortiOS_Traffic"],"host":"10.50.50.1","syslog5424_pri":"189","devname":"FWF60E1A12345678","devid":"FWF60E1A12345678","logid":"0001000014","subtype":"local","level":"notice","vdom":"root","logtime":"1502136906","srcip":"10.50.50.165","srcport":"50562","srcinf":"lan","srcintfrole":"lan","dstip":"10.50.50.1","dstport":"161","dstintf":"root","dstinfrole":"undefined","sessionid":"2857578","proto":"17","action":"accept","policyid":"0","policytype":"local-in-policy","service":"SNMP","dstcountry":"Reserved","srccountry":"Reserved","transdisp":"noop","app":"SNMP","duration":"334","sentbyte":"3003","rcvbyte":"3525","sentpkt":"11","rcvdpkt":"12","appcat":"unscanned","devtype":"Windows PC","masterscrmac":"00:03:e1:91:03:63","srcmac":"00:03:e1:91:03:63","srcserver":"0"}]}
org.elasticsearch.indices.InvalidTypeNameException: mapping type name [FortiLog,traffic] should not include ',' in it
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:325)
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:307)
        at org.elasticsearch.index.mapper.MapperService.documentMapperWithAutoCreate(MapperService.java:482)
        at org.elasticsearch.index.shard.IndexShard.prepareCreate(IndexShard.java:464)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:418)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:148)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase.performOnPrimary(TransportShardReplicationOperationAction.java:574)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase$1.doRun(TransportShardReplicationOperationAction.java:440)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:748)
[2017-08-07 15:15:09,279][DEBUG][action.bulk              ] [d7bb647d-ef7c-4ef0-905a-da117ba8b5e5] [logstash-2017.08.07][3] failed to execute bulk item (index) index {[logstash-2017.08.07][FortiLog,utm][AV2-Vj3cLQ6B3RquaXHj], source[{"message":"<190>date=2017-08-07 time=15:15:08 devname=FWF60E1A12345678 devid=FWF60E1A12345678 logid=\"0201009233\" type=\"utm\" subtype=\"virus\" eventtype=\"analytics\" level=\"information\" vd=\"root\" logtime=1502136908 msg=\"File submitted to Sandbox.\" action=\"analytics\" service=\"HTTP\" sessionid=2859933 srcip=10.50.50.165 dstip=172.217.1.228 srcport=43464 dstport=80 srcintf=\"lan\" srcintfrole=\"lan\" dstintf=\"wan1\" dstintfrole=\"wan\" policyid=1 proto=6 direction=\"incoming\" url=\"http://www.google.com/\" profile=\"default\" agent=\"Mozilla/5.0\" analyticscksum=\"622681d954e52ac387405274f3dfaf699b58cd07a9f2434eed2384e007ef76e1\" analyticssubmit=\"true\"","@version":"1","@timestamp":"2017-08-07T20:15:09.062Z","type":["FortiLog","utm"],"tags":["FortiLog","_grokparsefailure","FortiOS_UTM"],"host":"10.50.50.1","syslog5424_pri":"190","devname":"FWF60E1A12345678","devid":"FWF60E1A12345678","logid":"0201009233","subtype":"virus","eventtype":"analytics","level":"information","vdom":"root","logtime":"1502136908","msg":"File submitted to Sandbox.","action":"analytics","service":"HTTP","sessionid":"2859933","srcip":"10.50.50.165","dstip":"172.217.1.228","srcport":"43464","dstport":"80","srcinf":"lan","srcintfrole":"lan","dstintf":"wan1","dstinfrole":"wan","policyid":"1","proto":"6","direction":"incoming","url":"\"http://www.google.com/\"","profile":"default","agent":"Mozilla/5.0","analyticscksum":"622681d954e52ac387405274f3dfaf699b58cd07a9f2434eed2384e007ef76e1","analyticssubmit":"true"}]}
org.elasticsearch.indices.InvalidTypeNameException: mapping type name [FortiLog,utm] should not include ',' in it
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:325)
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:307)
        at org.elasticsearch.index.mapper.MapperService.documentMapperWithAutoCreate(MapperService.java:482)
        at org.elasticsearch.index.shard.IndexShard.prepareCreate(IndexShard.java:464)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:418)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:148)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase.performOnPrimary(TransportShardReplicationOperationAction.java:574)
        at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$PrimaryPhase$1.doRun(TransportShardReplicationOperationAction.java:440)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:36)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:748)
I agree the date/time handling is probably part of the problem, but I cannot determine exactly where it breaks down or what to do to resolve it.

Please let me know what other information you might need.

Thank you.

Rodney.
mcapra
Posts: 3739
Joined: Thu May 05, 2016 3:54 pm

Re: FortiOS 5.6 GrokParseFailure

Post by mcapra »

I think the Elasticsearch error is showing up because you are overwriting the type field in your grok patterns. The behavior of type can be a bit weird on Elasticsearch 1.x. Try altering your grok pattern to change this:

Code: Select all

type=\"%{WORD:type}\"
To something like this:

Code: Select all

type=\"%{WORD:event_type}\"
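Why the rename matters can be sketched in miniature: because the event's type was already "FortiLog", grok's %{WORD:type} adds a second value and the field becomes an array, which Elasticsearch 1.x then rejects as a mapping type name. A rough Python illustration (the add_field helper is hypothetical, only mimicking Logstash's duplicate-field behavior):

```python
# The event arrives with type "FortiLog" (set by the input/filter tags);
# the grok pattern's %{WORD:type} then writes "traffic" into the same field.
event = {"type": "FortiLog"}

# Hypothetical helper mimicking Logstash's behavior when a filter writes
# to a field that already exists: old and new values become an array.
def add_field(event, key, value):
    if key in event:
        old = event[key]
        event[key] = (old if isinstance(old, list) else [old]) + [value]
    else:
        event[key] = value

add_field(event, "type", "traffic")
assert event["type"] == ["FortiLog", "traffic"]

# Rendered as an Elasticsearch 1.x mapping type name, the array shows up
# as "FortiLog,traffic" -- a name with ',' in it, hence the exception.
mapping_type = ",".join(event["type"])
assert "," in mapping_type
```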
As far as the grokparsefailure goes, I'd debug it by matching %{GREEDYDATA} at the beginning or end of the pattern, gradually eliminating fields until the _grokparsefailure tag goes away. Then you can at least isolate the problem to a specific match/field.
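That elimination approach can be rehearsed outside Logstash with plain regular expressions; a rough Python stand-in (the sample line and pattern are invented for illustration):

```python
import re

# A deliberately malformed sample line: logid doesn't fit the pattern.
line = 'devname=FWF60E devid=FWF60E logid=abc type="traffic"'

full = r'devname=(\S+) devid=(\S+) logid=(\d+) type="(\w+)"'
# The full pattern fails...
assert re.match(full, line) is None

# ...so swap the tail for a greedy catch-all and walk it back field by field.
partial = r'devname=(\S+) devid=(\S+) (?P<rest>.*)'
m = re.match(partial, line)
assert m is not None          # prefix is fine; the failure is in the tail
assert m.group("rest").startswith("logid=abc")
```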

One question: Is there a particular reason you're using grok where, as far as I can tell, a kv filter would work more efficiently? You'd still have to leverage something like the prefix directive to work around the type issue, but I think it'd be much more elegant.

In my mind, you could strip out the <pri> as a discrete step using mutate, then apply a kv filter to the rest of the message. I can lab it up later tonight if a solution isn't found by then.
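As a rough stand-in for what a kv filter would do with one of these messages (Python, not the filter itself; shlex is used to honor the quoted multi-word values):

```python
import shlex

# One of the sample traffic messages, shortened (pri already removed).
line = ('date=2017-08-07 time=15:15:06 devname=FWF60E1A12345678 '
        'logid="0001000014" type="traffic" subtype="local" '
        'srcip=10.50.50.165 devtype="Windows PC"')

# shlex honors the double quotes, so devtype="Windows PC" stays one token.
fields = dict(tok.split("=", 1) for tok in shlex.split(line))

assert fields["devtype"] == "Windows PC"
assert fields["logid"] == "0001000014"
```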
Former Nagios employee
https://www.mcapra.com/
dwhitfield
Former Nagios Staff
Posts: 4583
Joined: Wed Sep 21, 2016 10:29 am
Location: NoLo, Minneapolis, MN
Contact:

Re: FortiOS 5.6 GrokParseFailure

Post by dwhitfield »

Thanks again @mcapra!

OP, I am not sure when these are going to open back up, but once they do, I would highly encourage a quickstart: https://www.nagios.com/services/quickstart/nagios-xi/

I cannot say for certain, but it may be worth emailing sales@nagios.com and giving them some info about the size of the cluster you are thinking about setting up. Also, if you are a current customer, you should be using the customer forums: https://support.nagios.com/forum/viewforum.php?f=38
oliverwjones
Posts: 5
Joined: Fri Jul 21, 2017 2:02 pm

Re: FortiOS 5.6 GrokParseFailure

Post by oliverwjones »

dwhitfield and mcapra,

Thank you very much for the assistance. I apologize for not responding sooner.

I tried mcapra's suggestion regarding the event_type field, but it didn't work. For some reason, some fields parsed with two duplicate values, which caused Logstash errors:

Code: Select all

\"syslog_pri\"=>\"<189>\", \"syslog5424_pri\"=>\"189\", \"<189>date\"=>\"2017-08-11\", \"time\"=>[\"01:00:36\", \"01:00:36\"], \"devname\"=>[\"FWF60E4Q16017843\", \"FWF60E4Q16017843\"], \"devid\"=>[\"FWF60E4Q16017843\", \"FWF60E4Q16017843\"], \"logid\"=>[\"0001000014\", \"0001000014\"], \"subtype\"=>[\"local\", \"local\"], \"level\"=>[\"notice\", \"notice\"]

 \"syslog_pri\"=>\"<189>\", \"syslog5424_pri\"=>\"189\", \"<189>date\"=>\"2017-08-11\", \"time\"=>[\"01:00:30\", \"01:00:30\"], \"devname\"=>[\"FWF60E4Q16017843\", \"FWF60E4Q16017843\"], \"devid\"=>[\"FWF60E4Q16017843\", \"FWF60E4Q16017843\"], \"logid\"=>[\"0001000014\", \"0001000014\"], \"subtype\"=>[\"local\", \"local\"], \"level\"=>[\"notice\", \"notice\"], \"vd\"=>[\"root\", \"root\"], 

 \"syslog_pri\"=>\"<190>\", \"syslog5424_pri\"=>\"190\", \"<190>date\"=>\"2017-08-11\", \"time\"=>[\"01:00:28\", \"01:00:28\"], \"devname\"=>[\"FWF60E4Q16017843\", \"FWF60E4Q16017843\"], \"devid\"=>[\"FWF60E4Q16017843\", \"FWF60E4Q16017843\"], \"logid\"=>[\"0201009233\", \"0201009233\"], \"subtype\"=>[\"virus\", \"virus\"], \"eventtype\"=>[\"analytics\", \"analytics\"], \"level\"=>[\"information\", \"information\"], \
Since fields were showing up with duplicate values, I took mcapra's advice and switched to kv for the majority of the parsing.

(You asked why I am using GROK versus KV. It is because I am new to this task and most of the examples I could find use GROK. Very few examples use KV.)


Here is my current FortiOS input filter:

Code: Select all

if [type] == 'FortiLog' {
  grok {
    match => [ 'message' , '\<%{DATA:pri_date_garbage} time=%{TIME:gdtime} devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=%{INT:logid} type=%{WORD:maintype} subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" %{GREEDYDATA:message}' ]
    tag_on_failure => [ 'failure_grok_fortiOS' ]
  }
  kv {
    source => 'message'
  }
  geoip {
    source => 'srcip'
  }
}
While using grok to parse the first nine values, I still have an issue with the syslog PRI and date values. If you look at the values above, you will see the PRI is correctly picked up twice by two internal fields (syslog5424_pri and syslog_pri). It is also getting picked up along with the nonstandard date field, creating a problem key: "<189>date"=>"2017-08-11". I have spent hours trying to find a way to ignore these values, delete them, create a tag, or in general fix the issue. But, as I pointed out above, this is my first time getting this deep into NLS, Logstash, and Elasticsearch.

Three things I know:
  1. While syslog5424_pri and syslog_pri are picking up the PRI value, neither value shows up in the NLS FortiOS syslog output. I don't know if I should call one of these values out via grok or mutate and, if so, how.
  2. Thus far I have failed miserably at resolving this issue.
  3. The Logstash and Elasticsearch log files do not provide any output about this issue.

The Logstash logfile does have some recurring Java errors, but I have not found a reason for them much less a solution for them.

Code: Select all

{:timestamp=>"2017-08-11T16:59:17.005000-0500", :message=>"An error occurred. Closing connection", :client=>"0:0:0:0:0:0:0:1:53796", :exception=>#<LogStash::ShutdownSignal: LogStash::ShutdownSignal>, :backtrace=>["org/jruby/RubyIO.java:2996:in `sysread'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.5/lib/logstash/inputs/tcp.rb:164:in `read'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.5/lib/logstash/inputs/tcp.rb:112:in `handle_socket'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.5/lib/logstash/inputs/tcp.rb:147:in `client_thread'"], :level=>:error}

{:timestamp=>"2017-08-11T16:59:17.006000-0500", :message=>"An error occurred. Closing connection", :client=>"10.50.50.155:58310", :exception=>#<LogStash::ShutdownSignal: LogStash::ShutdownSignal>, :backtrace=>["org/jruby/RubyIO.java:2996:in `sysread'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.5/lib/logstash/inputs/tcp.rb:164:in `read'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.5/lib/logstash/inputs/tcp.rb:112:in `handle_socket'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-tcp-0.1.5/lib/logstash/inputs/tcp.rb:147:in `client_thread'"], :level=>:error}
Here are screenshots of two of the log messages as they are displayed in the NLS web interface:
https://mega.nz/#!31hwwRJZ!3606B3r48MhQ ... 7YZDe0sWD4

https://mega.nz/#!yxBVhIJS!6SNRdrNJl3AR ... qiasyT6kRk

At this point I would be fine with simply removing the PRI, the nonstandard date, and the time. The PRI is not important, and the @timestamp field is good enough for what I am doing.

If you have any questions or requests for information please let me know.

Thank you.

Rodney.

FYI, here is a good site I have referenced while troubleshooting:
https://www.ddreier.com/logstash-config ... issection/ Daniel Dreier does a great job of explaining commands and results.
scottwilkerson
DevOps Engineer
Posts: 19396
Joined: Tue Nov 15, 2011 3:11 pm
Location: Nagios Enterprises
Contact:

Re: FortiOS 5.6 GrokParseFailure

Post by scottwilkerson »

oliverwjones wrote:Here is my current FortiOS input filter:

Code: Select all

    if [type] == 'FortiLog' {
      grok {
        match => [ 'message' , '\<%{DATA:pri_date_garbage} time=%{TIME:gdtime} devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=%{INT:logid} type=%{WORD:maintype} subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" %{GREEDYDATA:message}' ]
        tag_on_failure => [ 'failure_grok_fortiOS' ]
      }
      kv {
        source => 'message'
      }
      geoip {
        source => 'srcip'
      }
    }
I think you need the closing > of the PRI in your filter, right?

Code: Select all

    if [type] == 'FortiLog' {
      grok {
        match => [ 'message' , '\<%{DATA:pri_date_garbage}\> time=%{TIME:gdtime} devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=%{INT:logid} type=%{WORD:maintype} subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" %{GREEDYDATA:message}' ]
        tag_on_failure => [ 'failure_grok_fortiOS' ]
      }
      kv {
        source => 'message'
      }
      geoip {
        source => 'srcip'
      }
    }
If you could send a current log line I'd be willing to try testing it for you on
https://grokdebug.herokuapp.com/
Former Nagios employee
Creator:
ahumandesign.com
enneagrams.com
mcapra
Posts: 3739
Joined: Thu May 05, 2016 3:54 pm

Re: FortiOS 5.6 GrokParseFailure

Post by mcapra »

Some thoughts:
oliverwjones wrote: While syslog5424_pri and syslog_pri are picking up the PRI value neither value shows up in the NLS FortiOS syslog output. I don't know if I should call one of these values out via GROK or MUTATE and, if so, how.
It's not enough to simply grok match the pri. You need to strip this value from the message before you pass it off to the kv filter, otherwise you might see odd behavior. Here's how you might do that with a mutate step:

Code: Select all

mutate {
    gsub => [
        "message","^<[0-9]*>",""
    ]
}
The gsub elements from left to right represent:
  • The field you want to substitute on
  • The regular expression you wish to match
  • The value you replace regular expression matches with (in the above case, replace it with an empty string)
If the pri is always numeric digits, the above should suffice. If you want to capture the pri, you can do that as a grok step ahead of your mutate step. Afterwards, the message field should be all ready to go for a kv step.
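The same substitution can be checked outside Logstash; a quick Python sketch of the regex at work:

```python
import re

message = '<189>date=2017-08-07 time=15:15:06 devname=FWF60E1A12345678'

# The gsub equivalent: delete a leading <digits> prefix.
stripped = re.sub(r'^<[0-9]*>', '', message)
assert stripped.startswith('date=2017-08-07')

# Capturing the pri first (the optional grok step mentioned above):
m = re.match(r'^<(?P<pri>[0-9]+)>', message)
assert m is not None and m.group('pri') == '189'
```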

If I were to refactor your current filter rule, I would place a mutate step after the grok and before the kv. I would also alter your grok to only parse out the pri and let the other fields be handled by the kv step.

In pseudocode:

Code: Select all

if [type] == 'FortiLog' {
    grok {
        // extract the pri, only the pri
    }
    mutate {
        // remove the pri from the "message" field
    }
    kv {
        // parse what's left of the message as key-value pairs
    }
}
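The three pseudocode steps can be mimicked end to end in plain Python (a sketch only; shlex stands in for the kv filter, and the field names come from the sample logs above):

```python
import re
import shlex

raw = ('<189>date=2017-08-07 time=15:15:06 devname=FWF60E1A12345678 '
       'type="traffic" subtype="local" srcip=10.50.50.165 '
       'devtype="Windows PC"')

# grok step: capture only the pri
pri_match = re.match(r'^<(?P<pri>[0-9]+)>', raw)
pri = pri_match.group('pri') if pri_match else None

# mutate step: remove the pri from the message
message = re.sub(r'^<[0-9]*>', '', raw)

# kv step: parse what's left as key=value pairs
event = dict(tok.split('=', 1) for tok in shlex.split(message))
event['syslog_pri'] = pri

assert event['syslog_pri'] == '189'
assert event['devtype'] == 'Windows PC'
```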
I'll lab this up tonight if something definitive can't be found by then.
Former Nagios employee
https://www.mcapra.com/
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: FortiOS 5.6 GrokParseFailure

Post by cdienger »

Per mcapra's suggestion I tested the config:

Code: Select all

if [type] == 'FortiLog' {
    grok {
        match => [ 'message' , 'pri=%{DATA:pri_date_garbage} time=%{TIME:gdtime} devname=%{HOSTNAME:devname} devid=%{HOSTNAME:devid} logid=%{INT:logid} type=%{WORD:maintype} subtype=\"%{WORD:subtype}\" level=\"%{WORD:level}\" %{GREEDYDATA:message}' ]
        tag_on_failure => [ 'failure_grok_fortiOS' ]
    }
    mutate {
        gsub => [ 'message','^<[0-9]*>','' ]
    }
    kv {
        source => 'message'
    }
    geoip {
        source => 'srcip'
    }
}
and the pri information is indeed dropped. Let us know once you've had a chance to test as well.