Is there an import filter to ingest CSV files?

This support forum board is for support questions relating to Nagios Log Server, our solution for managing and monitoring critical log data.
Locked
Linuxlogger
Posts: 32
Joined: Thu Jun 23, 2016 4:33 pm

Is there an import filter to ingest CSV files?

Post by Linuxlogger »

I was wondering if there is an import filter that I can use as an example for importing CSV files into NLS.
Linuxlogger
Posts: 32
Joined: Thu Jun 23, 2016 4:33 pm

Re: Is there an import filter to ingest CSV files?

Post by Linuxlogger »

OK, so I found this link:
https://www.elastic.co/guide/en/logstas ... s-csv.html
I went into NLS and created the filter:

CSV import filter

filter {
  csv {
    convert => {
      "TIMESTAMP" => "date_time"
      "ROWSMODIFIED" => "integer"
      "ROWSRETURNED" => "integer"
    }
  }
}

My question is: now that I have the filter created, how do I implement it for the forwarded CSV files? How do I say "if TAG == 'xx', use the CSV import filter"?
mcapra
Posts: 3739
Joined: Thu May 05, 2016 3:54 pm

Re: Is there an import filter to ingest CSV files?

Post by mcapra »

The filters usually contain some sort of conditional logic when addressing log types. For example, say I have the following input type for all my Linux audit logs:

Code: Select all

tcp {
  port => 5545
  type => "audit_log"
}
udp {
  port => 5545
  type => "audit_log"
}
In that Input rule I set the type to audit_log for all traffic on port 5545, so logs sent over that port arrive with their type set to audit_log. The corresponding filter that breaks that audit log into the fields I want to analyze looks like this:

Code: Select all

if [type] == "audit_log" {
  # my filtering goes here
}
If the file exists locally in Linux, either as a path or in a mount point, you can use the file input to define an Input rule for that file:

Code: Select all

input {
  file {
    path => "/home/user/myfile.csv"
    start_position => "beginning"
    type => "my_csv_type_tag_or_something"
  }
}
At which point you could define a filter to work with that type:

Code: Select all

if [type] == "my_csv_type_tag_or_something" {
  # do csv things!
}
Former Nagios employee
https://www.mcapra.com/
Linuxlogger
Posts: 32
Joined: Thu Jun 23, 2016 4:33 pm

Re: Is there an import filter to ingest CSV files?

Post by Linuxlogger »

Can I use the tag to identify the stream? If so, how would I do that?
mcapra
Posts: 3739
Joined: Thu May 05, 2016 3:54 pm

Re: Is there an import filter to ingest CSV files?

Post by mcapra »

Assuming there is a common value in the tags, you should be able to do that:

Code: Select all

if "value" in [tags] {
  # do value things
}
Former Nagios employee
https://www.mcapra.com/
Linuxlogger
Posts: 32
Joined: Thu Jun 23, 2016 4:33 pm

Re: Is there an import filter to ingest CSV files?

Post by Linuxlogger »

So...

This should work then?

# DB2 import filter

filter {
  if "DB2" in [tags] {
    csv {
      convert => {
        "TIMESTAMP" => "date_time"
        "ROWSMODIFIED" => "integer"
        "ROWSRETURNED" => "integer"
      }
    }
  }
}
mcapra
Posts: 3739
Joined: Thu May 05, 2016 3:54 pm

Re: Is there an import filter to ingest CSV files?

Post by mcapra »

Not quite. For events in CSV format, it's helpful to define the columns before doing anything else:

Code: Select all

if [type] == 'csv_entry' {
  csv {
    columns => ["TIMESTAMP", "CATEGORY", "EVENT", "USERID", "ROWSMODIFIED", "ROWSRETURNED", "INSTNAME", "HOSTNAME"]
  }
}
That by itself should be enough to capture all the individual fields. Depending on how TIMESTAMP is formatted, you may then need to mutate that field for it to be recognized as a proper date_time or date in Elasticsearch. If Elasticsearch doesn't recognize the format of TIMESTAMP, it will probably default its type to string, which still makes the field searchable, just not as a proper date or date_time.
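If your TIMESTAMP does follow a consistent pattern, a date filter along these lines can coerce it into a proper timestamp. This is a sketch, not the thread's actual config: the pattern string here is an assumption, so replace it with whatever layout your TIMESTAMP column actually uses:

Code: Select all

filter {
  if [type] == 'csv_entry' {
    date {
      # Assumed example pattern; adjust to your real TIMESTAMP format
      match => [ "TIMESTAMP", "yyyy-MM-dd HH:mm:ss" ]
      # Overwrite the event timestamp with the parsed value
      target => "@timestamp"
    }
  }
}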

The resulting entry from the above filter:
(attached screenshot: 2016_08_25_12_38_43_Dashboard_Nagios_Log_Server.png)
My TIMESTAMP is poorly formatted, so it gets processed as a string by default:
(attached screenshot: 2016_08_25_12_39_48_Dashboard_Nagios_Log_Server.png)
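Putting the pieces from this thread together, a complete filter for the DB2 CSV stream might look like the following. Treat it as a sketch: the field names and the "DB2" tag are taken from the earlier posts, and only the numeric columns are converted, since fields are strings by default:

Code: Select all

filter {
  if "DB2" in [tags] {
    csv {
      # Name the CSV columns so each value lands in its own field
      columns => ["TIMESTAMP", "CATEGORY", "EVENT", "USERID", "ROWSMODIFIED", "ROWSRETURNED", "INSTNAME", "HOSTNAME"]
      # Convert only the numeric columns; everything else stays a string
      convert => {
        "ROWSMODIFIED" => "integer"
        "ROWSRETURNED" => "integer"
      }
    }
  }
}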
Former Nagios employee
https://www.mcapra.com/