Is there an import filter to ingest CSV files?
Posted: Wed Aug 24, 2016 7:40 am
by Linuxlogger
I was wondering if there is an import filter that I can use as an example to import CSV files into NLS.
Re: Is there an import filter to ingest CSV files?
Posted: Wed Aug 24, 2016 9:41 am
by Linuxlogger
OK so I found this link:
https://www.elastic.co/guide/en/logstas ... s-csv.html
Went into NLS and created the filter:
CSV import filter
filter {
  csv {
    # "string" is not a valid convert type (fields are strings by default),
    # and "int" should be "integer"; valid types include "integer", "float",
    # "date" and "date_time"
    convert => { "TIMESTAMP" => "date_time", "ROWSMODIFIED" => "integer", "ROWSRETURNED" => "integer" }
  }
}
My question is: now that I have the filter created, how do I implement it for the forwarded CSV files? How do I say "if TAG = "xx", use the CSV import filter"?
Re: Is there an import filter to ingest CSV files?
Posted: Wed Aug 24, 2016 10:11 am
by mcapra
The filters usually contain some sort of conditional logic when addressing log types. For example, say I have the following input type for all my Linux audit logs:
tcp {
  port => 5545
  type => audit_log
}
udp {
  port => 5545
  type => audit_log
}
In that Input rule I am setting the type to audit_log for all traffic on port 5545, so any logs sent over that port will have their type set to audit_log. The corresponding filter that breaks that audit log up into the fields I want to analyze looks like this:
if [type] == 'audit_log' {
  # filtering for audit logs goes here
}
If the file exists locally on the Linux host, either on a local path or a mount point, you can use the file input to define an Input rule for that file:
input {
  file {
    path => "/home/user/myfile.csv"
    start_position => "beginning"
    type => "my_csv_type_tag_or_something"
  }
}
At which point you could define a filter to work with that type:
if [type] == 'my_csv_type_tag_or_something' {
  # csv parsing goes here
}
Re: Is there an import filter to ingest CSV files?
Posted: Wed Aug 24, 2016 12:29 pm
by Linuxlogger
Can I use the tag to identify the stream? If so, how would I do that?
Re: Is there an import filter to ingest CSV files?
Posted: Wed Aug 24, 2016 1:24 pm
by mcapra
Assuming there is a common value in the tags, you should be able to do that:
if "value" in [tags] {
do value things
}
Re: Is there an import filter to ingest CSV files?
Posted: Thu Aug 25, 2016 11:47 am
by Linuxlogger
So...
This should work then?
# DB2 import filter
filter {
  if "DB2" in [tags] {
    csv {
      # string fields need no convert entry; "int" must be "integer"
      convert => { "TIMESTAMP" => "date_time", "ROWSMODIFIED" => "integer", "ROWSRETURNED" => "integer" }
    }
  }
}
Re: Is there an import filter to ingest CSV files?
Posted: Thu Aug 25, 2016 12:40 pm
by mcapra
Not quite. For events in CSV format, it's helpful to define the columns before doing anything else:
if [type] == 'csv_entry' {
  csv {
    columns => ["TIMESTAMP", "CATEGORY", "EVENT", "USERID", "ROWSMODIFIED", "ROWSRETURNED", "INSTNAME", "HOSTNAME"]
  }
}
That by itself should be enough to capture all the individual fields. Depending on how TIMESTAMP is formatted, you may then need to mutate that field for it to be recognized as a proper date_time or date in elasticsearch. If elasticsearch doesn't recognize the format of TIMESTAMP, it will probably default its type to string, which still makes the field searchable, just not as a proper date or date_time.
The resulting entry from the above filter:
(screenshot: 2016_08_25_12_38_43_Dashboard_Nagios_Log_Server.png)
My TIMESTAMP is poorly formatted, so it gets processed as a string by default:
(screenshot: 2016_08_25_12_39_48_Dashboard_Nagios_Log_Server.png)
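If the TIMESTAMP does end up as a string, a date filter can usually coerce it into a proper timestamp. The match pattern below is only a guess at a typical format; it would need to be adjusted to whatever the actual data looks like:

```
filter {
  date {
    # match takes the source field followed by one or more
    # date patterns to try; this pattern is just an example
    match => ["TIMESTAMP", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}
```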