Importing from file - assistance please
Posted: Tue Feb 16, 2016 6:49 pm
by tomslmonitor
Hi, I'm currently trying to import from a file. I've attempted to follow the steps from the 'source setup' within Nagios but have run into a problem.
Firstly, my experience using Linux is pretty much non-existent. I've downloaded the shipper.py script and installed netcat.
When attempting to ship a log file as a test, I get these errors.
Code:
[root@xxxxxxxx ~]# cat /var/log/httpd/access_log-20160131 | python shipper.py program:apache_access | nc xxx.xxx.xx.xxx 2056
[Errno 32] Broken pipe
Traceback (most recent call last):
File "shipper.py", line 242, in <module>
main()
File "shipper.py", line 237, in main
process_stream(sys.stdin, message)
File "shipper.py", line 217, in process_stream
print json.dumps(message)
IOError: [Errno 32] Broken pipe
Where do I go from here? Is this an issue with the code in shipper.py, or have I messed up somewhere?
Any help would be great, thanks!
Re: Importing from file - assistance please
Posted: Wed Feb 17, 2016 10:32 am
by jolson
Where do I go from here, is this an issue with the code in shipper.py or have I messed up somewhere?
Let's take this troubleshooting process in digestible chunks - first, we'll see if you can ship plaintext to Nagios Log Server.
Code:
echo 'test words' | nc xxx.xxx.xx.xxx 2056
If the above arrives properly, you know that you're fine to move forward and attempt to send a file - let me know how it works out.
Once the log arrives in Nagios Log Server, you will be able to see it in the 'Dashboards' section, where you can query for "test words."
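As an aside - the [Errno 32] Broken pipe in your traceback simply means that nc exited before shipper.py finished writing (usually because it couldn't connect), so the script's stdout had no reader left. A minimal sketch reproducing the same errno in plain Python (nothing Nagios-specific, just a pipe whose reader has gone away):

```python
import errno
import os

# Create a pipe and close the read end, simulating nc exiting early.
read_fd, write_fd = os.pipe()
os.close(read_fd)

try:
    # With no reader left, this write fails the same way shipper.py's print did.
    os.write(write_fd, b"a shipped log line\n")
except OSError as exc:
    print("errno:", exc.errno, "- broken pipe?", exc.errno == errno.EPIPE)
finally:
    os.close(write_fd)
```

So once the connection itself works (right IP, port 2056 open), that error should disappear on its own.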
Re: Importing from file - assistance please
Posted: Thu Feb 18, 2016 6:45 pm
by tomslmonitor
Thanks Jolson,
Troubleshooting with the 'test words' command you gave me, I figured out I had made a rookie mistake: I was using the wrong IP (the external instead of the internal address).
My next question: since I am importing logs, the @timestamp field won't accurately reflect the imported logs - it will show when I imported them rather than when they were written.
What I have tried is creating an additional field called I_timestamp with a grok filter, which grabs the timestamp from the log. How do I parse this as a timestamp? Currently it is a string, and I'd like to be able to graph it.
Grok filter below:
Code:
if [host] == 'xxx.xxx.xx.xxx' {
    grok {
        match => ['message', '%{TIMESTAMP_ISO8601:I_timestamp}']
    }
}
timestamp_import.PNG
An example of the beginning of my log lines.
Code:
2016-02-18 11:11:00,150 INFO [RequestDetailValve.logMessages:100 852
Error message on the histogram.
Code:
FacetPhaseExecutionException[Facet [0]: (key) field [I_timestamp] not found]
Could you please guide me on the best way of implementing this?
Thanks!
Re: Importing from file - assistance please
Posted: Fri Feb 19, 2016 11:09 am
by hsmith
Could you break the timestamp down further using additional grok filters to give you the data that you want?
For instance:
Code:
{
"date": [
[
"16-02-18"
]
],
"DATE_US": [
[
null
]
],
"MONTHNUM": [
[
null,
"02"
]
],
"MONTHDAY": [
[
null,
"16"
]
],
"YEAR": [
[
null,
"18"
]
],
"DATE_EU": [
[
"16-02-18"
]
],
"time": [
[
"11:11:00"
]
],
"HOUR": [
[
"11"
]
],
"MINUTE": [
[
"11"
]
],
"SECOND": [
[
"00"
]
]
}
I'm not sure if the ,150 was relevant, but this could be a way to do it.
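If it helps to see the same breakdown outside of grok, here is a rough Python equivalent (the regex is hand-written to mirror the date/time captures above, not the actual grok patterns):

```python
import re

line = "2016-02-18 11:11:00,150 INFO [RequestDetailValve.logMessages:100 852"

# Capture the same pieces the grok breakdown above produced,
# including the trailing millisecond part.
pattern = re.compile(
    r"(?P<year>\d{4})-(?P<month>\d{2})-(?P<day>\d{2}) "
    r"(?P<hour>\d{2}):(?P<minute>\d{2}):(?P<second>\d{2}),(?P<millis>\d{3})"
)
match = pattern.match(line)
print(match.groupdict())
# {'year': '2016', 'month': '02', 'day': '18', 'hour': '11',
#  'minute': '11', 'second': '00', 'millis': '150'}
```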
Re: Importing from file - assistance please
Posted: Sun Feb 21, 2016 6:48 pm
by tomslmonitor
Thanks for the response - I may have been unclear about what I was trying to accomplish.
Will breaking the data down further allow me to graph the data in a histogram?
Since this is an import, the @timestamp values will mostly be identical, so graphing them in a histogram will not give an accurate representation of when my log messages appear.
The @timestamp field format is YYYY-MM-DDTHH:mm:ss.SSSZ. So I figured something similar (my I_timestamp) would work, but it is parsed as a string instead of a date like @timestamp, and the histogram is not working.
Code:
@timestamp: 2016-02-18T23:31:55.985Z
Code:
I_timestamp: 2016-02-18 11:11:00,150
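The conversion being asked about can be sketched in plain Python (the strptime format is an assumption based on the I_timestamp sample above, and the example assumes the log time is already UTC):

```python
from datetime import datetime, timezone

i_timestamp = "2016-02-18 11:11:00,150"

# Parse the imported string; %f accepts the millisecond part after the comma.
parsed = datetime.strptime(i_timestamp, "%Y-%m-%d %H:%M:%S,%f")

# Render it the way @timestamp looks: UTC, ISO 8601, millisecond precision.
as_timestamp = (
    parsed.replace(tzinfo=timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"
)
print(as_timestamp)  # 2016-02-18T11:11:00.150Z
```

In Logstash itself this conversion is what the date filter does, as discussed further down the thread.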
Re: Importing from file - assistance please
Posted: Mon Feb 22, 2016 1:56 pm
by hsmith
Is this histogram something you're using outside of NLS?
Re: Importing from file - assistance please
Posted: Mon Feb 22, 2016 10:31 pm
by tomslmonitor
No, this is the histogram within Nagios Log Server.
histogram.PNG
The error on the histogram.
histogram_error.PNG
Re: Importing from file - assistance please
Posted: Tue Feb 23, 2016 2:25 pm
by hsmith
Is this a custom dashboard, or one of the default dashboards? I'm trying to do some research on this, and it sounds like there may be some issues with certain types of dashboards. If you're interested, here is the thread I'm looking at.
Re: Importing from file - assistance please
Posted: Tue Feb 23, 2016 8:10 pm
by tomslmonitor
Thanks for the response.
This is a default dashboard. After reading around some threads, I think the issue is that my new field 'im_timestamp' is being parsed as a string and the histogram doesn't like it.
I tried parsing it as a date in the grok filter with the code below:
Code:
grok {
    match => ['message', '%{TIMESTAMP_ISO8601:im_timestamp:date}']
}
It is still a string though. Do you have any suggestions on this?
im_timestamp(string).PNG
Re: Importing from file - assistance please
Posted: Wed Feb 24, 2016 2:55 pm
by jolson
Yes - you'll have to make use of the 'date' filter, like so:
Code:
date {
    match => [ "im_timestamp", "yyyy-MM-dd HH:mm:ss,SSS",
               "MMM dd yyyy HH:mm:ss", "MMM d yyyy HH:mm:ss", "ISO8601" ]
}
Of course you can read more about the date filter here:
https://www.elastic.co/guide/en/logstas ... -date.html
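Putting the pieces from this thread together, the whole filter might look something like this (a sketch only - the host IP and field name come from the earlier posts, and the first pattern assumes the '2016-02-18 11:11:00,150' sample):

```
if [host] == 'xxx.xxx.xx.xxx' {
    grok {
        match => ['message', '%{TIMESTAMP_ISO8601:im_timestamp}']
    }
    date {
        # Matches '2016-02-18 11:11:00,150'; ISO8601 is a fallback.
        match => [ "im_timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601" ]
    }
}
```

Note that with no explicit target, the date filter writes the parsed time into @timestamp, which is what should make the histogram line up with when the log lines were originally written rather than when they were imported.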