For testing purposes I'd like to find the best way to insert data as JSON into Elasticsearch with curl. Do I need to specify the index I'm putting it into?
This is the JSON output of a log event that I want to grok, but since the error only occurs every 4-6 hours, it's hard to test. Also, because the message field is multiline, the usual web grok debuggers don't seem to work.
I'm guessing I will have to omit the fields starting with '_'?
Code: Select all
{
"_index": "logstash-2016.10.09",
"_type": "eventlog",
"_id": "AVeoRyRSfSZthwLITZsU",
"_score": null,
"_source": {
"message": "Logon Failure on database \"SG Gebruikers S1 [DB1-29]\" - Windows account DOMAIN\\exch01$; mailbox /o=zdzazdazt/ou=Administratie/cn=Recipients/cn=zadzazd.\r\nError: 1245 \r\nClient Machine: exch01 \r\nClient Process: edgetransport.exe \r\nClient ProcessId: 0 \r\nClient ApplicationId: Client=Hub Transport ",
"@version": "1",
"@timestamp": "2016-10-09T07:10:16.680Z",
"host": "10.54.28.110",
"type": "eventlog",
"category": "Logons",
"channel": "Application",
"eventid": 1022,
"hostname": "exch01",
"keywords": 36028797018963970,
"processid": 0,
"recordnumber": 40930624,
"severity_label": "error",
"severity": 4,
"sourcemodulename": "eventlog",
"sourcename": "MSExchangeIS Mailbox Store",
"task": 16,
"threadid": 0,
"opcode": null,
"logsource": "exch01",
},
"sort": [
1475997016680,
1475997016680
]
}
Code: Select all
curl -XPOST "http://localhost:9200/indexname/typename/optionalUniqueId" -d "{ \"field\" : \"value\"}"
FYI, this is the grok filter I have now (for the message field):
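To answer my own first question partly: in the curl form above, the index and type are part of the URL, not the body. As for the underscore fields, a minimal sketch of what I think the POST body should be (assuming the hit was copied from an Elasticsearch search response): only the "_source" object is the document itself, while "_index", "_type", "_id", "_score" and "sort" are search-hit metadata that should not be re-posted. The field values below are abridged from the sample event:

```python
import json

# Search hit as copied from Elasticsearch (abridged; values from the post).
hit = {
    "_index": "logstash-2016.10.09",
    "_type": "eventlog",
    "_id": "AVeoRyRSfSZthwLITZsU",
    "_score": None,
    "_source": {
        "message": "Logon Failure on database ...",
        "type": "eventlog",
        "sourcename": "MSExchangeIS Mailbox Store",
    },
    "sort": [1475997016680, 1475997016680],
}

# Re-post only the "_source" object; the underscore fields and "sort"
# are hit metadata, not part of the stored document.
doc = hit["_source"]
body = json.dumps(doc)
print(body)
```

The printed body is what would go after `-d` in the curl command, with the target index and type in the URL as shown above.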
Code: Select all
if [sourcename] == "MSExchangeIS Mailbox Store" {
grok {
match => [ "message", "(?m)%{GREEDYDATA:info1}Error: %{NUMBER:exchange_error} \nClient Machine: %{HOSTNAME:exchange_client} \nClient Process: %{HOSTNAME:exchange_processname} \nClient ProcessId: %{NUMBER:exchange_processid} \nClient ApplicationId: %{GREEDYDATA:exchange_applicationid}
" ]
}
mutate {
add_tag => "mutated_msechangeis_mailbox_store"
}
}
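A rough way to exercise the pattern above without waiting 4-6 hours for the error: approximate the grok expression with a plain Python regex and run it against the sample message. The stand-ins below (`\d+` for NUMBER, `[\w.-]+` for HOSTNAME, `.*` for GREEDYDATA, `(?s)` for grok's `(?m)` dot-matches-newline behaviour) are simplifications of the real grok definitions, not exact equivalents:

```python
import re

# The multiline message from the sample event, with its \r\n line endings.
msg = ("Logon Failure on database \"SG Gebruikers S1 [DB1-29]\" - Windows "
       "account DOMAIN\\exch01$; mailbox "
       "/o=zdzazdazt/ou=Administratie/cn=Recipients/cn=zadzazd.\r\n"
       "Error: 1245 \r\n"
       "Client Machine: exch01 \r\n"
       "Client Process: edgetransport.exe \r\n"
       "Client ProcessId: 0 \r\n"
       "Client ApplicationId: Client=Hub Transport ")

# Simplified Python equivalents of the grok patterns:
#   GREEDYDATA -> .*   (with (?s) so "." crosses newlines)
#   NUMBER     -> \d+  (simplified)
#   HOSTNAME   -> [\w.-]+ (simplified)
pattern = re.compile(
    r"(?s)(?P<info1>.*)Error: (?P<exchange_error>\d+) \r?\n"
    r"Client Machine: (?P<exchange_client>[\w.-]+) \r?\n"
    r"Client Process: (?P<exchange_processname>[\w.-]+) \r?\n"
    r"Client ProcessId: (?P<exchange_processid>\d+) \r?\n"
    r"Client ApplicationId: (?P<exchange_applicationid>.*)"
)
m = pattern.search(msg)
print(m.groupdict())
```

If this matches, the grok version should behave the same way once the `\r` before each `\n` is accounted for, which is why the filter uses `\r?\n` rather than a bare `\n`.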
Willem