
Re: Check_HTTP query

Posted: Tue Aug 02, 2016 2:34 pm
by neworderfac33
OK - hands up - I'm guilty, so kudos to you for spotting my error!
I'm at home now (20.33) so I will give it another go in the morning.
Thank you for your continuing patience! :-)

Pete

Re: Check_HTTP query

Posted: Tue Aug 02, 2016 3:29 pm
by mcapra
Let us know if there are any updates!

Re: Check_HTTP query

Posted: Wed Aug 03, 2016 7:14 am
by neworderfac33
Good afternoon,

I can now get the following commands from the prompt to return the following results:

./check_http -H MyURL -u /my/url/name/ (WITH a trailing slash)
HTTP OK: HTTP/1.1 200 OK - 374 bytes in 0.020 second response time |time=0.020087s;;;0.000000 size=374B;;;0

./check_http -H MyURL -u /my/url/name (WITHOUT a trailing slash)
HTTP OK: HTTP/1.1 301 Moved Permanently - 445 bytes in 0.004 second response time |time=0.004098s;;;0.000000 size=445B;;;0

But when I try to build these into my services.cfg file, both return:
HTTP WARNING: HTTP/1.1 400 Bad Request - 556 bytes in 0.004 second response time

Code: Select all

define command{
        command_name    check_http_url
        command_line    $USER1$/check_http -H $HOSTADDRESS$ -u $ARG1$
}

define host{
    host_name                    URL1
    hostgroups                   000-URL1
    address                      MYURL
    max_check_attempts           5
    check_period                 24x7
    contact_groups               admins
    notification_period          24x7
    }

define service{
    use                          generic-service
    host_name                    URL1
    service_description          -
    check_command                check_http_url!URL1/my/url/name
    }
I've tried every permutation of preceding URL1 with a forward slash or not and adding or removing a trailing slash to/from /my/url/name, but nothing seems to work.

It appears you can do it from the command prompt, but not via Nagios config files. I'm stumped :-(

Pete

Re: Check_HTTP query

Posted: Wed Aug 03, 2016 10:24 am
by rkennedy
Your check_command should be check_command check_http_url!/my/url/name/, since you're only passing $ARG1$ to -u. The $HOSTADDRESS$ variable will be populated with the address directive, MYURL.
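To make the macro substitution concrete, here's a sketch of the expansion Nagios performs for that command definition (assuming the stock $USER1$ plugin path; adjust to your install):

```shell
# check_command  check_http_url!/my/url/name/
# expands via the command definition into a single command line:
USER1=/usr/local/nagios/libexec   # typical $USER1$ value (assumption)
HOSTADDRESS=MYURL                 # from the host's "address" directive
ARG1=/my/url/name/                # everything after the "!" in check_command
echo "$USER1/check_http -H $HOSTADDRESS -u $ARG1"
```

Putting "URL1/my/url/name" after the "!" would have passed the host name into -u as part of the URL path, which is why the web server answered 400 Bad Request.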

Re: Check_HTTP query

Posted: Wed Aug 03, 2016 11:01 am
by neworderfac33
By Jove, Sir (or maybe Madame - mustn't be presumptuous!), I think you've got it!

That was it - after days of headbanging, once I removed "URL1" from the check_command, I now have an OK response in the web interface!

Now I can get back to checking for text and the response code. I can go home tonight a very happy man - thank you VERY much indeed.

Pete

Re: Check_HTTP query

Posted: Wed Aug 03, 2016 12:19 pm
by rkennedy
Score!

No problem, are we good to mark this one as resolved?

Re: Check_HTTP query

Posted: Thu Aug 04, 2016 4:28 am
by neworderfac33
Just one last related question - I'm now back to searching for a particular string thus:

Code: Select all

define service{
    use                          generic-service
    service_description          
    check_command                check_http_url!/my/url/path/ --expect=301 -w "Clear settings Cache"
    host_name                    URL1
    }
If I go to the URL, I can clearly see the text there, but the web interface returns:
HTTP CRITICAL: Status line output matched "200" - string 'Clear s Cache' not found on 'MyURL/my/utl/path/' - 374 bytes in 0.012 second response time

Is there any type of text that might not be detected by "-s"?

Cheers

Re: Check_HTTP query

Posted: Thu Aug 04, 2016 9:36 am
by rkennedy
-w, --warning=DOUBLE
Response time to result in warning status (seconds)
Can you post your service definition once again? It doesn't look right, and I'm unsure how it's working, since you're not passing -s / -r at all. It appears you're putting your string match in the -w field, which won't work.

We need to see the exact definition in order to be able to help you.
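To illustrate the distinction (a minimal sketch; the body text here is a made-up placeholder): -w takes a response-time warning threshold in seconds, while -s performs a plain substring search over the response body, roughly what grep does:

```shell
# A corrected invocation would look something like (not run here):
#   ./check_http -H MyURL -u /my/url/path/ -s "Clear settings Cache" -w 5
# The -s test itself is essentially a substring search over the body:
body='... <a href="#">Clear settings Cache</a> ...'   # placeholder body
if printf '%s' "$body" | grep -q "Clear settings Cache"; then
    echo "string found: OK"
else
    echo "string missing: CRITICAL"
fi
```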

Re: Check_HTTP query

Posted: Thu Aug 04, 2016 11:00 am
by neworderfac33
That was a copy/paste typo - I really don't help myself, do I? :-(

Code: Select all

./check_http -H MyURL -u /my/url/path -s "This document may be found" -v
returns:

Code: Select all

GET /members/lp/system HTTP/1.1
User-Agent: check_http/v2.0 (nagios-plugins 2.0)
Connection: close
Host: MyURL

MyURL:80/my/url/path is 445 characters
STATUS: HTTP/1.1 301 Moved Permanently
**** HEADER ****
Content-Type: text/html; charset=UTF-8
Location: http://MyURL/my/url/path/
Date: Thu, 04 Aug 2016 15:44:21 GMT
Content-Length: 171
NS-LB-VSERVER: lb=010-MDN-Members:DotNET-8003#DEV
NS-ORIGIN-SERVER: 99.99.99.99
**** CONTENT ****
<head><title>Document Moved</title></head>
<body><h1>Object Moved</h1>This document may be found <a HREF="http://MyURL/my/url/path/">here</a></body>
and then returns:

Code: Select all

HTTP OK: HTTP/1.1 301 Moved Permanently - 445 bytes in 0.004 second response time |time=0.003540s;;;0.000000 size=445B;;;0

Code: Select all

define service{
    use                          generic-service
    service_description          
    check_command                check_http_url!/my/url/path/ -s "This document may be found" --expect=200
    host_name                    URL1
    }
returns:

Code: Select all

HTTP CRITICAL: Status line output matched "200" - string 'This document may be found' not found on 'http://MyURL:80/my/url/path/' - 374 bytes in 0.011 second response time
Cheers

Pete

Re: Check_HTTP query

Posted: Thu Aug 04, 2016 3:32 pm
by rkennedy
Is $HOSTADDRESS$ set to a hostname or an IP? It seems like you're still getting a mixed result, and I suspect it's down to your variables. The reason is that your Nagios check is matching a 200 code, where your CLI check is returning a 301. I would take a look at how everything is parsing together.

I did testing on my end, and it's working as expected.

Code: Select all

[root@localhost libexec]# ./check_http -H google.com
HTTP OK: HTTP/1.1 301 Moved Permanently - 559 bytes in 0.114 second response time |time=0.113783s;;;0.000000 size=559B;;;0

[root@localhost libexec]# curl google.com
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="http://www.google.com/">here</A>.
</BODY></HTML>
Now I'll try to match the 301 response against an expected 200.

Code: Select all

[root@localhost libexec]# ./check_http -H google.com --expect=200
HTTP CRITICAL - Invalid HTTP response received from host: HTTP/1.1 301 Moved Permanently
and also try matching the string on the page -

Code: Select all

[root@localhost libexec]# ./check_http -H google.com --expect=200 -s "The document has moved"
HTTP CRITICAL - Invalid HTTP response received from host: HTTP/1.1 301 Moved Permanently
Neither works, because we're not getting a 200 response code. At this point the only working match is a 301.

Code: Select all

[root@localhost libexec]# ./check_http -H google.com --expect=301 -s "The document has moved"
HTTP OK: Status line output matched "301" - 559 bytes in 0.066 second response time |time=0.066209s;;;0.000000 size=559B;;;0
This is what leads me to believe that you're pulling up two different web pages, as your check via Nagios is returning a 200, but the CLI is returning a 301.
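One hedged suggestion that may reconcile the two results: check_http has an -f/--onredirect option (ok|warning|critical|follow), so following the 301 to its final page and then applying --expect/-s there is one way to make both sides agree. Note also that the target string appears verbatim in the 301 body captured in the verbose output earlier, so -s can match it on the redirect page itself with --expect=301; a quick sanity check of that substring match:

```shell
# The 301 "Document Moved" body from the verbose output above, verbatim:
body='<body><h1>Object Moved</h1>This document may be found <a HREF="http://MyURL/my/url/path/">here</a></body>'
# -s is a plain substring match over the body; grep -c approximates it:
printf '%s\n' "$body" | grep -c "This document may be found"
```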