Is there a character limit for the APIs/JSON? We get a 500 Internal Server Error when we request the JSON for one of our services. The only output in the error log is:
[Mon Jun 02 10:21:28 2014] [error] Premature end of script headers: statusjson.cgi, referer: http://localhost/nagios4/jsonquery.html
The output of the service is huge (102,696 characters), so I'm not sure whether the size is the problem or whether one of those many characters is confusing things. I thought I'd start by asking: has anyone run into a character limit?
It may be a limitation of the jsonquery.html helper tool. If you curl the URL directly, does it work?
Former Nagios employee
"It is turtles. All. The. Way. Down. . . .and maybe an elephant or two."
VI VI VI - The editor of the Beast!
Come to the Dark Side.
$ curl "http://localhost/nagios4/cgi-bin/statusjson.cgi?query=service&hostname=pleiades1&servicedescription=qstatSum"
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>500 Internal Server Error</title>
</head><body>
<h1>Internal Server Error</h1>
<p>The server encountered an internal error or
misconfiguration and was unable to complete
your request.</p>
<p>Please contact the server administrator,
ess@nas.nasa.gov and inform them of the time the error occurred,
and anything you might have done that may have
caused the error.</p>
<p>More information about this error may be available
in the server error log.</p>
<hr>
<address>Apache/2.2.15 (Red Hat) Server at localhost Port 80</address>
</body></html>
$
I just faked the output of a service as 102,697 zeros and then ran the API against it. Both the query generator and curl gave me back 8,191 zeros. But at least it didn't crash!
Very odd. Might the output have some characters in it that would cause issues with the JSON parser? Or maybe even some malformed binary data that could be messing up the HTTP headers?
I started removing special characters from the output, and when I removed the %s from it, the JSON started working! However, I only get 8,064 bytes of the output. Does anyone else hit that limit, or something close to it, given that I got 8,191 bytes when the output was all zeros?
So 127 bytes fewer? Is there anything about that %s that might be significant? Would 127 characters make sense for whatever that %s expands to?
I am also getting 8,191 characters, with the last character being a newline for 8,192 total. A cap at 8,192 makes sense, since this is a compiled binary and 8 KB is a common fixed buffer size in C code.
I found two instances where 8 KB fixed buffers were used for strings in the JSON code and have converted them to use dynamically allocated buffers, so the only limitation now is the amount of RAM on the system. This update is in commit 6081644. Check it out and let us know whether it does what you need.