Communication error between Oracle and Nagios
Posted: Thu Nov 11, 2021 2:55 pm
I'm trying to adapt some Perl scripts to Bash so I can run them in Nagios XI.
Running the script locally gives the expected result, but when it is run from the Nagios machine over NRPE the result is wrong. However, if I launch the other option in the script, called clusterstatus, everything works fine.
The complete script, so far, is this one (the three outputs follow it):
Code:
#!/bin/bash
. /home/oracle/.profile_RAC
ORACLE_HOME=/oracle/app/grid/19300
ORACLE_BASE=/oracle/app/base
# Map Nagios status strings to their exit codes.
declare -A nagios_exit_codes=([UNKNOWN]=3 [OK]=0 [WARNING]=1 [CRITICAL]=2)
status='OK'
action=$1
case $action in
"votedisk")
    #command=$(/oracle/app/grid/19300/bin/crsctl query css votedisk | grep asm)
    command=$(/oracle/app/grid/19300/bin/crsctl query css votedisk)
    case $command in
    *"failed"*|*"OFFLINE"*|*"PROC"*)
        status='CRITICAL'
        output_msg="Voting disk status check failed!"
        ;;
    *)
        output_msg="Voting disks status check succeeded"
        ;;
    esac
    output="[$status] $output_msg - $command"
    ;;
"clusterstatus")
    command=$(/oracle/app/grid/19300/bin/crsctl query crs releaseversion)
    output_msg="All clusterware services are up (clusterware version: $command)"
    output="$output_msg"
    ;;
esac
echo -e "$output"
exit ${nagios_exit_codes[$status]}
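For reference, the status-to-exit-code mapping and the case pattern matching can be exercised in isolation. This is a minimal sketch in plain Bash; the `check_votedisk_output` helper and the extra "Unable to communicate" pattern are illustrative additions of mine, not part of the plugin:

```shell
#!/bin/bash
# Nagios status strings mapped to their conventional exit codes.
declare -A nagios_exit_codes=([OK]=0 [WARNING]=1 [CRITICAL]=2 [UNKNOWN]=3)

# Classify crsctl votedisk output; the "Unable to communicate" pattern is
# an illustrative addition covering the CSS daemon error shown below.
check_votedisk_output() {
    case $1 in
        *failed*|*OFFLINE*|*PROC*|*"Unable to communicate"*) echo CRITICAL ;;
        *) echo OK ;;
    esac
}

status=$(check_votedisk_output "Located 1 voting disk(s).")
echo "$status exits with code ${nagios_exit_codes[$status]}"
# prints: OK exits with code 0
```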
Code:
[root@bbddmachine plugins]# sh ./script_prueba.sh votedisk
[OK] Voting disks status check succeeded - ## STATE File Universal Id File Name Disk group -- ----- ----------------- --------- --------- 1. ONLINE 8dfc2a9528244f95bf87bb394e793995 (/dev/mapper/asm_ocr1) [OCR] Located 1 voting disk(s).

Code:
[nagios@ng1esp libexec]$ ./check_nrpe -2 -H 172.47.62.12 -t 60 -c check_crs_votedisk
[OK] Voting disks status check succeeded - Unable to communicate with the Cluster Synchronization Services daemon.

Code:
[nagios@ng1esp libexec]$ ./check_nrpe -2 -H 172.47.62.12 -t 60 -c check_crs_clusterstatus
All clusterware services are up (clusterware version: Oracle High Availability Services release version on the local node is [19.0.0.0.0])