Support forum for Nagios Core, Nagios Plugins, NCPA, NRPE, NSCA, NDOUtils and more. Engage with the community of users including those using the open source solutions.
scottwilkerson
DevOps Engineer
Posts: 19396 Joined: Tue Nov 15, 2011 3:11 pm
Location: Nagios Enterprises
Contact:
Post
by scottwilkerson » Thu Sep 28, 2017 11:16 am
You could use something like the following, but I have no idea how long it will take on files this size, and you would have to run it a bunch of times, starting with the newest large files.
Something like this
Code:
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 exclude-these.txt f=2 from-this.txt
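For clarity, here is the same pattern run on two tiny made-up files (a sketch; the file names are just for illustration):

```shell
# Demo of the dedup pattern on small sample files (names are made up).
cd "$(mktemp -d)"
printf 'a\nb\n' > exclude-these.txt
printf 'a\nb\nc\n' > from-this.txt

# Pass 1 (f==1): store every line of exclude-these.txt as a key of array r.
# Pass 2 (f==2): print a line of from-this.txt only if it is not a key of r.
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } }' \
    f=1 exclude-these.txt f=2 from-this.txt
# prints: c
```

The `f=1`/`f=2` assignments between the file names are ordinary awk command-line variable assignments; they take effect just before the file that follows them is read.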
so if 09-15-2017 was the largest, you would need to run these in sequence
Code:
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-14-2017-00.log f=2 nagios-09-15-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-13-2017-00.log f=2 nagios-09-14-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-12-2017-00.log f=2 nagios-09-13-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-11-2017-00.log f=2 nagios-09-12-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-10-2017-00.log f=2 nagios-09-11-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-09-2017-00.log f=2 nagios-09-10-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-08-2017-00.log f=2 nagios-09-09-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-07-2017-00.log f=2 nagios-09-08-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-06-2017-00.log f=2 nagios-09-07-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-05-2017-00.log f=2 nagios-09-06-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-04-2017-00.log f=2 nagios-09-05-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-03-2017-00.log f=2 nagios-09-04-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-02-2017-00.log f=2 nagios-09-03-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-01-2017-00.log f=2 nagios-09-02-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-08-31-2017-00.log f=2 nagios-09-01-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-08-30-2017-00.log f=2 nagios-08-31-2017-00.log
Teja
Posts: 53 Joined: Tue Jun 13, 2017 8:13 am
Post
by Teja » Fri Sep 29, 2017 10:02 am
Thanks for your answer scottwilkerson. I first tested this by creating two sample files, but it only lists the lines, it doesn't delete them.
scottwilkerson
DevOps Engineer
Posts: 19396 Joined: Tue Nov 15, 2011 3:11 pm
Location: Nagios Enterprises
Contact:
Post
by scottwilkerson » Fri Sep 29, 2017 10:28 am
Here's what I am going to propose we do.
Run the following from that directory; it will create the new files in a directory called newtmp. Once we are sure they look correct, we can replace the old ones.
I am being cautious here because we are trying not to lose your historical data.
Code:
mkdir -p newtmp
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-14-2017-00.log f=2 nagios-09-15-2017-00.log > newtmp/nagios-09-15-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-13-2017-00.log f=2 nagios-09-14-2017-00.log > newtmp/nagios-09-14-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-12-2017-00.log f=2 nagios-09-13-2017-00.log > newtmp/nagios-09-13-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-11-2017-00.log f=2 nagios-09-12-2017-00.log > newtmp/nagios-09-12-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-10-2017-00.log f=2 nagios-09-11-2017-00.log > newtmp/nagios-09-11-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-09-2017-00.log f=2 nagios-09-10-2017-00.log > newtmp/nagios-09-10-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-08-2017-00.log f=2 nagios-09-09-2017-00.log > newtmp/nagios-09-09-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-07-2017-00.log f=2 nagios-09-08-2017-00.log > newtmp/nagios-09-08-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-06-2017-00.log f=2 nagios-09-07-2017-00.log > newtmp/nagios-09-07-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-05-2017-00.log f=2 nagios-09-06-2017-00.log > newtmp/nagios-09-06-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-04-2017-00.log f=2 nagios-09-05-2017-00.log > newtmp/nagios-09-05-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-03-2017-00.log f=2 nagios-09-04-2017-00.log > newtmp/nagios-09-04-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-02-2017-00.log f=2 nagios-09-03-2017-00.log > newtmp/nagios-09-03-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-09-01-2017-00.log f=2 nagios-09-02-2017-00.log > newtmp/nagios-09-02-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-08-31-2017-00.log f=2 nagios-09-01-2017-00.log > newtmp/nagios-09-01-2017-00.log
awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } } ' f=1 nagios-08-30-2017-00.log f=2 nagios-08-31-2017-00.log > newtmp/nagios-08-31-2017-00.log
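The repeated commands above can also be written as a loop. A sketch (it assumes GNU date for the date arithmetic, and creates small stand-in log files so it can be run end to end; on the real server you would skip the setup part and run the loop in the archive directory):

```shell
# Setup: small stand-in log files matching the nagios-MM-DD-YYYY-00.log
# naming used in this thread (skip this part on the real server).
cd "$(mktemp -d)"
d=2017-08-30
while [ "$d" != "2017-09-16" ]; do
    printf 'shared line\nline for %s\n' "$d" \
        > "nagios-$(date -d "$d" +%m-%d-%Y)-00.log"
    d=$(date -d "$d + 1 day" +%F)
done

# The actual work: for each day, drop the lines already present in the
# previous day's log, writing the result into newtmp/.
mkdir -p newtmp
day=2017-08-31
while [ "$day" != "2017-09-16" ]; do
    prev=$(date -d "$day - 1 day" +%m-%d-%Y)
    cur=$(date -d "$day" +%m-%d-%Y)
    awk '{if (f==1) { r[$0] } else if (! ($0 in r)) { print $0 } }' \
        f=1 "nagios-$prev-00.log" f=2 "nagios-$cur-00.log" \
        > "newtmp/nagios-$cur-00.log"
    day=$(date -d "$day + 1 day" +%F)
done
```

Each file under newtmp/ then contains only the lines that were not in the previous day's log.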
Then we can inspect the new directory.
Once we are convinced it all looks correct (or after moving the originals somewhere safe), we can move these files into the correct directory.
Teja
Posts: 53 Joined: Tue Jun 13, 2017 8:13 am
Post
by Teja » Thu Oct 05, 2017 9:59 am
Hi scottwilkerson - The awk script works properly on smaller logs (up to 3 GB), but when I try it on larger log files (30-40 GB) it takes a long time to process and eventually gets killed automatically.
dwhitfield
Former Nagios Staff
Posts: 4583 Joined: Wed Sep 21, 2016 10:29 am
Location: NoLo, Minneapolis, MN
Contact:
Post
by dwhitfield » Thu Oct 05, 2017 3:47 pm
Have you considered upgrading? If this is still occurring on the latest version then we might be able to file a bug report.
Also, do you need data from those days? If not, you could just delete them. Of course, be careful with that since once it's gone it's gone.
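If you do decide to delete, a safer pattern is to move the files aside first and remove them only later, once you're sure nothing needs them (a sketch using hypothetical stand-in files):

```shell
cd "$(mktemp -d)"
touch nagios-09-01-2017-00.log nagios-09-02-2017-00.log  # stand-ins

# Move the unwanted logs aside instead of rm'ing them straight away;
# the backup directory can be deleted once you are sure.
mkdir -p old-logs-backup
mv nagios-09-*-2017-00.log old-logs-backup/
ls old-logs-backup
```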
Teja
Posts: 53 Joined: Tue Jun 13, 2017 8:13 am
Post
by Teja » Fri Oct 06, 2017 4:30 am
No, log rotation is working, but I'm running this awk script on the logs for which log rotation didn't happen, i.e. from Aug 26th - Sep 19th. Attaching the screenshot below. When I run the script on the Aug 30th - Aug 31st files it works as expected, since the size is up to 8 GB, but from Sep 1st - Sep 19th it doesn't complete on any of them: it runs for a long time and finally gets killed.
Attachments
scottwilkerson
DevOps Engineer
Posts: 19396 Joined: Tue Nov 15, 2011 3:11 pm
Location: Nagios Enterprises
Contact:
Post
by scottwilkerson » Fri Oct 06, 2017 10:39 am
Teja wrote: Hi scottwilkerson - The awk script works properly on smaller logs (up to 3 GB), but when I try it on larger log files (30-40 GB) it takes a long time to process and eventually gets killed automatically.
Fixing this may only be possible by increasing the RAM on the server, or by moving the files to a server with enough RAM to handle them.
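One possible way around the memory limit, for what it's worth: the awk one-liner holds every line of the f=1 file in RAM, which is what fails on 30-40 GB inputs. sort(1) spills to temporary files on disk, so sorting both logs and using comm(1) to keep only the lines unique to the newer file needs far less memory. The trade-off is that the output comes out sorted rather than in the original log order. A sketch on stand-in files:

```shell
cd "$(mktemp -d)"
printf 'old line\nshared line\n' > nagios-09-14-2017-00.log
printf 'shared line\nnew line\n' > nagios-09-15-2017-00.log

# Sort both files (sort uses disk-backed temp files for large inputs),
# then keep only the lines unique to the second (newer) file:
# comm -13 suppresses column 1 (unique to file 1) and column 3 (common).
sort nagios-09-14-2017-00.log > prev.sorted
sort nagios-09-15-2017-00.log > cur.sorted
comm -13 prev.sorted cur.sorted
# prints: new line
```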