=======================
General
=======================
Delete large trace files that contain the "ORA-00060: deadlock detected" error.
Due to an incorrect application setup, deadlocks keep occurring in the database.
Large .trc files are being generated, and they fill up the disk space to 100%.
As a workaround, a script was scheduled from crontab to delete these large trace files and to save the first 2000 lines of each one to a history directory.
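The trace directory used below is the ADR "Diag Trace" location of the igt instance. To confirm the path on another system, it can be queried from v$diag_info; a minimal check, assuming the Oracle environment for the instance is already set in the shell:

sqlplus -s / as sysdba <<'EOF'
set heading off feedback off
-- trace directory that the cleanup script works on
select value from v$diag_info where name = 'Diag Trace';
EOF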
=======================
Code Example
=======================
crontab
5,25,45,55 * * * * /software/oracle/oracle/scripts/delete_deadlock_files.sh
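The entry runs the script at minutes 5, 25, 45 and 55 of every hour. A quick way to check that it is installed for the oracle OS user and that the script is executable (names taken from the entry above):

crontab -l | grep delete_deadlock_files.sh
ls -l /software/oracle/oracle/scripts/delete_deadlock_files.sh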
bash code
#!/bin/bash
# Delete trace files that contain the ORA-00060 deadlock error,
# keeping the first 2000 lines of each .trc file in a history directory.
LOG_FILE=/software/oracle/oracle/scripts/delete_deadlock_trace.log
WORK_DIR=/software/oracle/diag/rdbms/igt/igt/trace
HIST_DIR=/software/oracle/oracle/scripts/history

touch "$LOG_FILE"
RUN_DATE=$(date +"%Y%m%d_%H%M%S")
echo "===================================" >> "$LOG_FILE"
echo "Run Date: $RUN_DATE" >> "$LOG_FILE"
echo "===================================" >> "$LOG_FILE"

# Trace (.trc) files: save the first 2000 lines to the history directory, then delete.
FILES=$(find "$WORK_DIR" -maxdepth 1 -type f -name '*.trc' | xargs -r grep -l ORA-00060)
for file in $FILES
do
  echo "Found deadlock ORA-00060 in File $file" >> "$LOG_FILE"
  basefile_name=$(basename "$file")
  ls -ltr "$file" >> "$LOG_FILE"
  echo "Deleting file..... $file" >> "$LOG_FILE"
  head -2000 "$file" > "${HIST_DIR}/${basefile_name}_header.trc"
  rm -f "$file"
  echo "Done" >> "$LOG_FILE"
  echo >> "$LOG_FILE"
done

# Matching trace map (.trm) files: delete them as well, no header is kept.
FILES=$(find "$WORK_DIR" -maxdepth 1 -type f -name '*.trm' | xargs -r grep -l ORA-00060)
for file in $FILES
do
  echo "Found deadlock ORA-00060 in File $file" >> "$LOG_FILE"
  ls -ltr "$file" >> "$LOG_FILE"
  echo "Deleting file..... $file" >> "$LOG_FILE"
  rm -f "$file"
  echo "Done" >> "$LOG_FILE"
  echo >> "$LOG_FILE"
done
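After a few scheduled runs, the log file and the history directory (paths taken from the script above) can be checked to verify that the cleanup is working:

tail -20 /software/oracle/oracle/scripts/delete_deadlock_trace.log
ls -ltr /software/oracle/oracle/scripts/history | tail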