Hi again,
I got a request that Spider's code be posted, and he has graciously
consented. It is currently at:
http://users.rcn.com/spiderb/sec/audreport.ksh.txt
I am also including the text below, before the original summary and the
original query.
--Joe
#!/usr/bin/ksh -ph
# Script to return summary of login/logout activities on the system since
# the last time it was run.
# Leaves the last-mailed message in 'lastmsg', a timestamp for tracking
# its last run in 'lasttime', and binary auditlog files in
# 'tevents.DATE' files. A crontab entry to clean them out is recommended.
# Expected customization starts with the Bdir= assignment and ends at
# the Modems= assignment.
# By default, it expects to use /var/adm/local as its private directory,
# that there exists a 'secadmin' alias for receiving its reports, and
# that there's a modem on the machine with device major,minor of '24,2'
# (as on my "Pelican" [3000/300]). Multiple modems, events, and email
# targets are possible. The modems should be "major,minor" pairs, space-
# separated. The events should be properly-formatted "-e event" options
# for audit_tool. The email targets should be comma-separated. (Yes,
# I know that's not what the manpage shows for mailx, but it's what
# works best with mailx and sendmail.)
# I run this script out of root's crontab every morning so that my day
# starts off with a summary of the preceding 24 hours' events.
# This script takes options:
# -c cat the report instead of emailing it
# -i intermediate run--don't update lasttime and don't
# clobber 'tevents.DATE' (creates 'tevents.DATE.tmp')
# and creates 'lastmsg.tmp' rather than 'lastmsg'
# Any other arguments are errors.
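# Illustrative crontab entries (the times, script path, and retention
# window below are my assumptions, not from the original posting): one
# daily run plus the recommended cleanup of old 'tevents.DATE' files:
#   0 6 * * *  /var/adm/local/audreport.ksh
#   30 6 * * 0 find /var/adm/local -name 'tevents.*' -mtime +35 -exec rm -f {} \;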
export PATH=/usr/sbin:/usr/bin:/usr/ccs/bin:/sbin
export TMPDIR=/var/tmp
# where this script should run
Bdir=/var/adm/local
# where to find audit log files
Adir=/var/audit
Ofile="${Bdir}/lasttime"
Nfile="${Bdir}/newtime"
Afile="${Bdir}/tevents"
Tfile="${Bdir}/lastmsg"
Mfile="${Bdir}/lastmdm"
Whom="secadmin"
#Format="-B"
Format="-O time,userid,pid,res,event"
# Use of a site-defined event alias is recommended here if the list might
# be long otherwise.
Events="-e trusted_event"
#Modems=""
Modems="24,2"
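# Example with more than one modem line (device numbers illustrative only):
#Modems="24,2 24,3"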
umask 077
# init options
doemail=1
dolasttime=1
dousage=0
Ttmp=
while getopts "ci" opt
do
case "$opt" in
c)
doemail=0
;;
i)
dolasttime=0
Ttmp=.tmp
;;
*)
dousage=1
;;
esac
done
shift $((OPTIND-1))
if [ $# -gt 0 ]
then
dousage=1
fi
if [ $dousage -ne 0 ]
then
print -u2 "Usage: $0 [-c] [-i]"
exit 2
fi
# ensure the output format we need from date.
export LANG=C LC_ALL=C
export TZ=:UTC
if [ ! -f "${Ofile}" ]
then
print 700101000001 > "${Ofile}"
touch -t 197001010000.01 "${Ofile}"
fi
date +%y%m%d%H%M%S > "${Nfile}"
curfile=$(auditd -q)
if [ -f "$curfile" ] && ( [ -f "$curfile".Z ] || [ -f "$curfile".gz ] )
then
nowait=1
else
nowait=0
fi
auditd -dx
if [ $nowait -eq 0 ]
then
sleep 20 # give some time for the compress of the old log to finish
while [ -f "$curfile" -a -f "$curfile".Z ] || [ -f "$curfile" -a -f "$curfile".gz ]
do
sleep 2 # wait some more
done
fi
: > "${Afile}"
# V5.0's audit_tool will sort between logfiles if given multiple files.
# This is for TruCluster support. So, we take advantage of it if on V5.x.
case $(uname -r) in
?[1-4]*|[1-4]*)
for af in $(find "$Adir" -name "auditlog.*" -newer "${Ofile}" -print | sort)
do
audit_tool -S -b -t $(<"${Ofile}") -T $(<"${Nfile}") -o -Q \
$Events \
"${af}" 2>/dev/null >>"${Afile}"
# the suppressed errors are for the {un,}compressed messages
done
;;
*)
audit_tool -S -b -t $(<"${Ofile}") -T $(<"${Nfile}") -o -Q \
$Events 2>/dev/null >>"${Afile}" \
$(find "$Adir" -name "auditlog.*" -newer "${Ofile}" -print | sort)
# the suppressed errors are for the {un,}compressed messages
;;
esac
TZ=:localtime
Tfile="${Tfile}${Ttmp}"
if [ -s "${Afile}" ]
then
audit_tool $Format -Q "${Afile}" > "${Tfile}"
if [ -s "${Tfile}" ]
then
case "$Modems" in
"")
:
;;
*)
xmodems=
for modem in $Modems
do
xmodems="$xmodems -x $modem"
done
audit_tool $Format -Q $xmodems "${Afile}" \
> "${Mfile}" 2>/dev/null
if [ -s "${Mfile}" ]
then
print "\nModem line specific summary:\n" \
>>"${Tfile}"
cat "${Mfile}" >>"${Tfile}"
fi
rm -f "${Mfile}"
;;
esac
if [ $doemail -eq 1 ]
then
Mail -s 'login/out audit summary' ${Whom} < "${Tfile}"
else
print "login/out audit summary\n"
cat "${Tfile}"
print ""
fi
fi
fi
if [ $dolasttime -eq 1 ]
then
mv -f "${Nfile}" "${Ofile}"
mv -f "${Afile}" "${Afile}".$(date +%Y.%02m.%02d)
else
rm -f "${Nfile}"
mv -f "${Afile}" "${Afile}".$(date +%Y.%02m.%02d).tmp
fi
> -----Original Message-----
> From: Senulis, Joseph A
> Sent: Friday, January 31, 2003 3:18 PM
> To: 'tru64-unix-managers_at_ornl.gov'
> Subject: SUMMARY: audit log file maintenance
>
> Hi,
> I only got one response on this, from Spider.Boardman. Although he
> was addressing a slightly different issue, his code suggested some
> clarifications in my thinking. In any event, we will not use
> auditlogtrim. Instead, we will be rolling over the audit log file weekly
> and deleting logs older than several weeks, relying on our backups to
> restore those older logs when needed.
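> For illustration only (the schedule, paths, and retention period here
> are my assumptions, not part of the plan above), crontab entries along
> those lines might look like:
>
>   # switch to a fresh audit log each Sunday, using the same auditd -dx
>   # call that Spider's script uses
>   0 2 * * 0  /usr/sbin/auditd -dx
>   # remove audit logs older than five weeks; backups cover anything older
>   15 2 * * 0 find /var/audit -name 'auditlog.*' -mtime +35 -exec rm -f {} \;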
> --Joe
>
> -----Original Message-----
> From: Senulis, Joseph A
> Sent: Monday, January 27, 2003 9:48 AM
> To: 'tru64-unix-managers_at_ornl.gov'
> Subject: audit log file maintenance
>
> Hi,
> What is the recommended method for maintaining the logs in
> /var/audit. I didn't see anything in the archives and the documentation
> is less than helpful.
>
> When the audit system is configured, a cron job runs
> /usr/lbin/auditlogtrim, every other month in our case. However, it
> doesn't seem to do much except roll over the log file, use up a lot of CPU
> and generate extra files. On some systems, it may run for a couple of
> days. Additionally, I have files that are more than two months old that
> never get deleted. (Aside: auditlogtrim contains code to delete old
> files, but the loop that supposedly removes files, which starts:
>
> AUDIT_TOOL="/usr/sbin/audit_tool"
> . . . . .
> FILES_TO_RM=$($AUDIT_TOOL -j $LAST_KEPT_EVENT_DATE $LOG_FILE) 2>>/dev/null
>
> doesn't seem to do anything. I note that the audit_tool manpage does not
> list -j as a valid option.)
>
> Rather than continue to trace the code, I was wondering if there was
> a better way to do things. My current thinking is to just roll over the
> log file periodically, perhaps weekly or monthly, and then just delete
> files older than a reasonable number of months. It would mean that I
> would have to work with multiple audit files when reviewing them, but that
> seems to be the case anyway.
>
> Any suggestions? I will summarize, although it may be a while before I
> can try out the suggestions. Thanks.
> --Joe
Received on Mon Feb 03 2003 - 17:28:39 NZDT