defragment causes system hang on 4.0e

From: Bevan Broun <bevanb_at_ee.uwa.edu.au>
Date: Wed, 14 Apr 1999 11:07:24 +0800

Hi
I have been trying to run defragment on an AdvFS domain that lives on a
single 20GB partition of a RAID. The output of showfdmn and showfsets is
appended to this mail. I am running the command:

/usr/sbin/defragment -v ee

via a cron job that starts at 2am. The system hangs about 7 hours
later! I know that defragmentation does occur, because when I run
defragment -nv afterwards I see the stats improve (see output). The
only messages I can find are:

Apr 14 08:39:56 fs1 snmpd[494]: Closing subagent svrMgt_mib, reason: 0
Apr 14 08:46:09 fs1 snmpd[494]: Sendto failed: 2
Apr 14 08:46:09 fs1 snmpd[494]: Closing subagent advfsd, reason: 0

These messages start occurring just after the defragment starts -
there is nothing special in the logs just before the hang. I suspect
that defragment is doing something as it finishes, and that this is
what hangs the system.
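For completeness, the cron setup is nothing exotic; the root crontab
entry is along these lines (the log path here is illustrative, not the
exact one in use):

```shell
# root crontab entry: start the verbose defragment at 2am daily,
# capturing stdout and stderr to a log file for later inspection
0 2 * * * /usr/sbin/defragment -v ee >> /var/adm/defragment.log 2>&1
```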

HELP

BB
-- 
Bevan Broun                                           ph (08) 9380 1587
Computer Systems Officer                             fax (08) 9380 1065
Dept. Electrical and Electronic Engineering      
University of Western Australia                                 rm. G70
======COMMAND OUTPUT=================================
root_at_fs1>cat /tmp/defragment-nv.out
defragment: Gathering data for domain 'ee'
  Current domain data:
    Extents:                 221072
    Files w/extents:         170680
    Avg exts per file w/exts:  1.30
    Aggregate I/O perf:         92%
    Free space fragments:     86073
                     <100K     <1M    <10M    >10M
      Free space:     45%     39%      0%     16%
      Fragments:    75677   10392       0       4
=====================================================
root_at_fs1>showfdmn ee
               Id              Date Created  LogPgs  Domain Name
35b13f8b.00092f6a  Sun Jul 19 08:36:27 1998     512  ee
  Vol   512-Blks        Free  % Used  Cmode  Rblks  Wblks  Vol Name
   1L   41891840     8864000     79%     on    128    128  /dev/re1c
=====================================================
root_at_fs1>showfsets ee
localhost
        Id           : 35b13f8b.00092f6a.1.8001
        Files        :    23324,  SLim=        0,  HLim=        0
        Blocks (512) :  2223956,  SLim=        0,  HLim=        0
        Quota Status : user=off group=off
backup
        Id           : 35b13f8b.00092f6a.2.8001
        Files        :       21,  SLim=        0,  HLim=        0
        Blocks (512) :      294,  SLim=        0,  HLim=        0
        Quota Status : user=off group=off
staff
        Id           : 35b13f8b.00092f6a.3.8001
        Files        :    41251,  SLim=        0,  HLim=        0
        Blocks (512) :  2065736,  SLim=        0,  HLim=        0
        Quota Status : user=off group=off
ugrad
        Id           : 35b13f8b.00092f6a.4.8001
        Files        :   446794,  SLim=        0,  HLim=        0
        Blocks (512) : 20954472,  SLim= 24000000,  HLim= 25000000
        Quota Status : user=off group=off
pgrad
        Id           : 35b13f8b.00092f6a.5.8001
        Files        :    46327,  SLim=        0,  HLim=        0
        Blocks (512) :  2615736,  SLim=        0,  HLim=        0
        Quota Status : user=off group=off
guest
        Id           : 35b13f8b.00092f6a.6.8001
        Files        :    27546,  SLim=        0,  HLim=        0
        Blocks (512) :  1863122,  SLim=        0,  HLim=        0
        Quota Status : user=off group=off
units
        Id           : 35b13f8b.00092f6a.7.8001
        Files        :    16807,  SLim=        0,  HLim=        0
        Blocks (512) :   850502,  SLim=        0,  HLim=        0
        Quota Status : user=off group=off
mail
        Id           : 35b13f8b.00092f6a.8.8001
        Files        :     1481,  SLim=        0,  HLim=        0
        Blocks (512) :   827004,  SLim=        0,  HLim=        0
        Quota Status : user=off group=off
=============================================================
Received on Wed Apr 14 1999 - 03:05:03 NZST