32k limit to a directory?

From: Jerome M Berkman <jerry@uclink.berkeley.edu>
Date: Mon, 09 Jul 2001 21:36:11 -0700 (PDT)

Ian Veach wrote that Chris Adams reminded us that there is a 32K limit
on the number of entries in a directory which requires workarounds ...
         
We will soon have a directory with about 40,000 entries (for a product
for which we don't have source), so this worries me.
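For cases where you do control where the files land (not our situation,
since the product picks its own paths), the usual workaround is to hash
the file names into a set of subdirectories so no single directory
collects more than a few thousand entries. A minimal sketch; the
bucket_path() helper, the 64-bucket count, and the checksum are my own
illustration, not anything the product provides:

#! /usr/local/bin/perl -w

use File::Path;                 # mkpath()

# spread files across two-digit bucket subdirectories under $top so no
# single directory accumulates more than a fraction of the total files
sub bucket_path {
        my( $top, $name ) = @_;
        my $sum = 0;
        $sum += ord($_) for split( //, $name );   # cheap checksum of the name
        my $dir = sprintf( "%s/%02d", $top, $sum % 64 );
        mkpath( $dir ) unless -d $dir;
        return "$dir/$name";
}

# e.g.  open( FILE, "> " . bucket_path( "/var/tmp/big", $name ) )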

We are currently running DU 4.0F using ADVFS. I just ran a simple
script (see below) to create 75,000 files in a single directory,
and the only problem was speed. It takes roughly twice the system
time to create the 50,000th file as the 25,000th file. I've heard
Tru64 5.1 will solve this. Are there other problems with directories
with large numbers of files?

        - Jerry Berkman, UC Berkeley, jerry@uclink.berkeley.edu

#! /usr/local/bin/perl -Tw

# see how many files I can put in a directory and how long it takes

$| = 1;

for( $i = 1; $i <= 100; $i++ ) {
        doit( sprintf( "file%3.3d", $i ) );     # each call creates 1,000 files
}
exit 0;

sub doit {
        my( $init ) = @_;
        my( $ts_begin, $ts_end, $elapsed, $i, $name );
        my( $ucpu0, $scpu0, $uc0, $sc0, $ucpu1, $scpu1, $uc1, $sc1, $ucpu, $scpu );

        $ts_begin = time;
        ($ucpu0,$scpu0,$uc0,$sc0) = times();    # user/system CPU seconds so far (child times unused)
        for( $i = 0; $i <= 999; $i++ ) {
                $name = sprintf( "$init.%3.3d", $i );
                open( FILE, "> /var/tmp/big/$name" ) or die "can't write $name";
                print FILE "sample line for $name\n";
                close FILE;
        }
        $ts_end = time;
        ($ucpu1,$scpu1,$uc1,$sc1) = times();
        $ucpu = sprintf( "%3.1f", $ucpu1 - $ucpu0);
        $scpu = sprintf( "%3.1f", $scpu1 - $scpu0);
        $elapsed = $ts_end - $ts_begin;
        print "$init took $ucpu u, $scpu s, $elapsed elapsed\n";
}
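
One follow-on note: with this many files a shell glob such as
rm /var/tmp/big/* can overflow the argument list, so cleanup is easier
with readdir (or find). A minimal sketch to count and remove the test
files, assuming the same /var/tmp/big scratch directory as above:

#! /usr/local/bin/perl -w

# count and delete the test files without relying on shell globbing
opendir( DIR, "/var/tmp/big" ) or die "can't open /var/tmp/big";
my $count = 0;
while( defined( my $entry = readdir( DIR ) ) ) {
        next if $entry eq "." or $entry eq "..";
        unlink( "/var/tmp/big/$entry" ) or warn "can't unlink $entry";
        $count++;
}
closedir( DIR );
print "removed $count files\n";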