Directories with 2 million files

From: Eric Anderson <anderson_at_centtech.com>
Date: Wed, 21 Apr 2004 08:42:53 -0500
First, let me say that I am impressed (but not shocked) - FreeBSD 
quietly handled my building of a directory with 2055476 files in it.  
I'm not sure if there is a limit to this number, but at least we know it 
works up to 2 million.  I'm running 5.2.1-RELEASE.

However, several tools seem to choke on that many files - mainly ls and 
du.  Find works just fine.  Here's what my directory looks like (from 
the parent):

drwxr-xr-x   2 anderson  anderson  50919936 Apr 21 08:25 data

and when I cd into that directory, and do an ls:

$ ls -al | wc -l
ls: fts_read: Cannot allocate memory
       0

Watching memory usage, ls climbs to about 515 MB, runs out of memory 
(it can't be swapped out), and then dies. (I only have 768 MB in this 
machine.)

du does the exact same thing.

find, however, works fine (and is very fast!):
$ time find .  | wc -l
 2055476

real    0m3.589s
user    0m2.501s
sys     0m1.073s


I'd work on some patches, but I'm not worth much when it comes to C/C++. 

If someone has some patches, or code to try, let me know - I'd be more 
than willing to test, possibly even give out an account on the machine.

Eric





-- 
------------------------------------------------------------------
Eric Anderson     Sr. Systems Administrator    Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------
Received on Wed Apr 21 2004 - 04:43:05 UTC

This archive was generated by hypermail 2.4.0 : Wed May 19 2021 - 11:37:51 UTC