Re: Directories with 2million files

From: Eric Anderson <anderson_at_centtech.com>
Date: Wed, 21 Apr 2004 10:11:04 -0500
Tim Robbins wrote:

>On Wed, Apr 21, 2004 at 08:42:53AM -0500, Eric Anderson wrote:
>
>  
>
>>First, let me say that I am impressed (but not shocked) - FreeBSD 
>>quietly handled my building of a directory with 2055476 files in it.  
>>I'm not sure if there is a limit to this number, but at least we know it 
>>works to 2million.  I'm running 5.2.1-RELEASE.
>>
>>However, several tools seem to choke on that many files - mainly ls and 
>>du.  Find works just fine.  Here's what my directory looks like (from 
>>the parent):
>>
>>drwxr-xr-x   2 anderson  anderson  50919936 Apr 21 08:25 data
>>
>>and when I cd into that directory, and do an ls:
>>
>>$ ls -al | wc -l
>>ls: fts_read: Cannot allocate memory
>>      0
>>    
>>
>
>The problem here is likely to be that ls is trying to store all the
>filenames in memory in order to sort them. Try using the -f option
>to disable sorting. If you really do need a sorted list of filenames,
>pipe the output through 'sort'.
>  
>
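In other words, ls's failure amounts to something like this (a rough Perl 
sketch of the idea, not ls's actual fts-based code) - every name has to 
sit in memory before the first sorted line can be printed:

opendir(my $dh, ".") or die "opendir: $!";
my @names = readdir($dh);        # ~2 million names pile up in this array
closedir($dh);
print "$_\n" for sort @names;    # nothing prints until the sort finishes
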
Doing 'ls -f' works, but it still munches up about 260MB of RAM, which 
is fine here since I have enough, but otherwise it would not run.  An 
'ls -alf' does not work (I assume because it is trying to sum the total 
size used by all the files before printing anything).  I just noticed 
that find also eats up the same amount of memory before it prints the list.

This perl script does it in about 2.5 seconds, with minimal memory:
opendir(my $dh, ".") or die "opendir: $!";
my $count = 0;
# One entry at a time; no list of names is ever built up in memory.
while (defined(my $file = readdir($dh))) {
        $count++;
}
closedir($dh);
print "$count\n";


Eric



-- 
------------------------------------------------------------------
Eric Anderson     Sr. Systems Administrator    Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------