Re: Directories with 2 million files

From: Eric Anderson <anderson_at_centtech.com>
Date: Wed, 21 Apr 2004 15:09:55 -0500
Garance A Drosihn wrote:

> At 8:42 AM -0500 4/21/04, Eric Anderson wrote:
>
>> ... I'm not sure if there is a limit to this number, but at
>> least we know it works to 2 million.  I'm running 5.2.1-RELEASE.
>>
>> However, several tools seem to choke on that many files - mainly
>> ls and du.  Find works just fine.  Here's what my directory looks
>> like (from the parent):
>>
>> drwxr-xr-x   2 anderson  anderson  50919936 Apr 21 08:25 data
>>
>> and when I cd into that directory, and do an ls:
>>
>> $ ls -al | wc -l
>> ls: fts_read: Cannot allocate memory
>>       0
>>
>> Watching memory usage, it goes up to about 515 MB, runs out of
>> memory (it can't swap it out), and then dies.  (I only have 768 MB
>> in this machine.)
>
>
> An `ls -al' is going to be doing a lot of work, most of which you
> probably do not care about.  (Certainly not if you're just piping
> it to `wc'!).  Depending on what you are looking for, an `ls -1Af'
> might work better.  If you really do want the -l (lowercase L)
> instead of -1 (digit one), it *might* help to add the -h option.
> I probably should look at the source code to see if that's really
> true, but it's so much easier to just have you type in the command
> and see what happens...

That used 263 MB before returning the correct number. It's functional,
but only if you have a lot of RAM.
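
For comparison, here's a rough sketch of counting entries with plain
readdir(3), which should stay at near-constant memory since it never
builds the whole entry list in one shot the way fts(3) apparently
does.  I pieced it together from the manpages, so treat it as a guess
at an approach rather than anything from the ls source (and note
that, like ls -a, it counts "." and ".."):

#include <stdio.h>
#include <dirent.h>

int
main(int argc, char *argv[])
{
        const char *path = (argc > 1) ? argv[1] : ".";
        DIR *dirp;
        struct dirent *dp;
        unsigned long count = 0;

        if ((dirp = opendir(path)) == NULL) {
                perror(path);
                return (1);
        }
        /* Only one entry is held in memory at a time; no sorting,
         * no stat(2) calls. */
        while ((dp = readdir(dirp)) != NULL)
                count++;
        closedir(dirp);
        printf("%lu\n", count);
        return (0);
}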

> Another option is to use the `stat' command instead of `ls'.
> (I don't know if `stat' will work any better, I'm just saying
> it's another option you might want to try...).  One advantage
> is that you'd have much better control over what information is
> printed.


I'm not sure how to use stat to get that same info.  It's not so much
that I need this particular option; it's that I believe ls should work
without gobbling hundreds of MB of memory.  This is also partly just
for information's sake.

>> du does the exact same thing.
>
>
> Just a plain `du'?  If all you want is the total, did you
> try `du -s'?  I would not expect any problem from `du -s'.


$ du -s
du: fts_read: Cannot allocate memory
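
Presumably du runs into the same fts(3) allocation.  Just as an
experiment, a flat, non-recursive stand-in for "du -s" that lstat(2)s
each name as it comes off readdir(3) might look like the sketch below.
Again, this is pieced together from the manpages, not from the du
source, and it ignores subdirectories and double-counts hard links,
so it's an approximation only:

#include <sys/stat.h>
#include <dirent.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int
main(int argc, char *argv[])
{
        const char *path = (argc > 1) ? argv[1] : ".";
        DIR *dirp;
        struct dirent *dp;
        struct stat sb;
        unsigned long long blocks = 0;

        /* chdir first so lstat(2) can take the bare entry names. */
        if (chdir(path) == -1 || (dirp = opendir(".")) == NULL) {
                perror(path);
                return (1);
        }
        while ((dp = readdir(dirp)) != NULL) {
                if (strcmp(dp->d_name, ".") == 0 ||
                    strcmp(dp->d_name, "..") == 0)
                        continue;
                if (lstat(dp->d_name, &sb) == 0)
                        blocks += sb.st_blocks;
        }
        closedir(dirp);
        /* st_blocks is in 512-byte units; print 1K-blocks like du -k. */
        printf("%llu\n", blocks / 2);
        return (0);
}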

>> I'd work on some patches, but I'm not worth much when it comes
>> to C/C++.   If someone has some patches, or code to try, let me
>> know - I'd be more than willing to test, possibly even give out
>> an account on the machine.
>
>
> It is probably possible to make `ls' behave better in this
> situation, though I don't know how much of a special-case
> we would need to make it.


I suppose this is one of those "who needs files bigger than 2 GB?"
things...

Eric


-- 
------------------------------------------------------------------
Eric Anderson     Sr. Systems Administrator    Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------