Re: Directories with 2 million files

From: Eric Anderson <anderson_at_centtech.com>
Date: Wed, 21 Apr 2004 15:51:38 -0500
masta wrote:

> Garance A Drosihn wrote:
>
>> At 3:09 PM -0500 4/21/04, Eric Anderson wrote:
>>
>>> Garance A Drosihn wrote:
>>>
>>> I suppose this is one of those "who needs files bigger than 2GB?"
>>> things...
>>
>>
>>
>> Perhaps, but as a general rule we'd like our system utilities to
>> at least *work* in extreme situations.  This is something I'd
>> love to dig into if I had the time, but I'm not sure I have the
>> time right now.
>>
> I'm not sure how we can improve this situation, considering that an 
> `ls -l` is forced to stat every file and store that info until the 
> time comes to dump it to the tty for the human operator. The problem 
> grows with the number of entries, and seems unfixable unless you want 
> to find a way to page out the stat information for each file to a 
> dump file of some sort, then cat that info back to the operator when 
> the main loop finishes. Even then, listing 2 million files will be 
> excessive just in terms of holding the file names for display.
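
Just to illustrate the point, here is a rough sketch (not ls(1)'s actual
source, and it skips most error handling) of the streaming alternative:
stat and print each entry as readdir() hands it back, in raw directory
order, so only one entry lives in memory at a time.  The sort that ls
does by default is exactly what forces it to buffer everything first.

#include <sys/stat.h>
#include <dirent.h>
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>

int
main(int argc, char *argv[])
{
	const char *path = argc > 1 ? argv[1] : ".";
	DIR *dp;
	struct dirent *de;
	struct stat sb;

	/* Work from inside the directory so lstat() sees the bare name. */
	if (chdir(path) == -1 || (dp = opendir(".")) == NULL) {
		perror(path);
		return (1);
	}
	/* One entry in memory at a time: stat it, print it, forget it. */
	while ((de = readdir(dp)) != NULL) {
		if (lstat(de->d_name, &sb) == -1)
			continue;
		printf("%8ju %s\n", (uintmax_t)sb.st_size, de->d_name);
	}
	closedir(dp);
	return (0);
}

That is more or less what 'ls -f' buys you for plain names; 'ls -l'
also wants to size its columns against every entry before printing,
which I believe is one more reason it keeps the whole list around.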


At a bare minimum, du should work, if you ask me.  ls is almost a 
separate issue - about the only time you need to 'ls' a directory with 
that many files is when you want to use them in a script.  I did it 
mostly out of curiosity, but du is an essential tool in this case.
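
For what it's worth, the same trick applies to the du case.  A minimal
sketch (an assumption-laden illustration, not du(1)'s actual code) using
fts(3): hand fts_open() a NULL comparison function so nothing gets
sorted, and just keep a running total of st_blocks.  Memory stays flat
no matter how many entries are in the directory.

#include <sys/stat.h>
#include <fts.h>
#include <stdint.h>
#include <stdio.h>

int
main(int argc, char *argv[])
{
	char *dflt[] = { ".", NULL };
	char **paths = argc > 1 ? argv + 1 : dflt;
	FTS *ftsp;
	FTSENT *p;
	uintmax_t blocks = 0;

	/* NULL comparator: no sorting, so no need to buffer entries. */
	if ((ftsp = fts_open(paths, FTS_PHYSICAL, NULL)) == NULL) {
		perror("fts_open");
		return (1);
	}
	while ((p = fts_read(ftsp)) != NULL) {
		switch (p->fts_info) {
		case FTS_D:		/* count directories on the way back up */
			break;
		case FTS_DP:
		case FTS_F:
		case FTS_SL:
		case FTS_DEFAULT:
			blocks += p->fts_statp->st_blocks;
			break;
		default:
			break;
		}
	}
	fts_close(ftsp);
	printf("%ju\t512-byte blocks\n", blocks);
	return (0);
}

As far as I know du(1) itself sits on top of fts(3), so it should hold
up in about the same way.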

Eric

-- 
------------------------------------------------------------------
Eric Anderson     Sr. Systems Administrator    Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------
Received on Wed Apr 21 2004 - 11:51:52 UTC
