Re: Directories with 2 million files

From: masta <diz_at_linuxpowered.com>
Date: Wed, 21 Apr 2004 15:46:25 -0500
Garance A Drosihn wrote:

> At 3:09 PM -0500 4/21/04, Eric Anderson wrote:
>
>> Garance A Drosihn wrote:
>>
>> I suppose this is one of those "who needs files bigger than 2gb?"
>> things..
>
>
> Perhaps, but as a general rule we'd like our system utilities to
> at least *work* in extreme situations.  This is something I'd
> love to dig into if I had the time, but I'm not sure I have the
> time right now.
>
I'm not sure how we can improve this situation, considering that an `ls 
-l` is forced to stat every file and store that info until the time 
comes to dump it to the tty for the human operator. The memory cost 
grows linearly with the number of entries, and it seems unfixable 
unless you want to find a way to page the stat information for each 
file out to a dump file of some sort, then cat that info back to the 
operator at the end of the main loop. Even then, listing 2 million 
files will be excessive just from storing the file names for display.
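
For what it's worth, below is a minimal C sketch of the pattern in
question. It is not the actual ls(1) source; the entry struct, the
doubling array, and the size-only column are simplifications I'm
assuming for illustration. The point is just that the readdir()/lstat()
results for every entry have to stay resident until the last entry has
been seen, because column widths can't be computed before then.

#include <sys/stat.h>

#include <dirent.h>
#include <err.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* One record per directory entry; the real ls keeps much more. */
struct entry {
    char *name;
    off_t size;
};

int
main(int argc, char **argv)
{
    const char *path = argc > 1 ? argv[1] : ".";
    struct entry *list = NULL, *tmp;
    size_t n = 0, cap = 0;
    struct dirent *de;
    struct stat st;
    char buf[4096];
    int w, width = 0;
    DIR *dp;

    if ((dp = opendir(path)) == NULL)
        err(1, "opendir");

    while ((de = readdir(dp)) != NULL) {
        snprintf(buf, sizeof(buf), "%s/%s", path, de->d_name);
        if (lstat(buf, &st) == -1)
            continue;
        if (n == cap) {
            cap = cap ? cap * 2 : 1024;
            if ((tmp = realloc(list, cap * sizeof(*list))) == NULL)
                err(1, "realloc");
            list = tmp;
        }
        /* Every name and size stays resident until the print loop. */
        list[n].name = strdup(de->d_name);
        list[n].size = st.st_size;
        n++;
        /* Column width isn't known until all entries are seen. */
        w = snprintf(NULL, 0, "%jd", (intmax_t)st.st_size);
        if (w > width)
            width = w;
    }
    closedir(dp);

    for (size_t i = 0; i < n; i++)
        printf("%*jd %s\n", width, (intmax_t)list[i].size, list[i].name);
    return (0);
}

At 2 million entries, even this stripped-down version would hold
something on the order of a hundred megabytes of names and sizes; the
real `ls -l` keeps a full struct stat (and more) per entry, so it hits
the wall that much sooner.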

-Jon
Received on Wed Apr 21 2004 - 11:46:58 UTC
