Re: Directories with 2 million files

From: Eric Anderson <anderson_at_centtech.com>
Date: Thu, 22 Apr 2004 11:27:15 -0500
David O'Brien wrote:

>On Wed, Apr 21, 2004 at 10:11:04AM -0500, Eric Anderson wrote:
>
>>Doing 'ls -f' works, but still manages to munch up about 260MB of RAM,
>>which works since I have enough, but otherwise would not.
>
>It used 260MB of VM, not physical RAM.  Even with less in your machine, it
>would have worked fine -- no one is going to have less than that much
>virtual memory (i.e., swap) if they run Netscape on the same machine.
>
Ok - here's the snippet from 'top':

Mem: 268M Active, 147M Inact, 155M Wired, 32M Cache, 86M Buf, 144M Free
Swap: 1024M Total, 2356K Used, 1022M Free

  PID USERNAME PRI NICE   SIZE    RES STATE    TIME   WCPU    CPU COMMAND
36102 anderson 132    0   263M   263M RUN      0:02 68.63%  9.57% ls
36103 anderson 119    0  1180K   560K RUN      0:00  5.60%  0.78% wc
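
For what it's worth, the footprint shouldn't have to scale with the directory
size.  Here's a minimal sketch -- not FreeBSD's ls, which goes through fts(3)
and may still collect entries even with -f -- of a plain readdir(3) loop that
prints each name as it is read, so memory stays flat no matter how many files
the directory holds:

/* listdir.c - minimal sketch, not FreeBSD ls: stream entries one at a
 * time with readdir(3) instead of collecting them all first. */
#include <dirent.h>
#include <stdio.h>

int
main(int argc, char **argv)
{
        const char *path = argc > 1 ? argv[1] : ".";
        DIR *d = opendir(path);
        struct dirent *de;

        if (d == NULL) {
                perror(path);
                return (1);
        }
        /* No sorting, no buffering: each name is printed and forgotten. */
        while ((de = readdir(d)) != NULL)
                printf("%s\n", de->d_name);
        closedir(d);
        return (0);
}

Compiled and pointed at the same directory, something like that should stay
in the low-kilobyte range regardless of the entry count.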

However, I'm not sure about the Netscape comment - I don't really know
what you are referring to, but I'd guess a person with 2 million files
in one directory most likely isn't going to be running Netscape on that
machine anyhow.

Anyway, I believe the real issue is du.  Not being able to run du on a
directory with 2 million files seems like a bad thing.
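
To make concrete what a low-memory traversal could look like, here's a rough
sketch along the same lines: lstat(2) each entry as readdir(3) hands it back
and accumulate st_blocks.  This is emphatically not how du and fts(3) are
actually implemented, it only covers a single flat directory (no recursion,
no hard-link handling), and the names are made up -- it's just an
illustration that counting 2 million files doesn't inherently need much
memory:

/* dusketch.c - hypothetical flat-directory disk usage counter; not
 * FreeBSD du.  Accumulates st_blocks one entry at a time. */
#include <sys/types.h>
#include <sys/stat.h>
#include <dirent.h>
#include <stdio.h>
#include <string.h>

int
main(int argc, char **argv)
{
        const char *dir = argc > 1 ? argv[1] : ".";
        DIR *d = opendir(dir);
        struct dirent *de;
        struct stat sb;
        char path[1024];
        unsigned long long blocks = 0;  /* 512-byte blocks, as in st_blocks */

        if (d == NULL) {
                perror(dir);
                return (1);
        }
        while ((de = readdir(d)) != NULL) {
                if (strcmp(de->d_name, ".") == 0 ||
                    strcmp(de->d_name, "..") == 0)
                        continue;
                snprintf(path, sizeof(path), "%s/%s", dir, de->d_name);
                if (lstat(path, &sb) == 0)
                        blocks += sb.st_blocks;
        }
        closedir(d);
        printf("%llu\t%s\n", blocks / 2, dir);  /* total in 1K blocks */
        return (0);
}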

Eric


-- 
------------------------------------------------------------------
Eric Anderson     Sr. Systems Administrator    Centaur Technology
Today is the tomorrow you worried about yesterday.
------------------------------------------------------------------
