Re: Directories with 2 million files

From: Peter Jeremy <PeterJeremy_at_optushome.com.au>
Date: Sat, 24 Apr 2004 10:12:31 +1000
On Fri, Apr 23, 2004 at 08:27:22AM -0500, Eric Anderson wrote:
>Resource limits (current):
> datasize           524288 kb
>
>Ouch!  That seems pretty low to me.  1GB would be closer to reasonable 
>if you ask me, but I'm nobody, so take it with a grain of salt.

Why do you feel this is low?  The intent of this limit is to stop a
runaway process from eating all of RAM and swap.  It is probably a
reasonable default for a workstation or server (X, KDE, Mozilla and
OpenOffice are each unlikely to exceed 512MB during normal use).
People with atypical requirements will need to do some tuning.
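
For anyone tuning: a process can inspect and, up to the hard limit,
raise its own datasize with getrlimit(2)/setrlimit(2).  A minimal
sketch (mine, not from the original thread):

    #include <sys/resource.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        struct rlimit rl;

        /* RLIMIT_DATA is the "datasize" limit reported by limits(1)
           and "ulimit -d". */
        if (getrlimit(RLIMIT_DATA, &rl) == -1) {
            perror("getrlimit");
            return 1;
        }
        printf("datasize soft=%ju hard=%ju\n",
            (uintmax_t)rl.rlim_cur, (uintmax_t)rl.rlim_max);

        /* Raise the soft limit to the hard limit; only the superuser
           may raise the hard limit itself. */
        rl.rlim_cur = rl.rlim_max;
        if (setrlimit(RLIMIT_DATA, &rl) == -1) {
            perror("setrlimit");
            return 1;
        }
        return 0;
    }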

I agree that the defaults mean you can't run 'ls' on a directory with
2e6 files, but that isn't a typical requirement.

Upping the default limit to 1GB increases the risk that a runaway
process will make the machine unusable (consider how a machine with
768MB of RAM would behave if you raised datasize to 1GB and then ran
ls on a directory with just under 4e6 files).
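
(The back-of-envelope behind that figure, assuming ls needs roughly
256 bytes of heap per entry in practice:

    4e6 entries * 256 bytes/entry = 1.024e9 bytes ~ 1GB

so ls alone would grow to the full 1GB datasize on a machine with
only 768MB of RAM, forcing everything else out to swap.)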

As for ls(1), its theoretical memory requirement for 'ls -lsio' is on
the order of 32 bytes per entry, plus the size of the directory.  If
you don't require sorting or column alignment, it should be reasonably
easy to avoid storing anything at all (see the sketch below), but
beyond that, the code complexity starts to increase significantly.
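
To illustrate the unsorted case, a lister that prints entries as it
reads them runs in constant memory no matter how large the directory
is.  A minimal sketch (again mine, with no sorting and no column
alignment):

    #include <dirent.h>
    #include <stdio.h>

    /* Print each entry as readdir(3) returns it, storing nothing.
       Sorting or column alignment would force buffering every name,
       which is where the per-entry memory cost comes from. */
    int main(int argc, char **argv)
    {
        DIR *d;
        struct dirent *dp;
        const char *path = argc > 1 ? argv[1] : ".";

        if ((d = opendir(path)) == NULL) {
            perror(path);
            return 1;
        }
        while ((dp = readdir(d)) != NULL)
            puts(dp->d_name);
        closedir(d);
        return 0;
    }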

-- 
Peter Jeremy