Re: Directories with 2 million files

From: Brad Knowles <brad.knowles_at_skynet.be>
Date: Fri, 23 Apr 2004 02:37:28 +0200
At 4:31 PM -0400 2004/04/22, Robert Watson wrote:

>  Unfortunately, a lot of this has to do with the desire to have programs
>  behave nicely in ways that scale well only to a limited extent.  I.e.,
>  sorting and sizing of output.  If you have algorithms that require all
>  elements in a large array be in memory, such as sorting algorithms, it's
>  inevitably going to hurt.

	Sorting routines do not necessarily have to keep everything in 
memory.  External sorting routines designed to run in very little 
memory date back to the 1940s and 1950s.  Many databases today hold 
far more data than could fit in the memory of any machine we could 
possibly build -- at least with current technology.  In some of them, 
even the indexes greatly exceed available memory.
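
	To make that concrete, here is a minimal external merge sort 
sketched in Python (purely illustrative -- the function name, the 
chunk size, and the one-name-per-line run format are my own choices, 
not anything the filesystem provides).  It sorts fixed-size chunks in 
memory, spills each sorted run to a temporary file, and then does a 
lazy k-way merge of the runs, so only one chunk plus one line per run 
is ever resident at a time:

import heapq
import itertools
import os
import tempfile

def external_sort(names, chunk_size=100000):
    """Yield `names` in sorted order while holding at most
    `chunk_size` of them in memory at once.  Assumes the names
    themselves contain no newlines."""
    run_paths = []
    try:
        it = iter(names)
        while True:
            # Pull the next chunk, sort it in memory, and spill it to
            # disk as one sorted "run", one name per line.
            chunk = list(itertools.islice(it, chunk_size))
            if not chunk:
                break
            lines = sorted(name + "\n" for name in chunk)
            fd, path = tempfile.mkstemp(text=True)
            with os.fdopen(fd, "w") as run:
                run.writelines(lines)
            run_paths.append(path)

        # Lazy k-way merge of the sorted runs; heapq.merge never holds
        # more than one line per run in memory.
        runs = [open(path) for path in run_paths]
        try:
            for line in heapq.merge(*runs):
                yield line.rstrip("\n")
        finally:
            for run in runs:
                run.close()
    finally:
        for path in run_paths:
            os.unlink(path)

Fed from something like os.listdir() (or a plain readdir() loop), 
this will sort a two-million-entry directory in bounded memory; the 
price you pay is the extra disk I/O for writing and re-reading the 
runs.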

	However, you do need to know when to switch to such 
algorithms.  I believe this might require some external support, such 
as indexes within the filesystem.  Depending on the implementation, 
that could mean changes to every application that has moderate or 
deep interaction with the filesystem -- which is a real problem.
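
	As a toy illustration of the "when to switch" problem (the 
names and the threshold below are hypothetical, and the fact that the 
threshold is an arbitrary guess is exactly the point: without help 
from the filesystem, an application has no cheap way to ask how many 
entries it is about to read):

import itertools

def sort_names(names, in_memory_limit=1000000):
    """Return an iterable of names in sorted order.  Small inputs are
    sorted in memory; anything bigger falls back to the external
    merge sort sketched above."""
    it = iter(names)
    head = list(itertools.islice(it, in_memory_limit + 1))
    if len(head) <= in_memory_limit:
        return sorted(head)
    # Too big for a comfortable in-memory sort: chain what we have
    # already buffered back onto the iterator and spill to disk.
    return external_sort(itertools.chain(head, it))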

-- 
Brad Knowles, <brad.knowles_at_skynet.be>

"They that can give up essential liberty to obtain a little temporary
safety deserve neither liberty nor safety."
     -Benjamin Franklin, Historical Review of Pennsylvania.

   SAGE member since 1995.  See <http://www.sage.org/> for more info.