Re: Directories with 2 million files [ CORRECTION ]

From: Matt Emmerton <matt_at_gsicomp.on.ca>
Date: Wed, 21 Apr 2004 16:58:41 -0400
> > At 3:09 PM -0500 4/21/04, Eric Anderson wrote:
> > >Garance A Drosihn wrote:
> > >
> > >>... If you really do want the -l (lowercase L)
> > >>instead of -1 (digit one), it *might* help to add the -h option.
> > >
> > >Used 263MB before returning the correct number.  It's functional,
> > >but only if you have a lot of RAM.
> >
> > Darn.  Well, that was a bit of a long-shot, but worth a try.
> >
> > >>Another option is to use the `stat' command instead of `ls'.
> > >>One advantage is that you'd have much better control over
> > >>what information is printed.
> > >
> > >I'm not sure how to use stat to get that same info.
> >
> > Oops.  My fault.  I thought the `stat' command had an option to
> > list all files in a given directory.  I guess you'd have to combine
> > it with `find' to do that.
> >
> > >>>du does the exact same thing.
> > >>
> > >>Just a plain `du'?  If all you want is the total, did you
> > >>try `du -s'?  I would not expect any problem from `du -s'.
> > >
> > >$ du -s
> > >du: fts_read: Cannot allocate memory
> >
> > Huh.  Well, that seems pretty broken...
> >
> > >>>I'd work on some patches, but I'm not worth much when it comes
> > >>>to C/C++.   If someone has some patches, or code to try, let me
> > >>>know - I'd be more than willing to test, possibly even give out
> > >>>an account on the machine.
> > >>
> > >>
> > >>It is probably possible to make `ls' behave better in this
> > >>situation, though I don't know how much of a special-case
> > >>we would need to make it.
> > >
> > >
> > >I suppose this is one of those "who needs files bigger than 2GB?"
> > >things.
> >
> > Perhaps, but as a general rule we'd like our system utilities to
> > at least *work* in extreme situations.  This is something I'd
> > love to dig into, but I'm not sure I have the time right now.
>
> A quick inspection of the code (on my 4.9-REL-p2 machine) shows that
> there's a memory leak: the following malloc() does not have a matching
> free().  I looked at the latest version in CVS (rev 1.76) and this
> hasn't been fixed.
>
>     537                 /* Fill-in "::" as "0:0:0" for the sake of scanf. */
>     538                 jinitmax = initmax2 = malloc(strlen(initmax) * 2 + 2);
>
> We need to add a free() for this; here's my suggestion.
>
>     596                 maxinode = makenines(maxinode);
>     597                 maxblock = makenines(maxblock);
>     598                 maxnlink = makenines(maxnlink);
>     599                 maxsize = makenines(maxsize);
>     600
>     601                 free(jinitmax); /* MTE */
>     602
>     603         }

Reading the code a bit more carefully, this won't fix the problem Eric is
seeing, but it is still a problem for anyone who has "LS_COLWIDTHS" defined
in their environment.
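
For anyone who doesn't want to open ls.c, here is a minimal, self-contained
sketch of the pattern (paraphrased -- this is not the real ls code, just
borrowed variable names and a made-up LS_COLWIDTHS value): the scratch
buffer exists only so that empty fields can be rewritten as "0" for
sscanf(), and it can be released as soon as the widths have been parsed.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int
    main(void)
    {
        const char *initmax = "10::8"; /* stand-in for getenv("LS_COLWIDTHS") */
        char *initmax2, *jinitmax;
        const char *p;
        unsigned long maxinode, maxblock, maxnlink;

        /* Worst case doubles the string: "::" becomes ":0:". */
        jinitmax = initmax2 = malloc(strlen(initmax) * 2 + 2);
        if (jinitmax == NULL)
            return (1);

        /* Fill in empty fields as "0" for the sake of scanf. */
        if (*initmax == ':')
            *initmax2++ = '0';
        for (p = initmax; *p != '\0'; p++) {
            *initmax2++ = *p;
            if (p[0] == ':' && (p[1] == ':' || p[1] == '\0'))
                *initmax2++ = '0';
        }
        *initmax2 = '\0';

        if (sscanf(jinitmax, "%lu:%lu:%lu",
            &maxinode, &maxblock, &maxnlink) == 3)
            printf("%lu %lu %lu\n", maxinode, maxblock, maxnlink);

        free(jinitmax);  /* the missing matching free() */
        return (0);
    }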
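
That said, for the problem Eric *is* seeing, anything that builds the whole
listing in memory (ls sorting its entries, du/fts tracking per-directory
state) will hurt on 2 million files.  A plain readdir(3)/lstat(2) loop that
keeps only running totals stays in constant memory; here is a rough,
untested sketch along those lines (it counts apparent sizes and doesn't
recurse, so it's no du replacement, but it should survive that directory):

    #include <sys/stat.h>
    #include <dirent.h>
    #include <limits.h>
    #include <stdio.h>
    #include <string.h>

    int
    main(int argc, char *argv[])
    {
        const char *dir = argc > 1 ? argv[1] : ".";
        char path[PATH_MAX];
        struct dirent *dp;
        struct stat sb;
        unsigned long long nfiles = 0, bytes = 0;
        DIR *dirp;

        if ((dirp = opendir(dir)) == NULL) {
            perror(dir);
            return (1);
        }
        /* One entry in memory at a time; only the totals accumulate. */
        while ((dp = readdir(dirp)) != NULL) {
            if (strcmp(dp->d_name, ".") == 0 ||
                strcmp(dp->d_name, "..") == 0)
                continue;
            snprintf(path, sizeof(path), "%s/%s", dir, dp->d_name);
            if (lstat(path, &sb) == 0) {
                nfiles++;
                bytes += sb.st_size;
            }
        }
        closedir(dirp);
        printf("%llu files, %llu bytes\n", nfiles, bytes);
        return (0);
    }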

--
Matt Emmerton
Received on Wed Apr 21 2004 - 12:02:06 UTC
