On Tue, Oct 30, 2007 at 10:23:58AM +0000 I heard the voice of
Alex Zbyslaw, and lo! it spake thus:
>
> Of course, with modern systems where nroff-ing a man page takes
> negligible time and system resources, it could also be argued that
> cat-ed man pages should be a thing of the past :-)

Quite.

The slowest machine I currently have running (to get slower, I'd have
to dig in my closet) is my laptop, a 133MHz P54 Pentium with 32 megs
of RAM and a hard drive that runs in PIO mode.  It's running a
2002-vintage RELENG_4, on which the largest manpage is perlfunc(1) (at
71k).  On the first run, without the manpage in cache:

% time sh -c 'man perlfunc > /dev/null'
6.881u 0.204s 0:07.22 98.0%     173+581k 8+0io 0pf+0w

A while, but hardly an eternity.  A more typical manpage like ls(1)
takes 3 seconds.

On a less ancient machine (but still a few generations back: Athlon
1.25GHz, a few-months-old RELENG_6), the biggest manpage is perltoc(1)
at 150k.  A cold-cache run there takes just over 2 seconds.

On my workstation (dual Athlon 1.4GHz, HEAD), I've got
wireshark-filter(4) at a whopping 746k.  That takes about 8 seconds.
Second place is gcc(1) at 158k, which takes about 1 second.

So, yes; outside of rather special cases, catpages deserve to enjoy
their retirement at this point 8-}

-- 
Matthew Fuller     (MF4839)   |  fullermd_at_over-yonder.net
Systems/Network Administrator |  http://www.over-yonder.net/~fullermd/
           On the Internet, nobody can hear you scream.
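If anybody wants to try this at home, something along these lines
should turn up the biggest pages to feed to the same 'time sh -c ...'
incantation above.  The paths are an assumption on my part (stock
/usr/share/man plus the ports' /usr/local/man is the usual layout),
and du -k is just to keep the numbers readable:

% find /usr/share/man /usr/local/man -type f | xargs du -k | sort -rn | head
% time sh -c 'man <whatever came out biggest> > /dev/null'

Run the time line twice if you're curious how much of the wall time is
just pulling the page (and groff itself) off the disk; and of course
the numbers only mean anything if there isn't already a catpage
sitting around to short-circuit the formatting.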