Dear all:

As we approach 8.0 BETA3, now would be a really good time to start identifying functional and performance regressions from the 7.x series. We've done a mixed job at this in the past, but when it comes to "I'll do better some day", there's no time like the present :-). Some notes:

Functional testing

We actually have a sizable regression test suite in src/tools/regression -- some of these tools will have broken as a result of 8-CURRENT development, so the first task may be to fix them. The next is to run them on 7-STABLE and 8-CURRENT and decide whether things have gotten worse -- or whether perhaps we have a bug to fix in both. Pick the tool of your choice and give it a spin. More than one person per tool is fine, because that way we get more diverse testing.

Performance testing

On the performance front, life is always a bit more tricky -- performance testing is a subtle art. However, the clearest lessons are that (a) testing with diverse workloads in diverse environments is extremely important, (b) you should do multiple runs and use ministat(1) to analyze the results, and (c) you want to compare apples with apples -- use the same hardware and configuration wherever possible. Watch out for annoying nits such as partition layout affecting I/O throughput for two different installs on the same disk.

Pick something that matters to you -- bytes/sec over TCP on loopback, web hits/sec, NFS ops/sec, disk I/O transactions/sec -- and do some comparison between 7.2 and 8.0. If you find an improvement -- great! If you find a regression, please start a thread on current@ to help get it diagnosed. And if you want help doing performance measurement for a particular workload that isn't well studied, send some e-mail to performance@ to ask for advice on how best to measure it.

Thanks,

Robert N M Watson
Computer Laboratory
University of Cambridge
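As a starting point for the loopback TCP case mentioned above, here is a minimal sketch of a measurement harness in Python. It runs several trials, each pushing a fixed amount of data over a loopback TCP connection, and prints one throughput figure per line, so the output of a 7.2 run and an 8.0 run can be saved to files and compared with ministat(1). The trial count, transfer size, and file names are illustrative choices, not part of any FreeBSD tool.

#!/usr/bin/env python
# Hypothetical loopback TCP throughput micro-benchmark (sketch only).
# Run the same script on 7.2 and 8.0, save the output of each run to a
# file, and compare the two files with ministat(1).

import socket
import threading
import time

TRIALS = 10                           # multiple runs, so ministat has samples
BYTES_PER_TRIAL = 256 * 1024 * 1024   # 256 MB pushed over loopback per trial
CHUNK = 64 * 1024                     # per-write buffer size

def sink(sock):
    # Read and discard everything the sender writes; recv() returns an
    # empty string at EOF, which ends the loop.
    while sock.recv(CHUNK):
        pass

def one_trial():
    # Return loopback TCP throughput in bytes/sec for one run.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("127.0.0.1", 0))
    listener.listen(1)
    port = listener.getsockname()[1]

    sender = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sender.connect(("127.0.0.1", port))
    receiver, _ = listener.accept()

    t = threading.Thread(target=sink, args=(receiver,))
    t.start()

    buf = b"x" * CHUNK
    start = time.time()
    sent = 0
    while sent < BYTES_PER_TRIAL:
        sender.sendall(buf)
        sent += len(buf)
    sender.close()          # FIN tells the sink thread we're done
    t.join()                # wait until the receiver has drained everything
    elapsed = time.time() - start

    receiver.close()
    listener.close()
    return sent / elapsed

if __name__ == "__main__":
    for i in range(TRIALS):
        # One number per line: the input format ministat(1) expects.
        print("%.0f" % one_trial())

A possible workflow, with the file names chosen here purely for illustration: save the output on each system as results-7.2.txt and results-8.0.txt, then run "ministat results-7.2.txt results-8.0.txt" to see whether any difference between the two is statistically significant rather than run-to-run noise.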