Subject: Re: Slow DOWN, please!!!
On Wed, Apr 30, 2008 at 10:03 AM, David Miller <davem@davemloft.net> wrote:
>
> This is starting to get beyond frustrating for me.
>
> Yesterday, I spent the whole day bisecting boot failures
> on my system due to the totally untested linux/bitops.h
> optimization, which I fully analyzed and debugged.
>
> Today, I had hoped that I could get some of my own work
> done, but that's not the case.
>
> Yet another bootup regression got added within the last 24
> hours.
>
> I don't mind fixing a regression or two during the merge
> window, but THIS IS ABSOLUTELY, FUCKING, RIDICULOUS!
>
> The tree breaks every day, and it's becoming an extremely
> non-fun environment to work in.
>
> We need to slow down the merging, we need to review things
> more, we need people to test their fucking changes!
> --

Just some comments:

It's analogous to a football team: everyone has an important role to
play. And you had better let go of the ball as fast as you can,
otherwise you are going to tire yourself out quickly.

So, in a development team, if you think the workload is unequally
distributed, make noise. Or think of some means of balancing the
workload automatically - specifically in the area of change review.
(At other times it is not easy to pass the load around... e.g. what
if a bug shows up only on your machines and not on others?)

1. Generally, the more people have reviewed a piece of work, the
higher the chance that it is OK.

2. The more varied the real-world testing, the better. "Variation"
here means testing by users with different backgrounds and skills,
running different applications, and - most importantly - against
different base kernel versions where the patch is applied and tested.

3. Based on those two numbers alone, we immediately have some
measure of confidence in the patch - correct?

4. So we could put all of this on a web page - the patches
themselves, plus the reviewers/testers who have worked on each one.

When someone comes in and reviews a patch, its review counter
increases by one; likewise, the tester counter increases by one after
each test.

And I suppose everyone will tend to pick up the patches that are
less well covered by others - automatic balancing of the workload,
done in a distributed manner.

Avoid requiring too much information to be filled in, though... you
will discourage people from taking up the work; let participants
spend their precious time on reviewing instead.

So prior to consolidating sources, just by looking at the numbers
you can see how successful the consolidation will be. If a patch is
less well tested, avoid including it in the consolidation. A rough
sketch of the idea is below.
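
To make this concrete, here is a minimal sketch in Python of the
bookkeeping such a tracking page might do. It is purely illustrative:
the names (Patch, record_review, record_test, confidence) and the 0.4
threshold are assumptions of mine, not an existing tool or API.

from dataclasses import dataclass, field

@dataclass
class Patch:
    """One entry on the hypothetical tracking page."""
    title: str
    reviews: int = 0      # how many people have reviewed this patch
    bases_tested: set = field(default_factory=set)  # base kernels tested on

    def record_review(self):
        self.reviews += 1

    def record_test(self, base_kernel):
        self.bases_tested.add(base_kernel)

    def confidence(self):
        # Toy measure: review count and test variety each contribute
        # half, capped so a handful of each counts as "enough".
        # The cap of 5 is arbitrary.
        review_score = min(self.reviews, 5) / 5.0
        test_score = min(len(self.bases_tested), 5) / 5.0
        return 0.5 * review_score + 0.5 * test_score

# Deciding, from the numbers alone, whether a patch is covered well
# enough to pull into a consolidation:
p = Patch("fix bootup regression in foo")
p.record_review()
p.record_test("2.6.25")
p.record_test("2.6.25-rc9")

if p.confidence() < 0.4:
    print("'%s' is lightly covered - leave it out of this merge" % p.title)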

Please comment...
--
Regards,
Peter Teoh

