Subject: Re: Slow DOWN, please!!!
From: Nigel Cunningham <>
Date: Thu, 01 May 2008 12:31:15 +1000
Hi.
On Wed, 2008-04-30 at 18:40 -0700, Linus Torvalds wrote:

> The thing is, the quality of individual patches isn't what matters! What
> matters is the quality of the end result. And people are going to be a lot
> more involved in looking at, testing, and working with code that is
> merged, rather than code that isn't.
No. People generally expect that merged code works, so they don't look at it unless they're forced to (by a bug, or by the need to make further modifications in that area), and they don't explicitly set out to test it. They just use it.
When it doesn't work, some of us will go looking for the cause; others (most?) will simply roll back to whatever they last found to be reliable.
Out-of-tree code has the same issues.
The only time code really gets looked at and tested is when there's a problem, or when people explicitly choose to inspect it (in pre-merge review, for example).
So my answer to the "how do we raise quality" question would be this: when writing the code, we put time and effort into properly analysing the problem and developing a solution, we put the same care into testing that solution, and we include code that helps the end-user help us debug issues later (without them necessarily needing to git-bisect). After all, good software isn't the result of random (or semi-random), unconsidered modifications, but of planning, thought and attention to detail.
In other words, I'm arguing that the speed of merging should be irrelevant. What's relevant is the quality of the work done in the first place.
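To make the "help the end-user help us" point concrete, here's a minimal sketch in kernel-style C. Everything in it is hypothetical, invented purely for illustration, and not from any real driver:

/*
 * Hypothetical example: when a driver detects an inconsistency, it
 * dumps enough of its own state that a bug report alone can point to
 * the cause, without the reporter having to git-bisect.
 */
#include <linux/kernel.h>
#include <linux/bug.h>
#include <linux/errno.h>

struct frob_state {
	int stage;		/* current state-machine stage */
	unsigned long seq;	/* last sequence number handled */
};

static int frob_advance(struct frob_state *s)
{
	if (s->stage < 0 || s->stage > 3) {
		/* Print the state the user would need to report, then
		 * warn once with a backtrace. */
		printk(KERN_WARNING "frob: bad stage %d (seq=%lu)\n",
		       s->stage, s->seq);
		WARN_ON_ONCE(1);
		return -EINVAL;
	}
	s->stage++;
	s->seq++;
	return 0;
}

The specifics don't matter; the point is that the failure path itself hands the user the information we'd otherwise have to ask them to bisect for.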
If you want better-quality code, penalise the people who get buggy code merged. Give them a reason to get it into a better state before they try to merge it. Of course, Linus alone can't do that.
Nigel