remedies for frustration
with speed, quality, and reviews
Martin Pool
linux.conf.au 2013
performance
<jelmer> rewriting is often a good idea, it just never works out
design for scaling
deterministic effort tests
performance test tips
Use tc qdisc (netem) to locally simulate a predictable slow network.
Write to /proc/sys/vm/drop_caches between runs to reduce cache-state variation.
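A minimal sketch of both tips as commands (needs root; the loopback interface and the 50ms figure are assumptions for illustration, not from the talk):

```shell
# Add a fixed netem delay on loopback so every benchmark run sees the
# same latency; 50ms applies to each direction, so ~100ms round trip.
sudo tc qdisc add dev lo root netem delay 50ms

# ... run the network benchmark here ...

# Remove the qdisc to restore normal loopback behaviour.
sudo tc qdisc del dev lo root netem

# Between runs, flush the page cache, dentries and inodes (the "3")
# so each benchmark starts cold instead of inheriting cache state.
sync
echo 3 | sudo tee /proc/sys/vm/drop_caches
```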
fast Python
be careful of caches
quality
tested fakes
Key subsystems provide their own fake implementation.
Use testscenarios to run the same tests against all implementations.
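The testscenarios library expands one test class into a separately named test per scenario; as a stdlib-only sketch of the same pattern, here is a contract test run against both a fake and a file-backed backend via unittest's subTest. The "store" interface and both backends are hypothetical, invented for illustration:

```python
import json
import os
import tempfile
import unittest


class FakeStore:
    """In-memory fake: fast and deterministic, safe for unit tests."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data[key]


class FileStore:
    """File-backed stand-in for the 'real' implementation."""

    def __init__(self):
        fd, self._path = tempfile.mkstemp(suffix=".json")
        os.close(fd)
        with open(self._path, "w") as f:
            json.dump({}, f)

    def put(self, key, value):
        with open(self._path) as f:
            data = json.load(f)
        data[key] = value
        with open(self._path, "w") as f:
            json.dump(data, f)

    def get(self, key):
        with open(self._path) as f:
            return json.load(f)[key]


class StoreContractTests(unittest.TestCase):
    # testscenarios would generate one named test per scenario;
    # subTest at least pushes every backend through the same contract.
    scenarios = [("fake", FakeStore), ("file", FileStore)]

    def test_put_then_get(self):
        for name, factory in self.scenarios:
            with self.subTest(scenario=name):
                store = factory()
                store.put("answer", 42)
                self.assertEqual(store.get("answer"), 42)
```

Run with `python -m unittest`; any divergence between the fake and the real implementation shows up as a failing scenario rather than a production surprise.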
Done Done
A ramp for testing
high-speed delivery
<jml> coding leads to deployment; deployment leads to waiting; waiting leads to anger; anger leads to ...
look for latency before work is
- reviewed
- merged to mainline
- into a release
- built into binaries
- on the production server
- in the OS or auto-update channel
- supported by all related systems
- discovered by users
- working properly...
ordering ≫ scheduling
bug management
bugs: now, soon, never
black and white bugs
community
a model of developer growth
a bad code review
the purpose of review
to ensure people do not add trailing whitespace?
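One answer the question invites (my illustration, not the talk's): let a machine catch the nits so human review time goes to design. `git diff --check` mechanically flags trailing whitespace; installed as a `.git/hooks/pre-commit` hook it rejects such commits outright. A self-contained demo in a throwaway repo:

```shell
#!/bin/sh
# Demo: 'git diff --check' flags whitespace errors mechanically.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'empty base commit'
# A staged file with a trailing space at the end of the line:
printf 'int main() { return 0; } \n' > hello.c
git add hello.c
# --check exits nonzero and names the offending line, as a
# pre-commit hook would before any human ever sees the patch.
if ! git diff --cached --check; then
    echo "a pre-commit hook would reject this commit"
fi
```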
enjoyable first patches
Prompt, helpful reviews.
Don't move the goalposts.
Not too much refactoring.
Prefer integration tests.
patch piloting
but, but...!
go and try these:
thanks!
lifeless, jml, flacoste, jelmer, jameinel, statik, vila, gz,
spiv, ianc, cinerama, abentley, stevea, sabdfl, ...
stable APIs or stable code?