Friday, March 19, 2010

Continuous testing, rapid deployment, etc

There has been some progress when it comes to testing at work. It's a pretty huge step from "no tests" to a full-fledged test-everything system. There are a huge number of types of tests that can be made: everything from unit testing (testing individual methods and mocking the world around them) to ad hoc testing, integration testing, smoke testing, and system tests. Tests of frameworks, of UIs, of applications, of databases, of networks, of performance, of usability, of anything and everything.
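To make "mocking the world around them" concrete, here is a minimal sketch in Python using the standard library's `unittest.mock`. The `greeting` function and its clock dependency are invented for illustration; they aren't from any real code base:

```python
from unittest import mock

# Hypothetical code under test (names are illustrative only):
def greeting(clock):
    """Return a greeting based on the injected clock's current hour."""
    return "Good morning" if clock.hour() < 12 else "Good afternoon"

# Unit test: mock the world around the method, so only greeting()
# itself is exercised -- no real clock, no real time of day.
def test_greeting():
    fake_clock = mock.Mock()
    fake_clock.hour.return_value = 9      # pretend it's 9 in the morning
    assert greeting(fake_clock) == "Good morning"
    fake_clock.hour.return_value = 15     # pretend it's 3 in the afternoon
    assert greeting(fake_clock) == "Good afternoon"

test_greeting()
```

The point is that the test controls every external dependency, so a failure can only mean a bug in the method under test.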

So how could you expect to go from zero to full-fledged, state-of-the-art software goodness? With an existing code base, no less! Well, you can't :) Won't happen! But what I am seeing is something positive nevertheless, because if one set of tests is created that does something, it can show that tests are *good*. That they have some form of payback. Which can prove that investing in the right kind of test is a great improvement and a gain.

So what is the right kind of test, you might ask? One that finds something wrong with your software so that you can fix it, gives you more confidence in your code, and costs less than what it gains you. No point in expensive tests that don't find anything wrong!

One thing among many that tests can lead to is something I personally think is a good thing™: always having shipping quality. I think that always having a branch (or, as they are sometimes called, a stream) that is fully tested and ready to ship gives rapid development a new meaning. Any new development has to reside on a different branch until it is ready (and thus has tests) and can then be integrated.

So what is so good about it? Think of how things usually go when it is time to ship to the customer. Testing is inevitably done at the end, and fails, or deployment fails, or the customer finds something hideously wrong. If you automatically test your software all the time, and only have finished features in it, you can ship it off to a customer any day with confidence. That's pretty cool to me. It also means that you have to have a state-of-the-art deployment system. But that is a topic for another day :)

If anything in this little post seems a bit distracted, it's because I am not feeling too well. I am in fact sick, which has affected my poor head. Trying to keep a thought up there among the cotton balls has proven almost futile. I might return to the subject some day, as I have more to say about it. And if some of this sounds like agile, don't fret: good ideas are good ideas, no matter the label or who thought of them :) And sharing ideas is good™.


  1. Testing can be a strange animal. We have a sign up in our office... "Do you have tests? Are your tests valid? Do your tests pass? Are your changes in subversion?"

    Most of our "fall flat" moments are when the tests aren't valid. You can regression test all you want against an invalid test and it won't help a bit.

  2. Good point about tests. You have to maintain not only your software but also your tests, and if you have invested in the "wrong" tests, the cost of maintaining them can skyrocket.

    Test cost is the topic of another (ranting) post, but I do think that test proponents sometimes forget about the cost of maintaining tests, either because they don't think about all the kinds of tests, or because they are so excited about the whole thing. I think few choose to ignore it on purpose.

    I still want to try TDD for real, to see if I can do it.