Most software development teams are now Agile ... or at least they're calling themselves Agile. In addition, I see many claiming to practice all the right techniques. I often find, however, that when you look behind the scenes they are making fundamental mistakes in their implementation.

Continuous Integration is arguably the most important practice you need to be Agile. If you're not constantly integrating new functionality into the existing system, and verifying that it doesn't break the existing functionality, then you are going to struggle to deliver incrementally.

Build on the Developer's Machine

A not uncommon mistake is to put the automated build on a server ... and only on the server. I've seen teams spend time getting the 'build server' set up in such a way that it's the only machine in the department that can actually run the build (and the tests).

This sometimes stems from the misunderstanding that the build tool is the scheduler that runs on the server (e.g., CCNet). It isn't: the build tool is usually a command-line driven tool like NAnt or MSBuild, which anyone can run locally. (This kind of misunderstanding can lead to a discussion like the following: link.)

Incidentally, I'm a huge fan of CCNet, and have used it almost everywhere I've worked. However, as James Shore has pointed out before, people can have a tendency to be distracted from the actual practice of Continuous Integration in favour of getting a build server working.

Integrate with ALL Code

So you've got the build going, and people are running the build locally before checking in. So now you're practising Continuous Integration, right? Well ... maybe.

Despite TDD not being a new technique, I think it is (unfortunately) still far from mainstream. A team might try to introduce (or increase the amount of) automated testing but, as James Shore points out, it is fundamental that the entire team agrees (see step 5 here) to practise Continuous Integration. It is not only important that they all buy into the practice, but that they all understand it and what it entails.

When a developer adds a new feature to the existing code, they typically have to modify an existing implementation. They will try hard (sometimes very hard) not to break the existing implementation. Unfortunately, some developers might not extend that care to the tests.

I've seen developers treat tests as 'special' code that is immune to these rules. It is sometimes seen as acceptable to ignore tests (using the [Ignore] attribute), remove the [Test] attribute completely so they don't even get reported as ignored, or worst of all comment them out or delete them! The effect of this has been described as Test Cancer.
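The anti-patterns above are easy to reproduce in any test framework. Here is a minimal sketch using Python's `unittest` (the NUnit attributes map roughly onto decorators; the function and names are hypothetical, purely for illustration):

```python
import unittest

# Hypothetical implementation under test.
def apply_discount(price, rate):
    return round(price * (1 - rate), 2)

class DiscountTests(unittest.TestCase):
    # The analogue of NUnit's [Ignore]: the test is still reported,
    # but its assertion is never exercised.
    @unittest.skip("broke after a change -- 'will fix later'")
    def test_discount_is_applied(self):
        self.assertEqual(apply_discount(100.0, 0.2), 80.0)

    # Worse: renaming the method so the runner never collects it --
    # the analogue of removing the [Test] attribute, so the test
    # isn't even reported as ignored.
    def disabled_test_discount_rounds(self):
        self.assertEqual(apply_discount(99.99, 0.1), 89.99)

# Run with: python -m unittest <module>
```

Both moves make the build green without making the code correct; the CI report either hides the gap as a 'skip' or loses track of it entirely.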

The excuse is often that it was 'pragmatic' ... where 'pragmatic' is a euphemism for being reckless or irresponsible. If you're practising Continuous Integration properly, then you should be integrating with all the existing code, and that includes the tests.

The Implementation as a Second-Class Citizen

Accepting that tests should be treated with the same priority as the implementation doesn't go far enough, in my opinion. For those practising TDD, the tests are significantly more important than the implementation.

Each test that gets added to the solution acts as an invariant, and might change little during the development of the product. It is the implementation that changes constantly to accommodate each new requirement as the project progresses. It is the implementation where pragmatic approaches like faking and hard-coding are acceptable until there is a test that demonstrates the need for it to evolve past the simplest solution.

You can see this effect in action very clearly while watching one of Uncle Bob's Katas (e.g., the Prime Numbers Kata); the tests, once written, are static, but the implementation is open to constant redesign/rework. (Note, I'm avoiding the much abused term Refactor here deliberately).
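To make this concrete, here is a sketch of the prime-factors version of the kata in Python (Uncle Bob's original is in Java; the code below is illustrative, not his):

```python
# Prime-factors kata, sketched in Python. The tests below are written
# once and never change; the implementation starts as a hard-coded fake
# and is reworked until the tests demand the general solution.

def prime_factors(n):
    # Final form after several rewrites; earlier versions were as crude
    # as `return []` or `return [2]`, each kept only until a test
    # outgrew it.
    factors = []
    divisor = 2
    while n > 1:
        while n % divisor == 0:
            factors.append(divisor)
            n //= divisor
        divisor += 1
    return factors

# The tests act as the invariant: they looked like this on day one and
# look like this now.
assert prime_factors(1) == []
assert prime_factors(2) == [2]
assert prime_factors(12) == [2, 2, 3]
assert prime_factors(2 * 3 * 5 * 7) == [2, 3, 5, 7]
```

The interesting part is what the diff history would show: constant churn in `prime_factors`, none in the assertions.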

Once you've embraced the tests and are continually integrating with them, the implementation becomes a secondary concern: an output designed (and redesigned) to satisfy all the tests. The implementation becomes the second-class citizen that is open to faking and hard-coding, while the battery of tests means "you just don't have to worry where you walk".


In order to say you are practising Continuous Integration correctly, the following should be true:

  • The whole team collectively agrees to the practice;
  • The automated build (including tests) must be runnable on a developer's machine;
  • The tests are treated with at least the same priority as the implementing code;
  • There is nothing more important than a broken build!

Do not think of tests as second-class citizens in the code. Instead strive towards the tests being the input, and the implementation becoming an output that passes the tests.

Finally, I don't think I've said anything new here; I've just reiterated many people's well-worn points that have stood the test of time. I firmly believe that ignoring these points can cause a sharp decline in project quality, and that teams should bear them in mind before claiming to be Agile, or claiming they are practising Continuous Integration.



While I agree with the overall sentiment, I will grit my teeth at the citation of Bob Martin and his laughably trivial Code Katas. It's always easy to write nice clean unit tests in a TDD fashion when building tiny little example applications, but I think the reality of building complex systems with evolving requirements muddies the waters somewhat.

When he puts up a screencast where he builds a multi-threaded TCP server using the same approach, then I'll give him his dues :)

Hi Johnboy,

I think the point of the Katas is to demonstrate the principle (i.e., the overall sentiment you agree with) of unit testing.

I agree they don't show real-world examples of unit-testing.


Hmm, but I think the Katas fail to show the principle fully, or at worst mislead, because in reality life is not so clean-cut as it is when writing a prime-numbers calculator :)

Yes, I would love to see lots more real-world examples on the web of testing against more difficult dependencies (databases for example).

These sort of examples can be harder to find.


You're right that Code Katas are (by design!) "laughably trivial". Just like everything that serves as an introduction, practice and/or demonstration: how could it be otherwise? As your project gets more complicated, your tests will also become more complicated to implement; that's just a trivial truth of life (how could that be "misleading"? everyone knows that...).
But it remains true in all cases that you have to care about your tests first in order to produce decent code...

- Thomas