Avoid retrofitting unit tests
I was asked this a few days ago: should we spend time creating proper unit tests for our code?
The team in question already has a set of integration tests, but very few tests that qualify as unit tests.
My response was rambling and long, but it boils down to the following statement: TDD is a design technique, not a testing technique. TDD, and especially test first, has the nice side effect of leaving tests behind as part of the process, which can be incredibly helpful when you are working with the code. But retrofitting tests? That tends to be a waste of time.
Writing a unit test before touching the code is absolutely the way to go, but going and adding unit tests, as a task of its own? I don't see the value in that.
If you have integration tests, that tends to be good enough, and you will write unit tests when you change the code, so eventually you'll have enough unit tests, at least on the hot spots.
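A minimal sketch of that write-the-test-when-you-touch-the-code workflow, in Python with the standard unittest module; the Discount class and its 10% rule are invented for illustration. Under test first, the test is written and seen to fail before the class exists:

```python
import unittest

# Hypothetical production code: the Discount class and its 10% rule are
# invented for illustration. The test below came first and shaped this API.
class Discount:
    def apply(self, price):
        return round(price * 0.9, 2)  # 10% off

class DiscountTests(unittest.TestCase):
    def test_ten_percent_discount_is_applied(self):
        self.assertEqual(Discount().apply(100.0), 90.0)

# Run the test programmatically so the example is self-contained.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests))
```

The point is the order, not the test itself: the test fixes the shape of the API (apply takes a price, returns the discounted price) before any implementation exists.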
Comments
I would say that I agree with writing unit tests only as part of changing code or writing new code.
Ayende,
It's hard for me to disagree with this more. One of the greatest benefits of unit tests is the ability to see when something you refactored caused a breaking change. Now your theory works if you always write tests around what you refactored, but if you don't have tests around all consumers of what you refactored, how do you know if you broke something there?
Integration tests typically follow a bit of a happy path. Honestly, there are too many paths to cover with full integration tests: the paths at each layer multiply when tested in combination with the others. Proper unit tests allow you to isolate just the pieces being tested.
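That isolation can be sketched in Python using unittest.mock from the standard library; the OrderService and gateway names are invented for illustration. The slow or fragile dependency is replaced with a stub so only the unit's own logic is exercised:

```python
import unittest
from unittest import mock

# Invented code under test: a service that depends on a payment gateway.
class OrderService:
    def __init__(self, gateway):
        self.gateway = gateway

    def checkout(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        return self.gateway.charge(amount)

class OrderServiceTests(unittest.TestCase):
    def test_checkout_charges_the_gateway(self):
        # The real gateway (network, credentials, retries) never runs here;
        # a stub stands in, so the test exercises OrderService alone.
        gateway = mock.Mock()
        gateway.charge.return_value = "receipt-1"
        self.assertEqual(OrderService(gateway).checkout(25), "receipt-1")
        gateway.charge.assert_called_once_with(25)

    def test_checkout_rejects_non_positive_amounts(self):
        with self.assertRaises(ValueError):
            OrderService(mock.Mock()).checkout(0)

# Run the tests programmatically so the example is self-contained.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(OrderServiceTests))
```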
Plus, if someone is new to a project, sometimes it is actually helpful to write unit tests (if they don't already exist) on the existing code as a way to learn how it works and what it does. I did that for a project before. You read the code, have a good understanding of what the program does, and the scenarios that it is handling, and then write some tests to verify what you interpreted it as. Sometimes the two don't line up, and it's actually a valuable learning experience.
I wouldn't advise going through and writing tests for everything. But I would write tests larger than just my changes when I was working in a TDD manner. I would probably make sure that the consumers of what I'm changing have tests to verify the handling of the various scenarios offered by my unit.
I agree, the biggest point about TDD (at least for me), is NOT that you will end up with lots of tests that validate that your application works. This is a very nice side effect.
Trying to write tests for a very small part of your application in isolation, and writing tests before code, will affect the DESIGN of your application. This is the most important aspect of TDD (for me).
I guess I'm a little confused that they asked you about /unit testing/ and you responded about /TDD/. I do mostly agree with you - writing unit tests without a purpose in mind seems wasteful. But if you are writing them to either make changes to them, expose bugs that have been reported, or to ensure that it works when dependencies change, then I see that as beneficial (and probably closer to TDD anyway, since you are writing those before you write other code).
The problem I've found with things like characterization tests is that people can get a false sense of security. With TDD, you don't have any production code written without a failing test. When you are retrofitting, you really don't know how much further you have to go.
I would agree with the statement, assuming that you define the retrofitting of unit tests as a 'Big Bang' project whose goal is writing in unit tests that were not there to begin with.
I am, however, a big fan of writing a test if you are modifying code in a particular area of a legacy system. Possibly not for the entire thing, but at least for the method(s) you are working on. It lowers the barrier to entry for anyone following in your footsteps. If the other team members are like you, they will also expand on the testing when working in other methods in the same class.
The objective is to make testing EASY and natural, something you do by habit. A test already in place is harder to ignore than one that doesn't exist at all.
So Big Bang, write a bunch of tests to fill in? No.
As you refactor and work in the code? ABSOLUTELY!
I agree with the various comments that creating tests when changing code that does not have tests has some merit.
The biggest problem I have with the team is getting them to accept that a better design is easier to test.
My dev team is proceeding on all future projects with the statement that "testability is a requirement". Once the team members internalize and accept this, they will attempt to achieve it with the least pain possible. This will lead them to highly decoupled designs and separation of concerns in order to make their lives easier, i.e. better designs.
Getting them to write tests first, well, we are working on that. These kinds of mental shifts don't happen overnight.
On a side note: could you please add OpenID for comments?
In a system with no unit tests, you have three opportunities for adding unit tests:
When adding new functionality
When refactoring a problem area
When reproducing a bug
Feathers recommends a similar approach. It helps target your strategy for adding unit tests. Just trying to sprinkle unit tests around isn't testing, it's characterization.
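A characterization test, in Feathers' sense, pins down what the code currently does rather than what it should do. A Python sketch, with an invented legacy function:

```python
# Invented legacy code: the intent behind the quantity > 10 branch is lost.
def legacy_price(quantity):
    total = quantity * 4
    if quantity > 10:
        total = total - quantity  # nobody remembers why
    return total

# Characterization: run the code, observe the output, then assert exactly
# that. The test documents current behaviour; it does not claim correctness.
def test_characterize_legacy_price():
    assert legacy_price(5) == 20
    assert legacy_price(12) == 36  # observed: some discount past 10 units

test_characterize_legacy_price()
```

With that behaviour pinned down, a later refactoring that changes the output fails loudly instead of silently.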
Thank you for writing this post!
I recently had an experience of putting together a small utility that started as a sample code learning exercise. When I decided I was going to keep the code I added a test project and such, but as I went and retro-tested what had been done it felt wrong. I thought of it as development driven testing (DDT).
About the only real value that came from the tests was one or two places where changes in code to make it more testable improved the separation of concerns. Beyond that I've added tests for new changes and troubleshooting, which feels natural.
But that leaves me with a question: have you found TDD to be efficient enough to use on a learning exercise, sample code, or demo site? If you're not thinking the code will ever "be real", do you start by shooting from the hip and add testing once it hits critical mass?
I guess asking the question answers itself. Keeping experiments is probably a rare case, but for code you're going to keep if it's not worth testing from day one it's not worth writing.
You make some really good points there. I absolutely agree with you. I recently had a discussion with a couple of my co-workers along the same lines.
I'll just chime in with agreement to everyone who thought there is value in retrofitting unit tests. The codebase I'm working on is fairly large and was originally developed without TDD. However, we are constantly expanding it and refactoring things that no longer make sense. To do this we've been adding unit tests as we work with old classes and refactor them, because without these retrofitted tests, moving forward with TDD would be impossible.
TDD != Unit Tests
TDD is a development methodology that makes use of unit tests. In other words, TDD depends on unit tests, but unit tests do not necessarily depend on TDD. There is great value to have in using unit tests without TDD. Unit tests let you verify that your code still works when you: a) add code, b) refactor code, c) remove code, d) all of the above. Just because you are adding new code, even when it should be uncoupled with existing code, doesn't mean that it won't break existing code when interactions start happening. Unit tests would help you find those issues, no matter when you wrote those tests.
TDD is not always the right choice for every project/team/person but I would dare to say that there is always a benefit to having unit tests.
An interesting point. I recently learned about build automation and code analysis, and have now turned my eye towards unit tests. I have an existing project of some size that could do with some automated tests, and was wondering just what it would take to get it covered with tests.
I suspect I will do it one item at a time, starting with the more heavily-used parts of the code. The suggestion to write tests for new or updated code is a good one, and I think I am going to do that.
Louis,
For demos, no.
For discovery, yes.
I have to agree with the above comment. Writing unit tests is not TDD, and writing unit tests does not make you a great coder. When it boils down to it, a unit test is simply more code and is only as good as the developer writing it. Unfortunately, I have seen plenty of stuff like this:
method:

    void DoStuff()
    {
    }

and the corresponding test is:

    void DoStuff_Test()
    {
        DoStuff();
    }
No validation of the data, just the assumption that if no exception is thrown, the world is fine.
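A Python contrast, with do_stuff and its behaviour invented for illustration: the first test only proves the call does not throw, while the second actually validates the data.

```python
import unittest

# Invented stand-in for DoStuff(), except that it returns something checkable.
def do_stuff(items):
    return sorted(set(items))

class StuffTests(unittest.TestCase):
    def test_do_stuff_runs(self):
        do_stuff([3, 1])  # "passes" as long as no exception is raised

    def test_do_stuff_sorts_and_deduplicates(self):
        # Validate the result, not merely the absence of an exception.
        self.assertEqual(do_stuff([3, 1, 3]), [1, 3])

# Run the tests programmatically so the example is self-contained.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(StuffTests))
```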
This post essentially inspired me to get some unit tests into my project, specifically for parts I was planning to refactor. I now have a few dozen tests in place for those parts, so I should be able to refactor with confidence. Wash, rinse, repeat I guess. The theory seems to be that with enough time, and enough planned changes, a decent portion of the project should eventually be under test.