I found out today that my wife is with child, and we're having a baby! Now, on with the post. . .
Integration testing isn't your basic 200-level topic at an MSDN event. It can be very involved. I believe that a good integration test has to depend on targeted unit tests being present. Consider the scenario without unit tests:
Bob has a use case that spans 15 classes. He sets up the environment to get this slice of the system under test. He then proceeds to write the test with asserts. He quickly becomes frustrated because for each of the 15 classes along the way, there are different scenarios that are possible. If each class has just 2 possible uses, the number of scenarios to test is 2^15. Each scenario requires many assert statements. Faced with 32,768 test combinations, Bob is disgruntled and concludes that automated integration testing is too much overhead.
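The arithmetic behind Bob's frustration is worth making explicit. A quick sketch (using Bob's hypothetical numbers of 15 classes with 2 uses each) contrasts the exhaustive integration-level count with the linear unit-level count:

```python
# Bob's numbers: 15 collaborating classes, 2 possible uses per class.
classes = 15
uses_per_class = 2

# Testing every end-to-end combination multiplies the choices together:
integration_combinations = uses_per_class ** classes
print(integration_combinations)  # 32768 combinations

# Testing each class's uses in isolation only adds them up:
unit_tests = classes * uses_per_class
print(unit_tests)  # 30 focused unit tests
```

The exponential-versus-linear difference is the whole argument: exhaustive coverage at the integration level is infeasible, while the same behaviors covered one class at a time stay small.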
What did Bob do wrong? First, Bob attempted to start his automated testing at the integration level. Second, he assumed unit test responsibilities inside the integration test. Third, he tried to test every possible combination of integration. Fourth, he hadn't surrounded himself with a quality team that could help guide the testing strategy.
Here's the success scenario:
Bob has written unit tests for each of his 15 classes. He marvels at how simple they look, since each unit test only has to cover 2 usage scenarios for its class. With confidence that each individual class will do its job correctly, Bob writes an integration test for the use case, choosing one of the many combinations that could occur. Bob sets up the test, executes it, and then asserts on the resulting state of the system. Bob finds an integration issue caused by how two of the classes interact with each other. He fixes that bug, and the test passes. Bob now has confidence that the 15 classes are interacting properly in his use case.
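Bob's approach can be sketched in miniature. The class names below (`OrderValidator`, `OrderProcessor`) are hypothetical stand-ins for two of his 15 classes, and the tests are written in plain assert style:

```python
# Hypothetical stand-ins for two of Bob's 15 classes; names are illustrative.
class OrderValidator:
    """Decides whether an order is fit to process."""
    def is_valid(self, order):
        return bool(order.get("items"))

class OrderProcessor:
    """Collaborates with the validator to handle an order."""
    def __init__(self, validator):
        self.validator = validator

    def process(self, order):
        return "processed" if self.validator.is_valid(order) else "rejected"

# Unit tests: each class covers its own two usage scenarios in isolation.
def test_validator_accepts_nonempty_order():
    assert OrderValidator().is_valid({"items": ["book"]})

def test_validator_rejects_empty_order():
    assert not OrderValidator().is_valid({"items": []})

# Integration test: one representative combination of the classes working
# together, asserting on the resulting state rather than every permutation.
def test_valid_order_flows_through_processing():
    processor = OrderProcessor(OrderValidator())
    assert processor.process({"items": ["book"]}) == "processed"
```

The unit tests keep each class honest about its own two scenarios, so the single integration test only has to prove that the collaboration works for one realistic path, which is exactly where Bob's interaction bug would surface.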
If you haven't already read the following from my friend, Jeremy Miller, take a minute to do so: