Realistically achieving high test coverage – MvcContrib

Since Eric Hexter and I started the MvcContrib project, we’ve mandated high test coverage.  If a patch comes without tests, we reject it. 

Given that MvcContrib exists to supplement a presentation library, ASP.NET MVC, you might think it’s not possible to achieve such a high coverage percentage.

If you were ever curious about how this is done, I invite you to take a look at the project.  The project has 1058 tests at this point, and the main MvcContrib.dll assembly has 99% test coverage. 

As an aside, when code is test-driven, high test coverage falls out naturally. 

By the way, Eric just released version 0.0.1.118 of MvcContrib to CodePlex.

 

Keep tabs on MvcContrib by following my feed:  http://feeds.feedburner.com/jeffreypalermo

Objectively evaluating O/R Mappers (or how to make it easy to dump NHibernate)

I’m amazed that there is so much talk about object/relational mappers these days.  Pleased, but amazed.  I tend to be in the “early adopter” part of the Rogers technology adoption curve. (Subscribe to my feed:  http://feeds.feedburner.com/jeffreypalermo)

In the .Net world, I didn’t hear much talk about O/R Mappers in the early 2000s.  I started working with NHibernate in 2005 while on a project with Jeremy Miller, Steve Donie, Jim Matthews, and Bret Pettichord.  I researched, but never used, other O/R Mappers available at the time.  Now, in 2008, I find that O/R Mappers in the .Net world are still in the early adopter part of the adoption curve.  We have not yet hit early majority, but we have left the innovators section.

Microsoft has single-handedly pushed O/R Mapping to the center of conversation, and we struggle to objectively differentiate between the choices.  Arguments like “Tool X rocks” or “Tool Y sucks” are hard to understand.  I’d like to more objectively discuss the basis on which we should accept or reject an O/R Mapper.  As always, it depends on context. 

Context 1:  Small, disposable application:  In this case, we would put a premium on time to market while accepting technical debt, given that the application has a known lifespan.  For this type of situation, I think it depends on the skill set of the team we start with.  If the team already knows an O/R Mapper, the team should probably stick with it since the learning curve of any other tool would slow down delivery. 

Context 2:  Complex line-of-business application:  Here, the business is making an investment by building a system that is expected to yield return on the engineering investment.  The life of the application is unbounded, so maintainability is king.  We still want to be able to build quickly, but long-term cost of ownership has a heavy hand in decisions.  Here, we have to objectively think about the tools used by the system.

I’ll use O/R Mappers in this example.  On the right is a common Visual Studio solution structure.  We would probably leverage the O/R Mapper in the DataAccess project.  I consider the O/R Mapper to be infrastructure since it doesn’t add business value to the application.  It is merely plumbing to help the application function.  By following the references, we find that our business logic is coupled to the data access approach we choose as well as to the infrastructure we employ.  Often we can build the system like this, and we can even keep the defect rate really low.  This is very common, and I’d venture to guess that most readers have some experience with this type of structure.  The problem with this structure is long-term maintainability.  In keeping with the O/R Mapper decision, five years ago I was not using NHibernate.  If I ask myself whether I’ll be using NHibernate five years from now, I have to assume that I probably won’t be, given the pace of technology.  If this system is to have a chance of being maintainable five years from now, I need to be able to upgrade the parts of the system that are most affected by the pace of technology, like data access.  My business logic shouldn’t be held hostage by the data access decision I made back in 2008.  I don’t believe it’s a justified business position to say that when technology moves on, we will rewrite entire systems to keep up.  Sadly, most of the industry operates this way. 
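To make that coupling concrete, here is a minimal sketch of the style this structure tends to produce.  The names (MyApp.BusinessLogic, Order, OrderFulfillmentService) are hypothetical and only for illustration; the point is that the business class takes a direct dependency on NHibernate, so the business layer goes wherever the O/R Mapper goes.

    // Hypothetical business-logic project that references NHibernate directly.
    using NHibernate;

    namespace MyApp.BusinessLogic
    {
        public class Order
        {
            public virtual int Id { get; set; }
            public virtual bool IsFulfilled { get; protected set; }
            public virtual void MarkFulfilled() { IsFulfilled = true; }
        }

        public class OrderFulfillmentService
        {
            private readonly ISessionFactory _sessionFactory;

            public OrderFulfillmentService(ISessionFactory sessionFactory)
            {
                _sessionFactory = sessionFactory;
            }

            public void Fulfill(int orderId)
            {
                // The business rule (mark the order fulfilled) is buried in
                // data access plumbing.  Replacing NHibernate later means
                // rewriting and retesting this business class.
                using (ISession session = _sessionFactory.OpenSession())
                using (ITransaction tx = session.BeginTransaction())
                {
                    Order order = session.Get<Order>(orderId);
                    order.MarkFulfilled();
                    session.Update(order);
                    tx.Commit();
                }
            }
        }
    }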

On the left is the general solution structure I’m more in favor of.  You see that the core of the application doesn’t reference my other projects.  The core project (give it whatever name you like) contains all my business logic, namely the domain model and supporting logical services that give my application its unique behaviors.  Add a presentation layer for some screens, and the system delivers business value.  Here, you see I’ve lumped data access in with infrastructure.  Data access is just not that interesting, and system users don’t give a hoot how we write to the database.  As long as the system is usable and has good response times, they are happy.  After all, they are most happy when they are _not_ using the system.  They don’t spend their leisure time using our system.
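As a rough sketch of what the core project might contain under this structure (the Order, IOrderRepository, and OrderFulfillmentService names are mine, not from the diagram), notice that nothing here mentions NHibernate or any other infrastructure:

    // Hypothetical Core project: the domain model plus the abstractions it needs.
    // No reference to NHibernate.dll or any other infrastructure assembly.
    namespace MyApp.Core
    {
        public class Order
        {
            public int Id { get; set; }
            public bool IsFulfilled { get; private set; }

            public void MarkFulfilled()
            {
                IsFulfilled = true;
            }
        }

        // The core owns the abstraction; infrastructure will implement it.
        public interface IOrderRepository
        {
            Order GetById(int id);
            void Save(Order order);
        }

        public class OrderFulfillmentService
        {
            private readonly IOrderRepository _orders;

            public OrderFulfillmentService(IOrderRepository orders)
            {
                _orders = orders;
            }

            public void Fulfill(int orderId)
            {
                // Pure business behavior; how the order is loaded and saved
                // is somebody else's problem.
                Order order = _orders.GetById(orderId);
                order.MarkFulfilled();
                _orders.Save(order);
            }
        }
    }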

I consider data access to be infrastructure because it changes every year or two.  I also consider communication protocols like ASMX, remoting, and WCF to be infrastructure.  WCF, too, will pass in a year or ten for the next wave of communication protocols that “will solve all our business problems”.  Given this reality, it’s best not to couple the application to infrastructure.  Any application today that is coupled to Enterprise Library data access will likely have to be completely rewritten in order to take advantage of any newer data access method.  I’d venture to say that the management that approved the budget for the creation of said system didn’t know that a rewrite would be imminent in just four short years.

How do we ensure the long-term maintainability of our systems in the face of constantly changing infrastructure?  The answer:  don’t couple to infrastructure.  Regardless of the O/R Mapping tool you choose, don’t couple to it.  The core of your application should not know or care what data access library you are using.  I am a big fan of NHibernate right now, but I still keep it at arm’s length, banished to forever live in the Infrastructure project in the solution.  I know that when I want to dump NHibernate for the next thing, it won’t be a big deal.

How do I ensure I’m not coupled to my O/R Mapper?

  • The project my domain objects reside in doesn’t have a reference to NHibernate.dll or to my O/R Mapper of choice
  • The unit tests for my domain model don’t care about data access (a sketch of such a test follows this list)
  • My domain objects don’t contain infrastructure code specific to the O/R Mapper
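Here is a sketch of the kind of domain test the second bullet describes, written with NUnit against the hypothetical Order class from the Core sketch above.  It never opens a session, touches a database, or reads a mapping file:

    using MyApp.Core;
    using NUnit.Framework;

    [TestFixture]
    public class OrderTester
    {
        [Test]
        public void Marking_an_order_fulfilled_should_set_the_flag()
        {
            // Plain object in, plain assertion out.  No data access anywhere.
            var order = new Order();

            order.MarkFulfilled();

            Assert.IsTrue(order.IsFulfilled);
        }
    }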

The key is the flipped project reference:  have the infrastructure project reference the core, not the other way around.  My core project has no reference to NHibernate.dll.  The UI project has no reference to it either.  Only the infrastructure project references it.
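In code, the flipped reference looks roughly like this: a hypothetical Infrastructure project that references both Core (for Order and IOrderRepository) and NHibernate.dll, while Core references neither.

    // Hypothetical Infrastructure project: the only place NHibernate appears.
    using MyApp.Core;
    using NHibernate;

    namespace MyApp.Infrastructure
    {
        public class NHibernateOrderRepository : IOrderRepository
        {
            private readonly ISessionFactory _sessionFactory;

            public NHibernateOrderRepository(ISessionFactory sessionFactory)
            {
                _sessionFactory = sessionFactory;
            }

            public Order GetById(int id)
            {
                using (ISession session = _sessionFactory.OpenSession())
                {
                    return session.Get<Order>(id);
                }
            }

            public void Save(Order order)
            {
                using (ISession session = _sessionFactory.OpenSession())
                using (ITransaction tx = session.BeginTransaction())
                {
                    session.SaveOrUpdate(order);
                    tx.Commit();
                }
            }
        }
    }

When the next data access library comes along, this class (and the mapping files that live next to it) is what gets replaced; the core and its tests don’t move.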

Keep it easy to dump NHibernate when its time has come

For now, NHibernate is the O/RM of choice in .Net-land.  When its time comes, don’t go to management and recommend a rewrite of the system because it’s tightly coupled to NHibernate.  Keep NHibernate off to the side so you can slide in the next data access library that comes along.  If you tightly couple to your O/RM, you’ll sacrifice long-term maintainability.

When choosing an O/R Mapper, the objective criterion I find most compelling is whether the library allows isolation.  If the tool forces you to build the application around it, move on to a better one.  The good libraries stay out of the way.  If your O/RM always wants to be the center of attention, dump it for one that’s more humble.  Use an O/RM that plays well behind a wall of interfaces.  Beware the O/RM that doesn’t allow loose coupling.  If you tightly couple, it’s a guaranteed rewrite when you decide to change data access strategies.
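As one last sketch of what playing well behind a wall of interfaces can look like, the only place that knows about both the core abstractions and the NHibernate-backed repository is a small bootstrapper.  The names carry over from the earlier hypothetical sketches, and the configuration call assumes a standard hibernate.cfg.xml; none of this is prescribed by any particular tool.

    // Hypothetical bootstrapper living in the Infrastructure project, so the
    // UI can wire things up without ever referencing NHibernate itself.
    using MyApp.Core;
    using NHibernate;
    using NHibernate.Cfg;

    namespace MyApp.Infrastructure
    {
        public static class ApplicationStartup
        {
            public static OrderFulfillmentService BuildFulfillmentService()
            {
                // Assumes a standard hibernate.cfg.xml describing the mappings.
                ISessionFactory sessionFactory =
                    new Configuration().Configure().BuildSessionFactory();

                IOrderRepository orders = new NHibernateOrderRepository(sessionFactory);
                return new OrderFulfillmentService(orders);
            }
        }
    }

If a tool can’t be confined to one spot like this, that’s the objective signal to move on to one that can.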