What are the Alt.Net principles? – my answer

I’ll be attending the AltNetConf.  Convenient for me that it’s in Austin, TX.  It’s an open space conference, and I consider it the founding conference of a conversation that is “Alt.Net”.  I’ll be proposing the topic: What are the Alt.Net principles?

Alt.Net isn’t well defined yet.  It’s not even at a point where I can explain what it is and actually have others agree with me.

Since I know the guy who originally coined the term, I have some insight into what David meant, but I’m going to propose some principles that, together, should be the definition of Alt.Net.

First, Alt.Net inherits Agile, and IS Agile.  Therefore:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan 

Extended principles:

  • Excellence and quality
    In a world of wizard-generated, disposable software (disposed 2 years later by necessity, not choice), Alt.Net focuses us on excellence in the software we create.  While the world may pay for and accept software that lives for 2 years before becoming utterly unmaintainable, we don’t accept shoddy work.  We know that we can deliver high quality software faster than others can deliver low quality software, so we accept no less than the highest in quality.  We strive for excellence through solid engineering practices and a high level of software education.  Coincidentally, Extreme Programming helps in this area, but Alt.Net does not specifically _mean_ XP.
  • Alternative Vendors
    A common theme in many .Net shops is that they are “Microsoft” shops.  In other words, if it doesn’t come from Microsoft, they won’t use it.  This makes no sense.  Microsoft is not a large enough company to be the vendor to the whole world.  .Net is a great platform, and we as a community have chosen the platform and choose to participate in this segment of the industry.  We strongly believe that 3rd party vendors complement the .Net platform in a way that can contribute to excellent working software.  In fact, some 3rd party offerings are superior to Microsoft’s.  For instance, in striving for excellence in an e-commerce website, a team may choose a mature O/R Mapper like NHibernate to accelerate team speed and produce a flexible data layer; however, Alt.Net does not _mean_ ORM.  Open source software is a source of excellent 3rd party alternatives built on the .Net platform, and it should be used over Microsoft alternatives when it contributes to excellence; however, Alt.Net does not _mean_ open source.
  • Joy in our work
    We know that we will produce better software if a team is motivated and morale is high; therefore, we use libraries, tools, and practices that add joy to the working environment.  We abandon or correct libraries, tools, and practices that make it a drag to work on the software.  For instance, many find that Visual Studio is a bit slow to work with and that adding Resharper to the IDE adds a high level of joy while working with .Net code; however, Alt.Net does not _mean_ Resharper.
  • Knowledge
    We know that we will never know everything there is to know.  We strive for a greater understanding of software through studying successes and failures of the past as well as the present.  We educate ourselves through many sources in order to bring the most value to our clients.  We keep up with other platforms, such as Java and Ruby, so that we can apply their good ideas to .Net development and increase the quality of our .Net software.  The technology is always changing, but the knowledge accumulates.  We know that the knowledge applies no matter how the technology changes.  With knowledge comes humility because without humility, knowledge would pass us by.

The above are principles, so they are intentionally abstract.  Below, I’ll list some items that are concrete.  These items apply the principles and are more directly actionable:

  • Read more than just MSDN magazine and MS Press.  Authors like Feathers, Fowler, Martin, Evans, etc have a lot to give (Knowledge)
  • Use Resharper.  It makes working with Visual Studio a (Joy).  But if another vendor comes along that does even better than JetBrains, consider switching
  • Use NUnit  over MSTest,  Subversion over TFS SCC,  Infragistics/Telerik over in-the-box controls,  RedGate over in-the-box SQL tools.  Each of these is a better alternative to that which Microsoft provides (Alternative Vendors).  Use NHibernate over hand-rolled stored procedures and especially over DataAdapter/DataSet, but if EntityFramework proves to actually be superior to NHibernate in a meaningful way, consider using it.
  • Use a responsible application architecture.  Don’t put everything in Page_Load like you see demonstrated at MSDN Events.  Use knowledge to create an application that can stand the test of time and not be rewritten every 2 years.  Deliver (high quality and excellence). 
  • Automate every repetitive task: builds, tests, deployments, etc. (Excellence and Joy)

The concrete examples could go on and on, and I hope AltNetConf produces a long list.  I’ll be interested in having my proposed principles accepted by the community there or revised into something better.  Either way, I’d like to get to a point where there is an accepted definition of Alt.Net.

NHibernate: ICriteria may cause update – know about AutoFlush mode – level 300

This topic is for those already using NHibernate.  Judging by the forum, that is a whole load of people!

As always, my blog posts stem from experience, and this is no different.  It's been a year since I first tried out NHibernate, and since then I've used it on 4 large, unrelated applications.  This latest application of NHibernate is by far the most exciting, however, because we are able to take advantage of the full power of the library.  The others were always tempered by a few things that couldn't be changed and that hampered seamless data access.  My team no longer has to slow down to think about what SQL to write.  We stay in C#, and we're going faster and faster.  For the performance-minded, the NHibernate SQL is pretty darned fast (because there is nothing special about it – just CRUD).  We run about 120 database tests in 2.5 seconds – not bad.

Last week, I learned something new about NHibernate – AutoFlush mode.  This is important because NHibernate keeps only one instance of your domain object per ISession instance, so if you ask for the same object multiple times from the same ISession instance, you'll get the same domain object instance.  Imagine this scenario: 

  • You pull some objects into memory.
  • The user modifies one object.
  • You query for a list using ICriteria (the object the user modified is a match for this list)

What should the system do?  Should the fresh query refresh all the objects and throw away the user's changes?  NHibernate's default behavior is "no".  It is configured to "autoflush" by default.  When it detects that some changes might inadvertently be thrown away by a fresh query, it auto-updates the modified object in the database.  If you open up SQL Profiler, you'll see UPDATE commands amidst the SELECTs.  If you choose to set the flush mode to "NEVER", then you'll get a big fat exception, and you can write some code to handle the times when you need to do a fresh query after a persistent object has been modified.
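Here's a minimal sketch of the scenario above.  The Order class, its Status property, the id value, and the sessionFactory variable are hypothetical stand-ins (any mapped entity behaves the same way), and the NHibernate and NHibernate.Expression namespaces are assumed to be imported:

    using (ISession session = sessionFactory.OpenSession())
    {
        // 1. Pull an object into memory.
        Order order = (Order) session.Load(typeof(Order), 42);

        // 2. The user modifies it.  Note: no Save() or Flush() call here.
        order.Status = "Shipped";

        // 3. Run a fresh ICriteria query that the modified object matches.
        //    With the default auto-flush behavior, NHibernate issues an UPDATE
        //    for the dirty Order before the SELECT so the query results stay
        //    consistent with the in-memory changes.
        IList orders = session.CreateCriteria(typeof(Order))
                              .Add(Expression.Eq("Status", "Shipped"))
                              .List();

        // To opt out of the automatic UPDATE, change the flush mode yourself:
        // session.FlushMode = FlushMode.Never;
    }

Watch SQL Profiler while step 3 runs and you'll see the UPDATE appear just before the SELECT.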

Using enum strings with NHibernate persistence – level 400

One of the things that is not very obvious when using NHibernate is how to use enumerations.  If you merely map an enum in the hbm.xml, NHibernate will persist that class member as an int, which is the underlying type of an enum value type.  Even if you have your database table configured with a string field, you’ll get an integer in your string field.

To have NHibernate use the string representation of the enum for storing in the database, you need to use a special type class.  Below is a simple example of using the EnumStringType that NHibernate provides.  Consider the following enum that I want to use in my class (this is a very simplified example):

    public enum DeliveryStatus
    {
        Pending,
        Ready,
        Sent
    }

When mapping this in my class, NHibernate would persist 0, 1, and 2 in my database.  What I actually want is for the strings to be stored in my table (no comments about normalization, please).  There is a somewhat weird thing you have to do to achieve this goal.  Here is my mapping:
    <?xml version="1.0" encoding="utf-8" ?>
    <hibernate-mapping xmlns="urn:nhibernate-mapping-2.0">
        <class name="Palermo.Shipping.Shipment, Palermo.Shipping" table="Shipment">
            <id name="TrackingId" column="TrackingId" type="String" length="30">
                <generator class="assigned"/>
            </id>

            <property name="DeliveryState" column="DeliveryState"
                type="Palermo.Shipping.DeliveryStatusType, Palermo.Shipping"/>

        </class>
    </hibernate-mapping>

Notice that I have a new type referenced for DeliveryState: DeliveryStatusType.  This type tells NHibernate to map the enum to its string representation instead of the int.  For this, I must define the type in my code:

    public class DeliveryStatusType : EnumStringType
    {
        public DeliveryStatusType() : base(typeof (DeliveryStatus), 30)
        {
        }
    }

Note that this is very simple, and the 30 specifies the max length of the enum string.  I’d recommend setting this number the same as the length of your string field in your database.

With these small steps, NHibernate will now map Pending, Ready, and Sent to the database field.  Normal programmatic interaction is the same.  NHibernate will take care of all the data access.  Without the above solution, one might be tempted to use string constants, but I’d highly recommend using enums when the possible values are known. 
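The Shipment class itself isn’t shown above, but a minimal sketch might look like this (an assumption based on the mapping; the property stays strongly typed as the enum, and only the mapping’s type attribute points at DeliveryStatusType):

    public class Shipment
    {
        private string _trackingId;
        private DeliveryStatus _deliveryState;

        public string TrackingId
        {
            get { return _trackingId; }
            set { _trackingId = value; }
        }

        // Strongly typed as the enum in code; persisted as "Pending", "Ready",
        // or "Sent" in the DeliveryState column via DeliveryStatusType.
        public DeliveryStatus DeliveryState
        {
            get { return _deliveryState; }
            set { _deliveryState = value; }
        }
    }

Setting shipment.DeliveryState = DeliveryStatus.Ready and saving the object results in the string "Ready" being stored in the DeliveryState column.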

Oracle-style joins in Sql server. There is no performance difference – level 300

I recently came across some sql code that caught me off guard.  Here it is:

select something
from table1, table2
where table1.id = table2.id

I immediately thought that a Cartesian product was happening and the rows were being filtered afterward.  All my database experience has been with Microsoft databases, so I didn’t know that this syntax used to be the way most people did sql.  Like any good engineer, I set out to find out for myself what was really going on.

I used the Northwind database to compare the following two queries: the first with Sql server syntax, and the second with “old school” syntax (which Sql server 2000 supports).
SELECT    *
FROM    Orders o
    INNER JOIN Customers c ON o.CustomerID = c.CustomerID
    INNER JOIN [Order Details] od ON o.OrderID = od.OrderID
    INNER JOIN Products p ON od.ProductID = p.ProductID

SELECT    *
FROM Customers c, Orders o, [Order Details] od, Products p
WHERE o.CustomerID = c.CustomerID
    AND o.OrderID = od.OrderID
    AND od.ProductID = p.ProductID

I ran these two queries many, many times, together and in isolation, and I examined the execution plans, the client statistics, and the Sql trace.  It appears that at a lower level, these two operations are identical.  Both queries took the same duration, CPU cycles, and reads to execute, and both produce the exact same execution plan.

Personally, I like the INNER JOIN syntax.  It’s very explicit, and it’s easy to add RIGHT and LEFT to dictate OUTER joins.  A plus is that it is the ANSI standard and Microsoft’s recommendation for Sql Server. 

The objective conclusion of this experiment is that the style picked for a query will not affect the speed at which that query runs.  The differences are subjective.  My advice, however, is that a single style be adopted as part of the team’s coding standard. 

Ron Jacobs discusses NHibernate on the MS Patterns and Practices Arcast – level 200

The latest ARCast from Ron Jacobs is about NHibernate, an open-source object-relational mapping tool.  This podcast was particularly interesting to me because my team is using NHibernate for a project, and we are likely to use it for most data access going forward.  The data access tier is the target of many debates.  My team has weighed the pros and cons, assessed our security and performance requirements, and we have decided that using NHibernate to automate the persistence and loading of our business objects is the direction we want to go.  It has saved so many developer hours of writing boring CRUD sprocs and sql statements.  We use an xml file to map the properties of our business object to the database table, and we’re done.  We have a test-bed of integration tests to ensure that the mapping is correct.

Ron brought up an argument that some inside Microsoft have on OR mappers.  I’m paraphrasing, but this is the idea:  Developers may shoot themselves in the foot if Microsoft provides an OR mapper and endorses it. 

I’ll let that sink in.  I can’t remember a development tool that someone hasn’t managed to abuse.  Hmm.  That doesn’t seem to be a very strong argument.  You obviously don’t give an M1 Abrams tank to a novice, but in the hands of a trained professional soldier, it can be very effective.

Another point discussed was that you no longer have complete control over performance with NHibernate.  That’s true because you would be trusting the component to generate and run the sql for you.  You obviously don’t use the exact same tool for every job, and it was mentioned that the Amazon.com(s) of the world would need more control over data access than most enterprise applications.  Most internal enterprise applications only have a few hundred users (if that).  What performance do they really need?  Now, NHibernate is NOT slow, but if you need to go 400 MPH instead of a measly 397 MPH, then you have some pretty heavy traffic and strict performance requirements.  For the rest of applications, NHibernate probably offers more speed than required.

Another topic of great interest to me: the cancelled ObjectSpaces project.  It was mentioned that it was cancelled because of the DLinq project that was developing, and Microsoft didn’t want to have two models for OR mapping.  I’m not sure about the details of this, but it was mentioned on the show.

All in all, I like podcasts from Ron Jacobs.  He’s a great personality for a radio show, and he pulls in some interesting topics.