Quickly documenting public APIs using GhostDoc – level 200

With the help of ReSharper, I created the following class with a minimal number of keystrokes: I typed the field names and let shortcuts generate the rest.  Consider this class, which I might want XML documentation for:



public class Person
{
    private string _firstName;
    private string _lastName;
    private string _phoneNumber;
    private int _age;

    public string LastName
    {
        get { return _lastName; }
        set { _lastName = value; }
    }

    public string PhoneNumber
    {
        get { return _phoneNumber; }
        set { _phoneNumber = value; }
    }

    public int Age
    {
        get { return _age; }
        set { _age = value; }
    }

    public string FirstName
    {
        get { return _firstName; }
        set { _firstName = value; }
    }

    public Person(string firstName, string lastName, string phoneNumber, int age)
    {
        _firstName = firstName;
        _lastName = lastName;
        _phoneNumber = phoneNumber;
        _age = age;
    }
}


I could type all the XML comments myself, but that would be considerable effort, and there isn't anything special about this class.  Most people could guess what this API does, so why not have a tool help with the XML comments?  I press Ctrl+Q (my GhostDoc shortcut key) on each of my properties as well as the constructor, and I immediately have the following:



public class Person
{
    private string _firstName;
    private string _lastName;
    private string _phoneNumber;
    private int _age;

    /// <summary>
    /// Gets or sets the name of the last.
    /// </summary>
    /// <value>The name of the last.</value>
    public string LastName
    {
        get { return _lastName; }
        set { _lastName = value; }
    }

    /// <summary>
    /// Gets or sets the phone number.
    /// </summary>
    /// <value>The phone number.</value>
    public string PhoneNumber
    {
        get { return _phoneNumber; }
        set { _phoneNumber = value; }
    }

    /// <summary>
    /// Gets or sets the age.
    /// </summary>
    /// <value>The age.</value>
    public int Age
    {
        get { return _age; }
        set { _age = value; }
    }

    /// <summary>
    /// Gets or sets the name of the first.
    /// </summary>
    /// <value>The name of the first.</value>
    public string FirstName
    {
        get { return _firstName; }
        set { _firstName = value; }
    }

    /// <summary>
    /// Initializes a new instance of the <see cref="Person"/> class.
    /// </summary>
    /// <param name="firstName">Name of the first.</param>
    /// <param name="lastName">Name of the last.</param>
    /// <param name="phoneNumber">The phone number.</param>
    /// <param name="age">The age.</param>
    public Person(string firstName, string lastName, string phoneNumber, int age)
    {
        _firstName = firstName;
        _lastName = lastName;
        _phoneNumber = phoneNumber;
        _age = age;
    }
}


Most of the work is done.  There is no way GhostDoc can read my mind, so I might have to make a few adjustments to the comments, but they are minimal.  Note that GhostDoc got confused on firstName and lastName, but that’s understandable.  It hits the rest dead-on:



public class Person
{
    private string _firstName;
    private string _lastName;
    private string _phoneNumber;
    private int _age;

    /// <summary>
    /// Gets or sets the last name.
    /// </summary>
    /// <value>The last name.</value>
    public string LastName
    {
        get { return _lastName; }
        set { _lastName = value; }
    }

    /// <summary>
    /// Gets or sets the phone number.
    /// </summary>
    /// <value>The phone number.</value>
    public string PhoneNumber
    {
        get { return _phoneNumber; }
        set { _phoneNumber = value; }
    }

    /// <summary>
    /// Gets or sets the age.
    /// </summary>
    /// <value>The age.</value>
    public int Age
    {
        get { return _age; }
        set { _age = value; }
    }

    /// <summary>
    /// Gets or sets the first name.
    /// </summary>
    /// <value>The first name.</value>
    public string FirstName
    {
        get { return _firstName; }
        set { _firstName = value; }
    }

    /// <summary>
    /// Initializes a new instance of the <see cref="Person"/> class.
    /// </summary>
    /// <param name="firstName">First name.</param>
    /// <param name="lastName">Last name.</param>
    /// <param name="phoneNumber">The phone number.</param>
    /// <param name="age">The age.</param>
    public Person(string firstName, string lastName, string phoneNumber, int age)
    {
        _firstName = firstName;
        _lastName = lastName;
        _phoneNumber = phoneNumber;
        _age = age;
    }
}


Microsoft’s guidance for unit testing actually describes integration testing – level 200

I commend Microsoft for putting a big emphasis on automated testing.  This is a big step toward the software industry as a whole embracing automated testing.  The hardware industry discovered this long ago and builds a self-test into almost every device.  Likewise, software should have built-in self-tests to ensure that everything is working.


Microsoft has published "Unit Testing and Generating Source Code for Unit Test Frameworks Using Visual Studio 2005 Team System."  In this article they describe how to unit test some code using Team System's automated testing features.  The problem is that what is actually described is integration testing.  The tests call a BankAccount object and exercise some methods.  There is no mocking (stubbing) of BankAccount's dependencies, so these cannot be unit tests.  If BankAccount actually works, it will have to communicate with some dependency, unless this BankAccount keeps everything in memory.


Here are some recommended “unit tests” from the article:


  • Constructor Test—To make sure your object loads properly, with the correct information.
  • PositiveLoadScalarTest—To test the successful Load of a Customer that exists in the database.
  • NegativeLoadScalarTest—To test the unsuccessful Load of a Customer—that is, one that does not exist in the database.
  • PositiveLoadTest—To test the successful load of Customers, based on known data.
  • NegativeLoadTest—To test the unsuccessful load of Customers that do not exist in the database.
  • NegativeValidationTest—To make sure your validation logic is working correctly.

Clearly, the assumption is made that a database is in the testing stack.  This disqualifies the tests from being unit tests, because they include a dependency and test more code than the "unit": the BankAccount component.


    Automated testing is very important, but I must clarify some points in this article.  The testing described here is “integration testing”.  This kind of testing is important too, but it’s important to make the distinction.  Unit tests test at the unit level and exclude a database.  Unit tests are fast and can run on any workstation, not just a specially configured development environment.


    Integration tests include multiple components of a system and make sure the pieces work together.  In an integration test, it is appropriate to include multiple components including a database.  For unit testing, including a database is not appropriate.
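To make the distinction concrete, here is a sketch of what a true unit test of something like the article's BankAccount could look like.  The IAccountRepository interface and the stub are my own hypothetical constructions (the article does not show BankAccount's internals); the point is that the database dependency is replaced with an in-memory stand-in, so only the BankAccount logic is exercised.

```csharp
using System;

// Hypothetical dependency that would normally talk to the database.
// (IAccountRepository and the stub below are my own illustration.)
public interface IAccountRepository
{
    decimal GetBalance(string accountId);
}

// A stub: a dummy in-memory implementation, so no database is involved.
public class StubAccountRepository : IAccountRepository
{
    public decimal GetBalance(string accountId)
    {
        return 100m; // canned answer for the test
    }
}

public class BankAccount
{
    private IAccountRepository _repository;

    public BankAccount(IAccountRepository repository)
    {
        _repository = repository;
    }

    public bool CanWithdraw(string accountId, decimal amount)
    {
        return _repository.GetBalance(accountId) >= amount;
    }
}

public class BankAccountUnitTestSketch
{
    // In a real suite this would be an NUnit [Test] method.
    public static void Main()
    {
        BankAccount account = new BankAccount(new StubAccountRepository());
        Console.WriteLine(account.CanWithdraw("123", 50m));  // True
        Console.WriteLine(account.CanWithdraw("123", 150m)); // False
    }
}
```

A test like this runs in milliseconds on any workstation, because nothing outside the unit is in the call stack.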

    How to create a jump-to-page feature to enhance the DataGrid pager – level 200

    When displaying information in a grid on an ASP.NET web page, it's
    often a good idea to page the information if there is a lot of
    it.  For instance, if you have 10,000 rows to potentially display,
    it's unreasonable to expect the user to wait for all of it to
    download at once and then scroll forever.  It's wasteful,
    too.  Instead, page it in smaller chunks (10-50 rows) and let
    the user move between pages of information.  But when there
    are a lot of pages, you don't want the user to have to click multiple
    times to get to page 15.  If the information is sorted, a user will
    often know which page to jump to.  In this
    post, I'll cover how to add a "jump-to-page" feature to the
    paging mechanism of an ASP.NET DataGrid.
     
    First, let’s cover how to get paging working on a normal, stock
    DataGrid.  I’ve spoofed some data in the form of a string array,
    and I have my basic, numeric pager:
     
    <form id="Form1" method="post" runat="server">
      <asp:DataGrid AllowPaging="True" PagerStyle-Mode="NumericPages"
         PagerStyle-PageButtonCount="10" ID="grid" Runat="server" />
    </form>
     

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class WebForm1 : Page
    {
        protected DataGrid grid;

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            if (!IsPostBack)
            {
                grid.DataSource = this.GetSource();
                this.BindGrid(0);
            }
        }

        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);
            grid.PageIndexChanged += new DataGridPageChangedEventHandler(grid_PageIndexChanged);
        }

        private string[] GetSource()
        {
            //just imagine how you would get/cache your data source.
            string[] rows = new string[10000];
            for (int i = 0; i < 10000; i++)
            {
                rows[i] = i.ToString();
            }
            return rows;
        }

        private void BindGrid(int currentPageIndex)
        {
            grid.DataSource = this.GetSource();
            grid.CurrentPageIndex = currentPageIndex;
            grid.DataBind();
        }

        private void grid_PageIndexChanged(object source, DataGridPageChangedEventArgs e)
        {
            this.BindGrid(e.NewPageIndex);
        }
    }

    This works fine, but I have to click many times to get to page
    100 (10 clicks).  I would like to be able to punch in a number and
    jump to that page quickly.  Let's examine how I created some
    controls to insert into the DataGrid's paging mechanism.  The
    DataGrid doesn't provide a pager template, so we have to do a bit
    of manual control arrangement.  First, let's add our jump-to section:

    <form id="Form1" method="post" runat="server">
     <asp:DataGrid AllowPaging="True" PagerStyle-Mode="NumericPages"
      PagerStyle-PageButtonCount="10" ID="grid" Runat="server" />
     <asp:Panel ID="pagerEnhancement" Runat="server">
      Jump to page:
      <asp:TextBox ID="jumpToText" Runat="server" Width="25"></asp:TextBox>
      <asp:Button ID="jumpToButton" Runat="server" Text="Go"></asp:Button>
     </asp:Panel>
    </form>

    Notice that we just have a textbox and a button.  Here's the code that makes them change the DataGrid's page:

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class WebForm1 : Page
    {
        protected System.Web.UI.WebControls.TextBox jumpToText;
        protected System.Web.UI.WebControls.Button jumpToButton;
        protected System.Web.UI.WebControls.Panel pagerEnhancement;
        protected DataGrid grid;

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            if (!IsPostBack)
            {
                grid.DataSource = this.GetSource();
                this.BindGrid(0);
            }
        }

        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);
            grid.PageIndexChanged += new DataGridPageChangedEventHandler(grid_PageIndexChanged);
            jumpToButton.Click += new EventHandler(jumpToButton_Click);
        }

        private string[] GetSource()
        {
            //just imagine how you would get/cache your data source.
            string[] rows = new string[10000];
            for (int i = 0; i < 10000; i++)
            {
                rows[i] = i.ToString();
            }
            return rows;
        }

        private void BindGrid(int currentPageIndex)
        {
            grid.DataSource = this.GetSource();
            grid.CurrentPageIndex = currentPageIndex;
            grid.DataBind();
        }

        private void grid_PageIndexChanged(object source, DataGridPageChangedEventArgs e)
        {
            this.BindGrid(e.NewPageIndex);
        }

        private void jumpToButton_Click(object sender, System.EventArgs e)
        {
            int newPage = int.Parse(jumpToText.Text);
            this.BindGrid(newPage - 1); //The DataGrid needs a page index, zero-based.
        }
    }

    This works, but when I make my DataGrid pretty, I don’t want this
    paging attachment left out.  I need it to be inside the DataGrid,
    so let’s write some code to move it from outside the DataGrid to right
    next to the default pager:

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class WebForm1 : Page
    {
        protected TextBox jumpToText;
        protected Button jumpToButton;
        protected Panel pagerEnhancement;
        protected DataGrid grid;

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            if (!IsPostBack)
            {
                grid.DataSource = this.GetSource();
                this.BindGrid(0);
            }
        }

        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);
            grid.PageIndexChanged += new DataGridPageChangedEventHandler(grid_PageIndexChanged);
            jumpToButton.Click += new EventHandler(jumpToButton_Click);
            grid.ItemCreated += new DataGridItemEventHandler(grid_ItemCreated);
        }

        private string[] GetSource()
        {
            //just imagine how you would get/cache your data source.
            string[] rows = new string[10000];
            for (int i = 0; i < 10000; i++)
            {
                rows[i] = i.ToString();
            }
            return rows;
        }

        private void BindGrid(int currentPageIndex)
        {
            grid.DataSource = this.GetSource();
            grid.CurrentPageIndex = currentPageIndex;
            grid.DataBind();
        }

        private void grid_PageIndexChanged(object source, DataGridPageChangedEventArgs e)
        {
            this.BindGrid(e.NewPageIndex);
        }

        private void jumpToButton_Click(object sender, EventArgs e)
        {
            int newPage = int.Parse(jumpToText.Text);
            this.BindGrid(newPage - 1); //The DataGrid needs a page index, zero-based.
        }

        private void grid_ItemCreated(object sender, DataGridItemEventArgs e)
        {
            if (e.Item.ItemType == ListItemType.Pager)
            {
                // Used Trace to view the control tree and find out how to grab the pager cell.
                TableCell pagerCell = (TableCell) e.Item.Controls[0];
                pagerCell.Controls.Add(pagerEnhancement);
            }
        }
    }

    Notice how I am listening to the ItemCreated event of the DataGrid
    and then moving the control when I'm on the pager.  I used the
    Trace feature to observe the DataGrid's control tree and see how the
    pager was laid out.  The DataGrid is rendered as an HTML table, so I
    just needed to move my extra paging controls to the appropriate TableCell.

    Obviously, I have simplified this example to demonstrate the basics
    of adding to the default paging mechanism in the DataGrid.  When I
    use this technique, I make the appearance a lot more attractive, and I
    also validate that an actual page number was entered into the
    TextBox.  After all, I have to gracefully handle whatever the user
    types into that box. 
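For completeness, here is a sketch (my own, not from the original listing) of what that validation might look like in the jumpToButton_Click handler, using int.TryParse (available in .NET 2.0; on 1.1 you would catch FormatException instead) and clamping to the DataGrid's PageCount:

```csharp
private void jumpToButton_Click(object sender, EventArgs e)
{
    int newPage;

    // Reject non-numeric input without throwing.
    if (!int.TryParse(jumpToText.Text, out newPage))
    {
        return; // a validator or error label could report the problem here
    }

    // Clamp to the valid range so an out-of-bounds number can't break the grid.
    if (newPage < 1)
    {
        newPage = 1;
    }
    else if (newPage > grid.PageCount)
    {
        newPage = grid.PageCount;
    }

    this.BindGrid(newPage - 1); // the DataGrid page index is zero-based
}
```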

    Test-driven development panel at Innotech – level 000

    At the Innotech conference, there was a TDD panel with Mary Poppendieck and four members of the local developer community (two of them are my coworkers).  Mary clarified that TDD is as old as software development itself.  She relates that when she was coding assembly, she would create a testing framework to simulate the device she was writing for before she wrote any code.  Then, every bit of code was written with automated tests.  Here is a picture of the panel.  From left to right are Scott Bellware, Jeremy Miller, Jeff Smith, Bret Pettichord, and Mary Poppendieck.

    Attending, Innotech, a local Austin conference – level 000

    This Wednesday I was at the Innotech conference (no relation to Office Space), where I attended a session given by Mary Poppendieck, the author of Lean Software Development.  She elaborated on some principles of making software development more lean, starting with eliminating waste.  At lunch, we convinced her to come out to a local burger place and had a great time discussing software and Agile over burgers.  Coincidentally, every lunch attendee is a member of the AgileATX practitioners group.  Pictured, you'll see Jeffrey Palermo (me), Blake Caraway, Steve Donie, Scott Bellware, Bret Pettichord, Phred Menyhert, Jeremy Miller, and Mary Poppendieck.

    How to set up a productive working environment for Agile teams – level 200

    In Agile software development, there is a strong focus on collaboration.  Sending emails back and forth just doesn't cut it, and weekly meetings aren't enough.  Offices and cubicles can often stifle communication because they put physical barriers in the way.


    My team works in a single conference room with tables grouped together in the center.  Each person has a table as a desk, but all the tables touch and form a large rectangle (all the wires go in the middle).  When a conversation is going on, even non-participants can hear it, and often a non-participant can correct a misconception or challenge an assumption.  Having everyone physically together all the time helps keep momentum going.  We have 96 square feet of whiteboard space, but we are getting more so that every wall is covered in whiteboard: 288 square feet.


    Having everyone together also aids in pairing because pairing can be initiated at any time with great ease.  We use VNC for pairing, and that allows us to remain comfortable with a dedicated keyboard, mouse, and display.  The alternative would be rigging two of each to a single computer, and the setup and teardown of this configuration takes time.  Using VNC, we can share a workstation instantly. 


    It should be possible to start a meeting at any time; don't wait for a conference room to become available.  Any time a team member is waiting on something, it's a blocking scenario.  If someone is blocked, their time is being wasted, and the company is ultimately losing money: work is hindered and productivity is minimized.  Finding a meeting room is a blocker.  My team works _in_ a meeting room, so when we need to talk, we talk.  If we need to do some modeling, we just start drawing on the whiteboard.


    Many organizations have environments that directly kill productivity.  If you have to schedule meetings in advance, or if you ever find a whiteboard moment and have to search for a marker, you are wasting productivity.  Most of the time, management thinks that more meeting space and ample office supplies would cost unnecessary money, forgetting that their biggest business expense is payroll.


    The IT infrastructure of the company can also be a blocker, and management is usually the only entity with the authority to correct this, since there is often politics involved.  Organizations where server admins are shared among project teams are just asking for trouble.  Invariably, the server admin is always working "on the other guy's project."  A software product team can't achieve maximum velocity while server resources aren't adequate.  Other IT policies can get in the way as well: if every interesting website on the Internet is blocked by a proxy, a valuable development resource is likely to be blocked too.  Did you know that some software shops even have _newsgroups_ blocked!


    Continue reading about this topic with an essay from AgileModeling.com on "Organizing an Agile Modeling Room."

    How to use NMock to assist with unit testing – level 300

    When you unit test, it is important to mock dependencies.  You can use static mocks (which are really just dummy classes that implement your interface) or dynamic mocks, which can save you some code.  NMock is one of the frameworks for dynamic mocks.  It works by creating a class instance at runtime to match your specifications; you take the instance of this class and use it like any other object.  Below are several scenarios for using NMock to create runtime instances of an interface.  This also works for abstract classes and for concrete classes with virtual members.


    using System;
    using System.Security.Principal;
    using NUnit.Framework;
    using NMock;

    [TestFixture]
    public class NMockDemo
    {
        [Test, ExpectedException(typeof(VerifyException))]
        public void FailsBecauseWrongThingWasPassedIn()
        {
            IMock mock = new DynamicMock(typeof(IPrincipal));
            mock.ExpectAndReturn("IsInRole", true, "AdminRole");

            IPrincipal principal = (IPrincipal) mock.MockInstance;
            bool result = principal.IsInRole("TechRole");

            mock.Verify();
        }

        [Test, ExpectedException(typeof(VerifyException))]
        public void FailsBecauseMethodWasntCalled()
        {
            IMock mock = new DynamicMock(typeof(IPrincipal));
            mock.ExpectAndReturn("IsInRole", true, "AdminRole");

            IPrincipal principal = (IPrincipal) mock.MockInstance;

            mock.Verify();
        }

        [Test]
        public void Succeeds()
        {
            IMock mock = new DynamicMock(typeof(IPrincipal));
            mock.ExpectAndReturn("IsInRole", true, "AdminRole");

            IPrincipal principal = (IPrincipal) mock.MockInstance;
            bool result = principal.IsInRole("AdminRole");

            mock.Verify();
        }

        [Test]
        public void SetupResultThatCanBeCalledManyTimes()
        {
            IMock mock = new DynamicMock(typeof(IPrincipal));
            mock.SetupResult("IsInRole", true, typeof(string));

            IPrincipal principal = (IPrincipal) mock.MockInstance;

            mock.Verify();
        }

        [Test]
        public void HowToSetupMockPropertyWithNestedMock()
        {
            IMock mock = new DynamicMock(typeof(IPrincipal));
            mock.SetupResult("IsInRole", true, typeof(string));

            DynamicMock identityMock = new DynamicMock(typeof(IIdentity));
            identityMock.SetupResult("Name", "someUserName");
            mock.SetupResult("Identity", identityMock.MockInstance);

            IPrincipal principal = (IPrincipal) mock.MockInstance;
            Console.WriteLine(principal.Identity.Name);

            mock.Verify();
        }
    }
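For contrast with the dynamic mocks above, here is what a static mock (the hand-written dummy class mentioned at the start of this section) might look like.  StaticPrincipalMock is my own illustrative name, not part of NMock:

```csharp
using System;
using System.Security.Principal;

// A hand-written ("static") mock: a dummy class that implements the
// interface and returns canned answers. NMock's DynamicMock generates
// the equivalent of this class at runtime.
public class StaticPrincipalMock : IPrincipal
{
    private string _roleToAccept;

    public StaticPrincipalMock(string roleToAccept)
    {
        _roleToAccept = roleToAccept;
    }

    public IIdentity Identity
    {
        // GenericIdentity is a convenient built-in IIdentity implementation.
        get { return new GenericIdentity("someUserName"); }
    }

    public bool IsInRole(string role)
    {
        return role == _roleToAccept;
    }
}
```

Code under test that takes an IPrincipal can then be handed new StaticPrincipalMock("AdminRole").  The trade-off is maintenance: every interface member must be implemented by hand, which is exactly the code a dynamic mock saves you.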


    How to use SQL Server Express (where’s the UI?) – level 200

    If you are used to working with SQL Server 2000, you may expect to find tools similar to Enterprise Manager and Query Analyzer.  SQL Express is a free product, and it shows; you get an awful lot for free, but you don't get it all.  I think that's perfectly fine.  I'd rather have something for free than nothing.


    Here’s a few tips to get you started:



    • SQL Express installs as a local instance named "SQLEXPRESS", so your connection string needs to include the instance name:  ".\SQLEXPRESS" or "localhost\SQLEXPRESS".
    • SQL Server Configuration Manager isn't the UI you want for adding databases, tables, etc.
    • Use any of the Visual Studio Express products as your database UI.  They all have the database manager built in.

      • Use the Server Explorer window to add your database instance.  Then you can use it to add a new database and add tables to that database. 
      • If you have database create scripts, you can run them inside Visual Studio Express.  If you are used to hitting F5 in Query Analyzer, then you’ll want to map a shortcut key to the “Run Selection” command: Right click -> Run Selection.
      • You can create all your database objects here.
      • You can run and step through stored procedures for debugging.

    • You can also use osql.exe to manage your database.  This is useful when you want to automate database scripts using NAnt.
    • You have the option of how you want to connect to a SQL Express database:

      • Through the SqlClient provider: Data Source=localhost\SQLEXPRESS;Initial Catalog=MyNewTestDatabase;Integrated Security=True;Pooling=False
      • Through a file reference: Data Source=.\SQLEXPRESS;AttachDbFilename=C:\opensvn\development\ezweb\trunk\src\website\App_Data\ASPNETDB.MDF;Integrated Security=True;User Instance=True
      • If using ASP.NET, you already have a connection string you can use, LocalSqlServer:  Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|ASPNETDB.MDF;Integrated Security=True;User Instance=True

    See http://www.aspfaq.com/sql2005/show.asp?id=3 for more information.  There are other, non-Microsoft tools that can be used to manage SQL Express as well.
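To verify a connection string from the list above, a minimal console program might look like the following.  The database name MyNewTestDatabase is a placeholder carried over from the sample string; substitute your own:

```csharp
using System;
using System.Data.SqlClient;

public class SqlExpressConnectDemo
{
    public static void Main()
    {
        // ".\SQLEXPRESS" targets the default local SQL Express instance.
        string connectionString =
            @"Data Source=.\SQLEXPRESS;Initial Catalog=MyNewTestDatabase;" +
            @"Integrated Security=True;Pooling=False";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            Console.WriteLine("Connected to " + connection.DataSource);
        } // Dispose closes the connection
    }
}
```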


    UPDATE:  There is a November 2005 CTP of SQL Server 2005 Express Management Studio available.  It is linked right off the SQL Server Express download page.