Monday’s show comes to Austin – level -999


Carl Franklin and the crew are recording Monday’s show.  There is a live audience (not shown) here at Dave & Buster’s.  They’re staying in town all weekend, so they’ll have some time to relax and catch up on sleep.  We’re also going to see some Austin live music and grill some rib-eye steaks tomorrow.

Mark Miller has found several more people dumber than him, and a new character comes to the show.  And, of course, Richard has some cool new toys to talk about.
I wonder how long it will take Geoff to edit _this_ episode.

EZWeb 2.0 released – ready for your ASP.NET 2.0 host – level 100

ASP.NET 2.0 is here, and so is my version of EZWeb to go along with it.  EZWeb makes it super easy to take advantage of master pages and themes on your own website without coding them yourself.  Use EZWeb to run a website of any size.


For my 1.x releases, I used my custom theming and master-page-like controls, but now with the release of the .NET Framework 2.0, I’ve refactored these features to use the built-in features of ASP.NET 2.0.  You can build a standard master page as a template for your EZWeb website, and you can use themes (.skin and .css files) to modify the look of your website.  This release comes with a few templates, but you can easily customize your site with your own master page.  Download it now and give it a shot on any ASP.NET 2.0 host.
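To illustrate, here is a minimal sketch of the kind of master page you could use as a template.  The file layout and the ContentPlaceHolder ID are hypothetical examples for illustration, not names EZWeb requires:

```aspx
<%@ Master Language="C#" %>
<html>
<head runat="server">
    <title>My EZWeb Site</title>
</head>
<body>
    <form id="mainForm" runat="server">
        <!-- Shared layout lives here; each content page fills in the placeholder. -->
        <asp:ContentPlaceHolder ID="MainContent" runat="server" />
    </form>
</body>
</html>
```

Each content page then supplies an asp:Content control pointing at that placeholder, and the shared layout follows along for free.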


Download EZWeb 2.0 here.



Visual Studio 2005 Pro install was trouble-free – level 100

I pulled down the VS 2005 bits from MSDN overnight and installed them today.  I had the Beta 2 bits on my machine along with VS 2003.  I manually uninstalled everything that Beta 2 put on my machine and ran the VS 2005 installer.  It terminated with a message that I had missed uninstalling MSXML 6.0.  I got rid of that and reran setup.  It completed successfully in under an hour!  Beta 1 and Beta 2 both took several hours (I had them installed on my workstation as well).  Kudos to Microsoft for the improved install time.  I installed everything except Crystal Reports.  I even do a little J# dabbling every once in a while.  Java was my first OO language.


Now I can banish my Release Candidate and get to work with the real thing.  I already have a running list of workarounds for things that I did a little differently in VS 2003.  My use cases are definitely not typical MSDN demo fodder, and I often have to banish the IDE in order to take advantage of a runtime-only feature, so I’m often treading off the “happy path”.


DiscountASP will be one of the first to offer ASP.NET 2.0 hosting.  I’ll be trying them out.

How to solve the ever-common “Parser Error Message: Access is denied: ‘mydll’.” – level 200

We’ve all been halted in ASP.NET development when we seemingly do a normal compile, and then our website won’t work.  It won’t even start up.  Here’s roughly the error you may see:
————————————————————————


Server Error in ‘/MyWebApp’ Application


Configuration Error


Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.

Parser Error Message: Access is denied: ‘mydll’.
Source Error:

Line 169:   <add assembly="System.Drawing, Version=1.0.3300.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
Line 170:   <add assembly="System.EnterpriseServices, Version=1.0.3300.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
Line 171:   <add assembly="*"/>
Line 172: </assemblies>
Line 173:

Source File: c:\winnt\microsoft.net\framework\v1.0.3705\Config\machine.config Line: 171
Version Information: Microsoft .NET Framework Version:1.0.3705.0; ASP.NET Version:1.0.3705.0
———————————————————-


Up until this point, your website worked just fine.  It has happened to me right after making an update to the web.config file.  What causes this is that the website’s AppDomain tries to load and can’t.  A change to web.config causes a recycle of the AppDomain and a reload of the website.  When the website loads, a copy of every assembly is placed in the Temporary ASP.NET Files directory.  If that directory is locked, or files in it are locked, this process can’t complete, and we see this error.


The most common cause is the Windows Indexing Service.  I don’t really need this service, so I’ve disabled it, and I don’t get this error anymore.  Strictly speaking, any process that locks files in that directory can cause this error (virus scanners, Google Desktop Search, and so on), but I’ve only ever traced it to the Windows Indexing Service.


Microsoft, in this Knowledge Base (KB) article, recommends exactly that: disabling the Indexing Service.
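If you want to script that fix, the Indexing Service runs under the service name “cisvc” on Windows 2000/XP.  These are the standard Windows service-control commands; run them from an administrator command prompt:

```
net stop cisvc
sc config cisvc start= disabled
```

The first command stops the running service; the second keeps it from starting again on reboot.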

ASP.NET Rendered Controls vs. Composite Controls – I prefer to avoid rendered controls – level 200

Here, I’ll present a simple example of the same control built with the rendered method and with the composite method.  For those new to custom web controls, a rendered control is a class inherited from “Control” or “WebControl” that overrides the “Render” method in order to send custom markup to the browser.  A composite control uses other controls in a hierarchy and leaves the rendering to each child control.


Here is a simple rendered control:



    public class RenderedControl : Control
    {
        protected override void Render(HtmlTextWriter writer)
        {
            writer.AddAttribute(HtmlTextWriterAttribute.Id, "myID");
            writer.RenderBeginTag(HtmlTextWriterTag.Div);
            writer.AddAttribute(HtmlTextWriterAttribute.Id, "innerID");
            writer.RenderBeginTag(HtmlTextWriterTag.Span);
            writer.Write("This text is the meat of this control.");
            writer.RenderEndTag();
            writer.RenderEndTag();
        }
    }


It’s pretty straightforward, and the output will always be this:



    <div id="myID">
        <span id="innerID">This text is the meat of this control.</span>
    </div>


Here is the same thing using the composite method:



    public class CompositeControl : Panel
    {
        protected override void CreateChildControls()
        {
            base.CreateChildControls();

            this.ID = "myID";

            Label lbl = new Label();
            lbl.ID = "innerID";
            lbl.Text = "This text is the meat of this control.";

            this.Controls.Add(lbl);
        }
    }


With the composite method, the developer doesn’t have to worry about the low-level markup being rendered and can depend on the richer object-oriented contract of each child control.  With this method, the developer overrides the CreateChildControls() method and adds child controls, and those child controls render themselves.  The exact same markup is produced.


In my opinion, the composite control method optimizes development time, and development time is the most expensive resource more often than not.  I don’t give weight to the performance argument because the difference in performance is on the millisecond scale.  The moment you call out-of-process to a database or the file system, you accept a performance hit of a much greater magnitude; therefore, the performance difference between these two methods is negligible.  The composite control method is more maintainable, and maintenance of an application always costs more.

Visual Studio Team System “unit?” testing, integration testing and DDT – Development-Driven Testing – level 200

Dr. Neil calls Microsoft out for their attempt to redefine Test-Driven Development to suit the product they have developed (Team System).  I personally think VSTS will help a great number of teams define a process for their group.  Many development teams I talk to don’t have an actual process; it’s more of the wild west.  I use a mixture of XP and Scrum, and our Agile practices are very structured and disciplined.  Waterfall falls somewhere in the middle on a discipline scale (and far out there on a reality scale).  For cowboy teams, or teams whose process isn’t reliably producing working software, I think VSTS will help to enforce a more disciplined and accountable software shop.  As with any product, it will be both abused and used for good.  Kudos to Microsoft for putting this product together.


With all the good that it will bring with its increased focus on automated builds, static code analysis and automated testing, it has missed the boat on a few items regarding its automated testing support, or at least what it claims that support to be.  VSTS does come with testing support, and it can generate test stubs from production code, but it does not inherently support test-driven development.  TDD is simple:



  • RED (write a test and watch it fail)

  • GREEN (write production code to make the test pass)

  • REFACTOR (eliminate duplication)

It’s a cycle of one test to one production scenario, and TDD is test-first unit testing that helps shape the design of the production code as it emerges.  Development is “test-driven”.  That means that if you are developing and then generating tests, you are doing testing, but your development wasn’t “test-driven”; it was “test-trailing”, or whatever term you want to use.
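As a concrete sketch of that cycle, here is what red and green might look like with NUnit-style attributes.  The class and method names are invented for illustration:

```csharp
// RED: write the test first.  PriceCalculator doesn't exist yet,
// so this won't even compile -- that's the first "failure."
[TestFixture]
public class PriceCalculatorTests
{
    [Test]
    public void AppliesTenPercentDiscount()
    {
        PriceCalculator calc = new PriceCalculator();
        Assert.AreEqual(90m, calc.ApplyDiscount(100m, 0.10m));
    }
}

// GREEN: write just enough production code to make the test pass.
public class PriceCalculator
{
    public decimal ApplyDiscount(decimal price, decimal rate)
    {
        return price - (price * rate);
    }
}

// REFACTOR: with the test green, eliminate any duplication
// before writing the next failing test.
```

The test existed before the production class did; that ordering is the whole point.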


VSTS testing tools will generate test stubs from production code, but they won’t generate a production method stub from a test (which is a well-used feature of ReSharper).  If you look at the process guidance on TDD from Microsoft, you’ll notice the attempt to redefine the process to fit the tool.  They should modify the tool to fit the already-proven process.


Another fallacy exists in the guidance on creating “ASP.NET unit tests”.  Microsoft uses the term unit testing, but they describe integration tests.  Refer to Jeremy Miller’s post about the different types of tests.  Unit tests test code in isolation.  In other words, they test code while isolating its dependencies.  Unit tests should always pass.  Unit tests only run custom code.  If 3rd-party libraries or frameworks are required for a test to pass, then it is, by definition, not a unit test; it is an integration test.  The term “ASP.NET unit test” is therefore a fallacy.  These tests, as Microsoft describes them, include the development web server in the testing stack.  That is a significant dependency, and these are integration tests.  Integration tests are important too, but they should be named correctly.
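To make the distinction concrete, here is a sketch of a true unit test.  The interface, stub and class names are invented for illustration; the point is that the real database (or web server) never enters the testing stack:

```csharp
// The production class depends on an abstraction, not on a database.
public interface ICustomerRepository
{
    int CountCustomers();
}

public class CustomerReport
{
    private ICustomerRepository _repository;

    public CustomerReport(ICustomerRepository repository)
    {
        _repository = repository;
    }

    public string Summary()
    {
        return "Customers: " + _repository.CountCustomers();
    }
}

// The unit test substitutes a stub for the real repository,
// so only custom code runs and the test can always pass.
public class StubRepository : ICustomerRepository
{
    public int CountCustomers() { return 42; }
}

[TestFixture]
public class CustomerReportTests
{
    [Test]
    public void SummaryReportsCustomerCount()
    {
        CustomerReport report = new CustomerReport(new StubRepository());
        Assert.AreEqual("Customers: 42", report.Summary());
    }
}
```

Swap the stub for a real repository hitting SQL Server and the very same test becomes an integration test.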

I agree with James Shore about Continuous Integration, but I disagree about the value of Cruise Control – level 200

James Shore writes that “Continuous Integration is an Attitude, Not a Tool”.  I agree with the first half of his post.  Continuous integration _is_ a shared understanding among team members that every commit is accompanied by a full integration and that commits happen very often.  I disagree, however, with the second half, where he slams Cruise Control.  Mr. Shore contends that CC is a crutch, and that if you use it, “. . . it’s a clear sign that you have opportunities to improve”.  I don’t disagree that my team has opportunities to improve, but the mere use of CC doesn’t make that true.  I agree with every one of his ideas:



  • Team members ensure the build always works.

  • When we get the latest code from the repository, it will always build successfully and pass all tests.

  • We will check in our code every two to four hours.

  • Cruise Control is a traffic cop that tells you when you broke the build.

Mr. Shore’s main point is that you don’t need the traffic cop if you don’t break the build.  I envy Mr. Shore if his team has never experienced a broken build.  He makes a false assumption about users of CC: that, unlike his team, they don’t run the build locally before committing.  If that assumption were correct, then CC would indeed always be the entity that reports problems.


My team’s routine is similar to Mr. Shore’s, but with the inclusion of CC.  Before committing to the source code repository, we update to merge in the latest code.  Next, we run the full build locally, and we make our commit ONLY if it builds.  CC then is just an automated, independent build that should always pass.  A broken build at the CC level is _always_ a surprise.  Speed of the build is important, and it should constantly be improved.  We focus on build speed since it is run locally as well as by CC.
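For anyone curious what that kind of setup looks like, here is a rough sketch of a CruiseControl.NET project configuration.  The server URL, project name and build file are hypothetical:

```xml
<cruisecontrol>
  <project name="MyApp">
    <!-- Watch the repository; any commit triggers an integration build. -->
    <sourcecontrol type="svn">
      <trunkUrl>http://svnserver/myapp/trunk</trunkUrl>
    </sourcecontrol>
    <triggers>
      <intervalTrigger seconds="60" />
    </triggers>
    <!-- Run the same build script we run locally before committing. -->
    <tasks>
      <nant>
        <buildFile>myapp.build</buildFile>
      </nant>
    </tasks>
  </project>
</cruisecontrol>
```

Because the server runs the identical build script we run at our desks, a red build on the server really does mean someone skipped a step.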


CC may just be a traffic cop, but it saves time because it automates the steps of going to a different server to do an integration build, and it notifies early if one of us makes a mistake.  Hopefully we don’t have to depend on Cruise Control (since our build should always work anyway) or the debugger (since the code should always work too).  They may be used infrequently (both CC and the debugger), but they are valuable tools to have available.

Low disk space: Windows will stop working correctly – level 100

Here’s a little tip from experience: if a server starts acting funny, check the disk space.  If you are almost out of space, Windows won’t be able to function correctly.  Symptoms may include slow Remote Desktop logins or a host of other “weird” things that you are unable to explain.  In fact, some .NET applications may fail to run with a “Can’t find XXX assembly” error.  The assembly is where it’s supposed to be, but if .NET can’t allocate space for the shadow copy, then it can’t start the application (yes, I’ve had this happen to me).


The more free space the better, and if the server is down to mere megabytes of free space on the main drive, ask your IT people to correct it before trying to do anything with that server.  You’ll waste your time if you really need the server for something and it won’t cooperate.  You can beg and plead as much as you want, but a server without free disk space won’t care that you have a major deadline rapidly approaching.
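If you’d rather check than guess, the DriveInfo class (new in .NET 2.0) makes the free-space check trivial.  The drive letter here is just an example:

```csharp
using System;
using System.IO;

public class DiskCheck
{
    public static void Main()
    {
        // AvailableFreeSpace reports the bytes free to the current user.
        DriveInfo drive = new DriveInfo("C");
        long freeMegabytes = drive.AvailableFreeSpace / (1024 * 1024);
        Console.WriteLine("Free space on C: " + freeMegabytes + " MB");
    }
}
```

Drop something like this into a scheduled task and you can find out about a filling drive before Windows starts acting funny.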