Archive for the 'Visual Studio 2005' Category

Hudson – Continuous Integration Testing

For a while now we’ve been planning on making use of Hudson to help us maintain working builds in our source code configuration system.

What Hudson does is build your software on a predefined schedule (like a SQL Server Agent job or a Windows scheduled task), or on request by a user, and produce a dashboard showing the status of your software builds. It is highly configurable, and can link to many source code control systems to fetch the latest version of your code.

Even though this product was/is/seems to be (I’m not sure on this) aimed at the Java developers of this world, it can build pretty much any software package.

We intend to make use of it to build .NET applications by getting it to call MSBUILD via DOS batch files (in my example). I will point out, however, that you can also get Hudson to call MSBUILD through an MSBUILD plug-in, which is not covered in this post.

I also want to use another plug-in, the ‘Text Finder’ plug-in, to parse the output of the batch file for StyleCop errors and build failures.

Please have a read of this: Meet Hudson, so you can get a proper introduction to it.

To be perfectly honest, I’m not a Java fan (as my colleagues will tell you ;-))… and I thought it was a little overkill to have to install a Java Runtime, Apache Tomcat and Hudson just to perform automatic builds… however, it has been relatively painless… and as my organisation is not exactly keen to splash the cash at the moment, the fact that everything is free helps…

Those of us who are risk averse have to put aside the fact that it’s all open source… and the only support out there is through volunteers :-)… there’s no metaphorical stick to hit someone with if the product doesn’t ‘just work’ out of the open source box.

What I downloaded

  1. Java Runtime Environment (jre-6u14-windows-i586)
  2. Tomcat 5.5 (apache-tomcat-5.5.27)
  3. Hudson (hudson.war v1.314)
  4. Text Finder plug-in (text-finder.hpi v1.7)

NB: If you are using IE7/8 to download hudson.war and text-finder.hpi you may find that the extension is changed to .zip. You will need to rename the files back to their original extensions for the installation to work correctly.

Setting up Hudson on Windows 2003/2008 Server alongside IIS

Obviously, since Apache Tomcat and Hudson are Java-based applications, if you don’t have a Java Runtime Environment (JRE) you’ll need to install that first. It is relatively painless: simply double-click the EXE installer and Bob’s your uncle.

I had my doubts about getting Tomcat 5.5 working alongside IIS, but to my surprise (I haven’t touched Tomcat since my university days back in 2001) the installer worked well, and defaulted to using port 8080, which is nice, since I don’t want to get in the way of IIS on port 80. The installer *should* also detect whether you’ve got a JRE installed and link to it automatically. If it doesn’t, simply point the installer at the folder where you installed the JRE. You’ll also need to set up an administrator account to access the Tomcat manager.

The Tomcat application server should be alive after the install (possibly following a reboot in certain circumstances) and you should be able to navigate to its front page.

Figure 1: The tomcat front page

Figure 2: Entering the Tomcat manager

Open the Tomcat manager to instigate the Hudson installation. It will prompt you for the username and password you set up during the Tomcat installation before you can access the page.

Scroll down to the Deploy section; in the ‘WAR file to deploy’ box click Browse, and select the hudson.war file we downloaded earlier.

Figure 3: Deploying the Web Archive (WAR) for Hudson

Click the deploy button and the application should be deployed. If you put the WAR file in a part of the file system that gives a long file path, e.g. ‘C:\longfilepath\longfilepath\longfilepath\longfilepath\longfile\hudson.war’ you may have issues with the deployment. I certainly encountered this issue last week. The error message you will get isn’t the most useful, so it’s worth moving it to the root of a drive to see if that solves it.

To confirm successful deployment, look at the application list

Figure 4: the application list, showing Hudson

If you click on the hyperlink ‘/hudson’ it should take you to the front page of the Hudson application.

Figure 5: The Hudson ‘dashboard’

You are now ready to go… as you might have noticed, I’ve already created a job, ‘Test 001’. This is the build I’ll use to walk you through the rest of this post.

As I’m using the ‘Text Finder’ plug-in, you’ll now need to install that if you want to follow my example.

Figure 6: Managing Hudson, and adding a plug-in

Click ‘Manage Hudson’ and then ‘Manage Plugins’. Click the ‘Advanced’ tab and scroll to the bottom of that page until you see the following:

Figure 7: Uploading a plug-in

Now click the Upload button and, once the upload has finished, restart the Tomcat service. If you don’t perform a restart the plug-in won’t be shown as installed.

Figure 8: list of installed plug-ins

Once installed, you should see it in the list of installed plug-ins.

We can now go about creating the job that will build the .NET application.

I’ve got a really simple .NET 3.5 website application (it does nothing other than to display default.aspx) that I’m using for this post.

Figure 9: Visual Studio 2008 Web Application, the working folder on E: drive and the batch file in the root of the web application folder.

The batch file that Hudson will call is very simple, and I suspect it could be done better, however, here it is if you want to make use of it:

echo set up the Visual Studio 2008 build environment
call "%VS90COMNTOOLS%..\..\VC\vcvarsall.bat"

echo call msbuild on Test001.csproj (looks in the directory of this batch file for it)
call msbuild "%~dp0Test001.csproj"

Navigate to the Hudson ‘dashboard’/front page and click ‘New Job’.

Provide Hudson with a name for your job, and select ‘Build a free-style software project’.

Figure 10: Free style software project selection

Figure 11: Adding build steps

Leave everything else as standard for now, click ‘Add build step’ and select the ‘Execute Windows batch command’ option.

Enter the path to the batch file (as shown in Figure 12)

Figure 12: entering the batch file details into build step

The next step is to configure the ‘Text Finder’ plug-in to look for the token ‘FAIL’, since MSBUILD produces messages containing the word ‘FAIL’ (such as ‘Build FAILED’) when a build breaks.
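Conceptually, all the Text Finder check does is scan the console output for a token or regular expression. A minimal Python sketch of the idea (the function name and sample log lines are mine, not Hudson’s; I’ve used a simple case-sensitive match here):

```python
import re

def build_failed(console_output, token="FAIL"):
    """Flag the build as failed if the token appears anywhere
    in the console output (simple case-sensitive match)."""
    return re.search(token, console_output) is not None

print(build_failed("Build succeeded.\n    0 Error(s)"))  # False
print(build_failed("Build FAILED.\n    1 Error(s)"))     # True
```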

Figure 13: configuring Hudson to look for the token ‘FAIL’ in the console output.

Click the Save button, and your job has been created!

Navigate back to the Hudson dashboard, and click the ‘build’ icon next to Job ‘Test 001’ (as shown in Figure 14)

Figure 14: Instigate a build

If the build was successful, when you refresh the page, you should see this:

Figure 15: The sunny picture indicates a very stable build

To demonstrate how Hudson picks up on failed builds, I’m now going to rename the code-behind page for default.aspx from ‘default.aspx.cs’ to ‘breakbuild.aspx.cs’.

Figure 16: Deliberately breaking the build

Using Hudson, run the job again.

Figure 16: The cloudy picture indicates a failure has occurred

The job has failed. The more the job fails, the worse the weather gets :-)

Run it a few more times to get more bleak weather (unless you like thunderstorms).

Figure 17: thunderstorms indicate that most recent builds have all failed

You can review the console output of all the builds that have taken place to help you diagnose failed builds.

Figure 18: review console of failed builds

As you can imagine, the Text Finder plug-in and the numerous other plug-ins available make Hudson a very powerful tool.

I intend to set ours up so it will notify the development team when the latest version of a system checked into our source control system fails to build, or contains StyleCop warnings.


InfoPath and SharePoint versus ASP.NET and a Traditional Database versus ASP.NET using SharePoint as a database technology

I was recently asked by a colleague:

“I’ve got to build a new application to support x (an anonymous set of requirements that I cannot divulge here!), I’ve not got long to do it, and my developer resources are thin on the ground. I’ve heard you talk about SharePoint and InfoPath, and need to call on your experience, do you think I could develop my application using those two technologies? It requires a complex interface layer and needs to be able to provide neat looking reports.”

Okay, I said, I’ll give you my experiences in the form of some potential solutions with their pros and cons. I realise that by posting this I’m likely to anger the gods and provoke some really large debate… but that was my plan all along :-)


So your decision is basically between three development strategies/options:

  1. InfoPath and SharePoint 2007 (MOSS)
  2. ASP.NET and MOSS
  3. ASP.NET and SQL Server 2005

This means the first step is to consider the requirements for the interface layer (IL)… ask yourself: will the user want to do anything fancy on the front end? e.g. sorting data grids, combo boxes, interfacing with external systems. If the answer is yes, then you’ll probably want to consider an ASP.NET front end.

If the user really only requires a simple form, then InfoPath is a good choice for the IL… but to make the waters even murkier you’ll need to consider the storage/reporting requirements, as InfoPath on its own only offers XML-based storage: on disk, in email, or in a SharePoint forms library. ASP.NET forms are more flexible and let you store the data in a SharePoint list, a database or, if you really wanted, an XML file.

InfoPath pros and cons

Pros:

  • Forms can be produced by pretty much anyone with no training
  • Simple to build prototypes (quick and cheap)
  • Easy for users to use and understand
  • Allows offline editing (by saving the form to the local hard drive)
  • Doesn’t need to be designed in detail before development can be started

Cons:

  • Which version of InfoPath does your corporate desktop/laptop build support? InfoPath 2003 is getting a little tired now (this means it’s old, won’t support newer controls, and will limit the ‘code behind’ that you can produce)
  • InfoPath does not allow you to build flexible, custom interfaces
  • Can’t reuse rules from other forms without having to recreate them
  • Rules are difficult to navigate/debug
  • Difficult to migrate (without reworking the forms)
  • If used in conjunction with a SharePoint forms library, the coupling is very tight, so if you move or rename the site you might have to alter the form

ASP.NET pros and cons

Pros:

  • Can do whatever you like (within reason) as you have access to .NET 3.5 [this includes things like sending email etc.]
  • Can produce flexible interfaces
  • Easy to debug using Visual Studio
  • Can reuse code and layouts using classes and master pages
  • Can interface with SharePoint, SQL Server, Oracle, XML and lots of other ODBC-compliant technologies

Cons:

  • Requires that the developers have ASP.NET training
  • Prototypes take longer to build than in InfoPath
  • Does not allow offline use without extensive development of a side-by-side offline system
  • Users may require training if something is ‘specialised’
  • You need to design the pages (if you want a sensible solution)

You can also have a read of my earlier blog post, where I’ve linked to Jason Apergis’ blog; he explains the pros and cons in a workflow context, though he decides that InfoPath is better for his organisation.

Now you can compare traditional databases and SharePoint

SharePoint pros and cons

Pros:

  • Easy to build sites and site collections (quick and cheap-ish)
  • Has a plethora of web parts that can be dragged and dropped by novice users to create dynamic content
  • Links well with InfoPath
  • List items can be produced via the MOSS API and Web Services from other technologies such as ASP.NET
  • Sites can be generated through the MOSS API
  • Does rudimentary version control (albeit not in the best possible way… perhaps this isn’t a pro after all :-))
  • Can create production-level sites/storage facilities without a detailed design

Cons:

  • It should not be used like a traditional database (… and can’t really be used like one either, as it can’t do joins between lists)
  • Difficult to report from MOSS lists and libraries; although you can use Reporting Services to query lists, it is generally more difficult than plain SQL queries
  • Uses lots of hard drive space (the MOSS database grows quite large)
  • It is not straightforward to migrate from a dev environment to a live environment

Traditional Database (e.g. SQL Server 2005) pros and cons

Pros:

  • Very flexible
  • Can use proper joins and sorts
  • Links very well with Reporting Services to produce powerful outputs
  • Links very well with ASP.NET and other .NET technologies

Cons:

  • Requires a detailed design (or not… but don’t do that to yourself!)
  • Can’t be used directly with InfoPath
  • Requires both a production and a dev server in an ideal world

Okay, so if you read between the lines… I think you should go for options 2 or 3… preferably 3.

The perception is that it’s quick and cheap to use InfoPath and SharePoint… and that perception is right 90% of the way. You’ll find that once you’ve done that 90%, the last 10% will take you an absolute age, and will probably consist of workarounds, squirming out of meeting requirements and swearing at the computer.

The decision is yours, so be pragmatic, and assess the requirements in front of you, and ask difficult questions to try to ascertain whether any potential requirements creep puts you in the ASP.NET frame or the InfoPath frame. If reporting is a major player, I would urge you to think about using SQL Server and Reporting Services.

I hope this has helped you a little bit anyway, good luck :-)


Potential Issues when deploying a SmartObject that integrates with the SharePoint ServiceObject

Why am I writing this particular article? Well, I had this problem and wasted the best part of a day trying to solve it… and it wasn’t exactly straightforward.

This issue only occurs when the following are true:

  1. You have created a SmartObject that interacts with the SharePoint ServiceObject, e.g. calls the service object to get a list of users that belong to a SharePoint group
  2. You build your deployment package on your development server and install it on your destination server via the console, i.e. using msbuild rather than Visual Studio to do the deployment
  3. You have SharePoint (MOSS) installed on your development server, and on your destination server
  4. You have K2 BlackPearl server installed on your development server and on your destination server

How you know you’ve got this issue:

  1. You will receive errors when running msbuild, with the error messages saying that it cannot find your SharePoint URL even though you know it exists on the destination server
  2. Any SmartObjects that interact with SharePoint will fail to deploy

Why is there an issue, you ask… well, it’s relatively simple. When you interact with the SharePoint service object, the K2 Service Broker looks for the GUID representing your SharePoint instance… when you deploy to another server where another copy of SharePoint is installed, this GUID will be different, so the deployment will fail.

To explain in more detail, follow the steps below.

Figure 1: The Service Broker Management application

On your development server: Navigate to the install folder for K2 [blackpearl] and find the BrokerManagement.exe (Shown in Figure 1). Execute the BrokerManagement application.

Figure 2: the K2 [blackpearl] broker management application front end

Click on ‘Configure Services’ as shown in Figure 2.

Figure 3: Finding your SharePoint instance GUID

You will now need to look for your SharePoint instance GUID. I’ve shown an example of this in Figure 3.

If you repeat this process on the destination server you’ll see that the SharePoint instances will have differing GUID values.

The dirty fix:

Do a search and replace (in the files created by the deployment package), swapping the GUID from the development server for the one on the destination server, to ensure correct deployment using msbuild.
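The dirty fix can be scripted. Here’s a rough Python sketch of the idea (the file pattern and function names are my own, and you should check which deployment files actually contain the GUID before running anything like this):

```python
import pathlib

def swap_guid(text, dev_guid, dest_guid):
    """Replace every occurrence of the development server's
    SharePoint instance GUID with the destination server's."""
    return text.replace(dev_guid, dest_guid)

def fix_deployment_files(root, dev_guid, dest_guid, pattern="*.xml"):
    """Apply the swap to every matching file under the deployment
    package folder; returns the number of files changed."""
    changed = 0
    for path in pathlib.Path(root).rglob(pattern):
        text = path.read_text()
        if dev_guid in text:
            path.write_text(swap_guid(text, dev_guid, dest_guid))
            changed += 1
    return changed
```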

I believe K2 Labs are currently working on a fix for this issue.


How to debug (in real-time) a K2 [blackpearl] workflow using Visual Studio breakpoints

Now why would you ever need to go to the trouble of debugging the code that a wizard produced, or to validate some logic in this long-winded fashion, I hear you asking…

Well, when you’ve pulled your hair out for too long and the K2 support ticket guys are asking difficult questions this is definitely the way to look more deeply into the goings on in your K2 [blackpearl] workflow.

Figure 1: Attach to process option 

The first thing you need to do is to fire up Visual Studio (K2 Designer) and go to a piece of code that you want to debug. I’ve selected a code event that I’ve created myself, but there’s no reason why you can’t stick a breakpoint inside a wizard. This is shown in Figure 1, as is the attach to process option that you’ll need to use to kick off the debugger.

Figure 2: Attaching to the K2HostServer.exe process

Once you’ve clicked the attach to process option you’ll need to find the K2HostServer.exe process (as shown in Figure 2) and select it.

Don’t click the Attach button yet.

Figure 3: Ensuring ‘Managed Code’ is selected

Make sure the ‘Attach to’ option only says Managed Code as shown in Figure 3. You can change this using the ‘Select’ option on the attach to process dialog.

Figure 4: Security Warning message

You might see the security warning shown in Figure 4, or an error message if you don’t have rights to attach to the K2HostServer.exe process. You probably need to be at least an administrator on the server. Ignore the security warning and click ‘attach’ to carry on.

Figure 5: Visual Studio Breakpoint warning after attachment

You’ll now see Visual Studio frantically loading assemblies in the background - this is normal. You’ll also notice that your breakpoint now displays a warning similar to that shown in Figure 5. This is okay; it’s just Visual Studio telling us that our code is not yet in context - you need to initiate your workflow and reach the point where your breakpoint exists before it becomes a valid breakpoint.

Figure 6: The breakpoint is now active! 

Figure 6 shows an active breakpoint as the workflow has reached the part of the code!

You can now use the watch window to look at variables.


Debugging K2 [blackpearl] using the K2 Server Console

I’ve been trying to help the guys at K2 diagnose a problem with my workflow, and it seems one of the trade secrets of doing this stuff is to put a ‘Console.WriteLine’ (standard C# syntax) statement in the relevant place and use the K2 [blackpearl] server console to view the output produced.

So the first thing you need to do is to select a workflow event that you want to debug.

Figure 1: selecting an event to debug in Visual Studio

Figure 1 shows how to get to the C# code. You’ll need to identify where in the code to put the ‘Console.WriteLine’.

Figure 2:  The K2 [blackpearl] server console

Next you’ll want to see the K2 [blackpearl] server console - to do this:

  1. Stop the K2 [blackpearl] server service using the management console services page
  2. Run the K2 [blackpearl] server by navigating to the start menu and right clicking on the shortcut to the K2 [blackpearl] server.
  3. Select the ‘Run As…’ option (you can also locate the executable and do as described: here)
  4. Enter the credentials for the K2 [blackpearl] service account that you specified when installing K2 [blackpearl] on your server
  5. You should now see a window similar to that shown in Figure 2.

If you fail to give the correct credentials, or try to run as the ‘current user’ (i.e. you!), the server will probably not start: you’ll see an error message briefly, and then the server console will close.

Note: I am assuming a development server setup, where the K2 [blackpearl] server is local to the machine that I am running Visual Studio on; if your K2 [blackpearl] server is running on another machine you’ll need to run the K2 [blackpearl] console on that particular machine.

Figure 3: Example of a debug statement inside a workflow event 

Figure 3 shows some example debug code, and if you look carefully, you’ll see my debug being produced in Figure 2! :-D

The K2 [blackpearl] server console will be very useful to me in the future, and it can be used to diagnose failed deployments, authentication (Kerberos) issues and workflow logic issues (if you know what you are looking for…. but it’s not rocket science).


Getting the file name of an InfoPath attachment

There will come a time, let me tell you :-), when you’ll want or need the file name of the document that one of your users has attached to your InfoPath form.

You’d think that this functionality would come out of the box… but no!

There’s no easy way of doing this, so I’ve had to knock up some simple C# code to go behind my InfoPath forms. See below:

    public void RetrieveFilename()
    {
        try
        {
            // TODO: Verify correct XPath for this InfoPath form
            IXMLDOMNode opnXN =
                thisXDocument.DOM.selectSingleNode("/my:myFields/my:attachedDoc");

            byte[] attachmentNodeBytes = Convert.FromBase64String(opnXN.text);

            // Position 20 contains a DWORD indicating the length of the
            // filename buffer. The filename is stored as Unicode so the
            // length is multiplied by 2.
            int fnLength = attachmentNodeBytes[20] * 2;
            byte[] fnBytes = new byte[fnLength];

            // The actual filename starts at position 24 . . .
            for (int i = 0; i < fnBytes.Length; i++)
                fnBytes[i] = attachmentNodeBytes[24 + i];

            // Convert the filename bytes to a string. The string
            // terminates with \0 so the actual filename is the
            // original filename minus the last character!
            char[] charFileName = Encoding.Unicode.GetChars(fnBytes);
            string fileName = new string(charFileName);
            fileName = fileName.Substring(0, fileName.Length - 1);

            // TODO: Verify correct XPath for this InfoPath form
            // (my:fileName is a placeholder for your own field)
            thisXDocument.DOM.selectSingleNode("/my:myFields/my:fileName").text
                = fileName;

            // The file content is located after the header, which is 24 bytes
            // plus the length of the filename.
            byte[] fileContents =
                new byte[attachmentNodeBytes.Length - (24 + fnLength)];
            for (int i = 0; i < fileContents.Length; ++i)
                fileContents[i] = attachmentNodeBytes[24 + fnLength + i];

            // TODO: Add correct MOSS site URL
            string SiteURL = "http://my/sites/newse/newdoclib/" + fileName;
            SPWeb site = new SPSite(SiteURL).OpenWeb();
            site.Files.Add(SiteURL, fileContents);
        }
        catch (Exception e)
        {
            // TODO: Write a better error handler
        }
    }

    // The following function handler is created by Microsoft Office.
    // Do not modify the type or number of arguments.
    [InfoPathEventHandler(MatchPath = "/my:myFields/my:attachedDoc",
       EventType = InfoPathEventType.OnAfterChange)]
    public void attachedDoc_OnAfterChange(DataDOMEvent e)
    {
        if (e.IsUndoRedo)
        {
            // An undo or redo operation has occurred
            // and the DOM is read-only.
            return;
        }

        // A field change has occurred and the DOM is writable.
        RetrieveFilename();
    }

Thanks should go to Pranab Paul for providing the initial idea for this code snippet. Also note that you’ll need to add the various using statements for the XPath, XML and SharePoint classes to be available.
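For what it’s worth, the header layout the code relies on can be sanity-checked in a few lines of Python against a synthetic blob. This models only the fields the code reads - the length byte at position 20 and the filename at position 24 - not the full InfoPath attachment header, and the sample filename is made up:

```python
def decode_attachment(blob):
    """Decode an InfoPath file-attachment blob: byte 20 holds the
    filename length in UTF-16 characters (including the trailing
    NUL), and the filename itself starts at byte 24, followed by
    the file contents."""
    fn_length = blob[20] * 2  # length in bytes, not characters
    name = blob[24:24 + fn_length].decode("utf-16-le").rstrip("\0")
    contents = blob[24 + fn_length:]
    return name, contents

# Build a synthetic blob with the same layout to check the logic.
name = "report.docx\0"                      # NUL-terminated filename
header = bytes(20) + bytes([len(name), 0, 0, 0])
blob = header + name.encode("utf-16-le") + b"file-bytes"
print(decode_attachment(blob))  # ('report.docx', b'file-bytes')
```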


K2 [blackpearl] - the solution to JIT Debugger error and the mystery "Error Receiving Buffer"

It would seem the chaps at K2 know about the problem, and it’s a little featurette (if such a word exists).

So if you are as unlucky as me and stumble upon this issue, there are three things to check:

  1. Your project might contain a corrupt SmartObject in which case you will have to identify and recreate it…
  2. The (kprx) Visual Studio project file might contain a closing tag on a separate line, in which case you will have to open the *.kprx file in Notepad (or another text editor) and fix the DataField nodes, where this oddity is most likely to occur. The most common cause is entering a description for an object in the project that contains a trailing carriage return
  3. The Windows Network Load Balancing (NLB) service is incorrectly installed or configured, and reinstalling this component may fix the issue.

So… which one of these fixed it for me? Well, Figure 1 illustrates it nicely. When you make the alteration, you may find that Visual Studio manages to put back the offending carriage return; if this happens, delete the item that has the description on it, and recreate it without a trailing carriage return, and all should be well.

Figure 1: Trailing Carriage Return leading to Serialization issues when deploying to the K2 Server
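If you’d rather not eyeball the XML in Notepad, a quick script can flag the offending nodes. A Python sketch - the element names here are illustrative, so check your own .kprx for the actual DataField structure:

```python
import xml.etree.ElementTree as ET

def find_trailing_returns(kprx_xml, tag="Description"):
    """Return the text of any elements with the given tag whose
    text ends in a carriage return or newline."""
    root = ET.fromstring(kprx_xml)
    return [el.text for el in root.iter(tag)
            if el.text and el.text != el.text.rstrip("\r\n")]

sample = """<DataFields>
  <DataField><Description>Fine</Description></DataField>
  <DataField><Description>Broken&#13;&#10;</Description></DataField>
</DataFields>"""
print(find_trailing_returns(sample))  # ['Broken\r\n']
```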

I hope this helps some of you guys out, as it’s a tricky one.


Starting K2 [blackpearl] Workflow using a SharePoint new item added event

I’ve got a workflow that is initiated when a document is added to a library. It seems to work fine when a “new” document is created in the library, but fails when a document is “uploaded”.

My assumption (and it could be a wrong assumption) is that, because the error relates to missing metadata (Figure 3) that I use later in the workflow (metadata set up by the user when a document is created or uploaded to the library), the metadata is handled differently by K2 or SharePoint depending on whether an upload or a new-item-added event has taken place.

As there is no particular event to catch when a document is uploaded, how should I be handling this situation?

I’ve selected “Item Added” as this seems to work with new documents (as shown in Figure 2), and it does kick off the workflow when a document is uploaded.

What should I select to cope with uploaded documents?

Figure 1: The Workflow and SharePoint Event integration

Figure 2: Inside the SharePoint Event integration wizard

Figure 3: The Error message in a later workflow item requiring document meta data (the destination user)

Again, thanks for any advice you can provide me :-)


See the K2 Underground post on this issue.

K2 [blackpearl] JIT Debugger Error

Some background on this would probably help.

I’ve been working on this workflow for a few weeks now, and have reached a limitation of either SharePoint, K2 or myself, in that I couldn’t think of a nice way to convert a username (e.g. SDSRESTROOM\MGALLEN) to an email address.

I can’t use Active Directory for this, as I’m dealing with SharePoint users, so the dev team here came up with some simple .NET code, running as a K2 server event, that does a crude mapping of usernames to email addresses - this has its limitations, though, as we’ve not been able to get it to work with SharePoint groups yet. I don’t think the code is the problem, as it builds okay, and it works as a .NET class library outside the K2 development environment.
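For illustration, a crude mapping along the same lines might look like this in Python - the domain-to-suffix table is entirely hypothetical and is not the dev team’s actual code:

```python
def username_to_email(account, domain_suffixes):
    """Crudely map a 'DOMAIN\\user' account name to an email
    address using a per-domain suffix lookup."""
    domain, _, user = account.partition("\\")
    return user.lower() + "@" + domain_suffixes[domain.upper()]

# Hypothetical suffix table - not from the original post.
suffixes = {"SDSRESTROOM": "example.com"}
print(username_to_email(r"SDSRESTROOM\MGALLEN", suffixes))
# mgallen@example.com
```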

Figure 1: Application Event Log

Figure 1 illustrates the error I’m getting when I try to deploy my workflow: an unhandled exception occurred in K2HostServer.exe (System.Runtime.Serialization.SerializationException).

The interesting bit of the error is that it informs me that the “Debugger could not be started because no user is logged on”

Figure 2: The Workflow event that I added to cause chaos! 

It’s a Server Event wizard, with some C# code behind it that accesses the SharePoint site using the Microsoft.SharePoint assembly - if you want to do this yourself, you need to click on the project assembly icon in the K2 Designer and add the assembly to your project - you can’t do it as if you were developing traditional SharePoint web parts or C# class libraries.

It also hooks into some Data Fields that I’ve created through the K2 Object Browser.

Figure 3: The Code and the error at the bottom of the picture

“Error Receiving Buffer” is the error I get, which lead me to the event viewer output in Figure 1.

Any pearls (no pun intended) of wisdom for me on this?

Many thanks,


K2 [blackpearl] and the missing SharePoint site

My company uses K2 [blackpearl] for producing business logic in the form of Workflows.

K2 [blackpearl], for those not in the know, is a product that sits on top of MOSS 2007, SQL Server 2005 and Windows Workflow Foundation. It gives developers and power users the ability to construct workflows using Microsoft Office Visio 2007, Visual Studio and SharePoint Designer.

I’ve already posted this exact topic on K2 Underground - the unofficial support site for all the folk using K2 [blackpearl] and K2.net (the earlier equivalent of K2 [blackpearl]).


The screenshot above shows my exact problem. The K2 Object Browser in Visual Studio 2005 is supposed to show all sites registered on my portal (MOSS). It, however, doesn’t seem to pick up my CMMi site. It’s a mystery to me… I’ve tried refreshing everything you could think of… follow my progress on this on K2 Underground.


Stop Press!

It appears that the use of the word ‘Workflow’ in a site name makes it invisible on the K2 Object Browser - the K2 Dev Team are on the case :-)
