Archive for the 'SharePoint' Category

InfoPath forms with file attachments: one potential solution, for all InfoPath forms?

InfoPath is a pain in the backside to me and my colleagues. Our users knock up a variety of forms and then ask us to support them when they go wrong, or don’t do what they need. Just to add a little more to that, we only have InfoPath 2003 on the client machines, which means we are working with legacy forms. Currently, our production systems run SharePoint 2007.

We had one such problem arrive on my desk a few weeks back. A form had been put together by another department with around 20 (yes, twenty!) file attachment controls on it. Each instance of this form would be a ‘case’, held under version control in its destination form library. I was shown some examples, and we had InfoPath forms reaching upwards of 100 MB in size. We weren’t all that surprised that the more remote locations my company occupies had issues reading these forms over our aging network infrastructure. They also have thousands of cases, so we would be looking at version controlling large files, and lots of them. We needed to avoid this, just to ensure we are being sensible with our backups!

How to fix this? Well, you could add code to the InfoPath form template that uploads attached documents to a SharePoint document library. An excellent approach is described here: How to upload a document to SharePoint from an InfoPath form. There are numerous similar write-ups dotted around the internet… but what if you want to let the novice users get on with it, without having to train them to hack code behind the templates?

I think there may be a solution to this, but believe me, it makes your eyes water when you look at how we have achieved it. It does look like an abomination… but it works, so I thought I’d blog about it and get some opinion from others who may have this issue in their organisation.

Why this solution is different:

  1. No code is required behind the InfoPath template
  2. Works with InfoPath 2003/2007 (untested with 2010)
  3. Works with any InfoPath template with only minor modifications required (as explained in this article)
  4. Uses the HTTP submission method
  5. Destination of file attachments is dynamic; users can alter it using standard InfoPath rules
  6. It requires a custom web handler (deployed to the GAC, packaged as a solution in the SharePoint farm)
  7. Removes attached files and replaces them with a URL file, to allow the user to click on it and open the file from the SharePoint document library
  8. I’ve only tried this in SharePoint 2007, but it should work with SharePoint 2010.
  9. This is a deploy and forget solution – no costly maintenance overhead, just some very simple guidance.

Known limitations:

  1. Only works with HTTP submit, with the built-in submit button (if you know how to remove this limitation, let me know!)
  2. Doesn’t permit folders inside the Form Library [it will always submit to the root folder]
  3. Users are told that they are likely to be opening a virus when they click on a URL link inside InfoPath – annoying!

Before I begin, acknowledgements must go to:

  • My fat blog – validation of allowed file extensions
  • Adis Jugo – validation of filenames to ensure SharePoint compatibility
  • Microsoft KB892730 – decoding, and encoding InfoPath attachments
  • Cyanwerks – creating URL files using the URL file format
  • … the man with the plan [who came up with this hare-brained scheme in the first place, because it wasn’t me!]

The rest of the code is purely draft, non-production level stuff.

How it works:

  1. The InfoPath form is submitted to a web handler [I have coined it the InfoPath file extractor (IFE)] (ashx) that runs in the SharePoint application pool with full trust
  2. The InfoPath form uses query string arguments to tell the handler where to look for the required fields
  3. The InfoPath form has a field group and a number of fields to configure the extraction process
  4. The web handler traverses the submitted InfoPath form looking for base64 encoded attachments
  5. Each attachment is decoded and uploaded to a SharePoint document library (with folder creation) based upon the configuration set in InfoPath (see the sketch below this list)
  6. Each attachment is replaced with a URL file linking to the uploaded file
  7. The InfoPath form (with attachments replaced with URL files) is then persisted in a form library
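
To make steps 4–6 a little more concrete, here is a rough sketch of the decode-and-replace logic. The class and member names below are mine rather than the ones in the downloadable IFE source; the attachment layout follows Microsoft KB892730 and the shortcut format follows the Cyanwerks notes acknowledged above.

using System;
using System.IO;
using System.Text;

// Sketch only: decodes a base64 InfoPath file attachment (layout per KB892730):
// a 16-byte fixed header, a 4-byte file size, a 4-byte file name length (in characters),
// the UTF-16 file name (null terminated), then the raw file bytes.
public class InfoPathAttachment
{
    public string FileName { get; private set; }
    public byte[] Contents { get; private set; }

    public static InfoPathAttachment Decode(string base64Value)
    {
        byte[] data = Convert.FromBase64String(base64Value);
        using (BinaryReader reader = new BinaryReader(new MemoryStream(data)))
        {
            reader.ReadBytes(16);                          // skip the fixed header
            int fileSize = (int)reader.ReadUInt32();       // size of the attached file
            int nameLength = (int)reader.ReadUInt32() * 2; // file name length in bytes (UTF-16)

            byte[] nameBytes = reader.ReadBytes(nameLength);
            string name = Encoding.Unicode.GetString(nameBytes, 0, nameLength - 2); // drop null terminator

            return new InfoPathAttachment { FileName = name, Contents = reader.ReadBytes(fileSize) };
        }
    }

    // Builds the replacement .url file contents (Cyanwerks URL file format); this small
    // text file goes back into the form in place of the original attachment.
    public static byte[] BuildUrlFile(string documentUrl)
    {
        return Encoding.ASCII.GetBytes("[InternetShortcut]\r\nURL=" + documentUrl + "\r\n");
    }
}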

Running Example [code available to download at the end of this article]:

First, use stsadm to deploy the wsp file (Visual Studio generates this when you build the project – see the post-build events; by default it uses c:\wsp as the destination), as shown in Figure 1.

stsadm -o addsolution -filename c:\wsp\ife.wsp

Figure 1 – STSADM successful deployment

Add the solution to your web farm using SharePoint Central Administration. When you open Solution Management, it will be shown as not deployed. See Figure 2.

Figure 2 – Solution not yet deployed

Click on ife.wsp, and then deploy it to all content web applications (or perhaps you wish to be selective?)

Figure 3 – Deploy the solution

The web handler has now been deployed to your web farm. The code is now in the GAC, and the handler is accessible from the 12 hive path TEMPLATE\LAYOUTS\SC\DEMO\SERVICES\IFE.ashx for all web sites.

An example URL would be: http://moss/_LAYOUTS/SC/DEMO/SERVICES/IFE.ashx

Create a demo SharePoint site. Create it with one document library and one form library. I used a blank site as the template for this example and did the rest by hand.

My example demo site looks like:

Figure 4 – Demo site

I need to reference these URLs later in this working example:

The site URL is:  http://moss/ifetest
Form library URL is: http://moss/ifetest/FormLibrary
Document library URL is: http://moss/ifetest/ExtractedFiles

Now create an InfoPath form…

I used the Asset Tracking example form (as supplied with InfoPath 2007) and slightly modified it to include a file attachment control in the repeating group.

Figure 5 – Updated the Asset Tracking example form to include a file attachment

Now modify the submit options for the form by selecting submit options from the tools menu (as shown in Figure 6)

Figure 6 – Updating submit options

The IFE handler requires two query string arguments:

  • RootGroup – the root group in the form. For the Asset Tracking example, this is ‘assetTracking’
  • IFEGroup – the root group for the IFE configuration fields. In this worked example, I am going to call this root field ‘IFE’.

This means the HTTP submission will look like: http://moss/_LAYOUTS/SC/DEMO/SERVICES/IFE.ashx?RootGroup=assetTracking&IFEGroup=IFE
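
For the curious, the receiving end of that submission is nothing exotic. The snippet below is a hypothetical skeleton rather than the actual IFE source (grab that at the end of the article); it simply shows where the two query string arguments and the submitted form XML arrive in an ashx handler.

using System.Web;
using System.Xml;

// Hypothetical skeleton of an .ashx handler like IFE; the real code differs.
public class IfeHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // e.g. IFE.ashx?RootGroup=assetTracking&IFEGroup=IFE
        string rootGroup = context.Request.QueryString["RootGroup"];
        string ifeGroup = context.Request.QueryString["IFEGroup"];

        // On an HTTP submit, InfoPath posts the whole form XML as the request body.
        XmlDocument form = new XmlDocument();
        form.Load(context.Request.InputStream);

        // From here: read the configuration fields under ifeGroup, walk rootGroup
        // looking for base64 attachments, extract and upload them, replace them with
        // .url files, then save the modified form into the form library.
    }
}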

Now we need to set up the configuration fields in the InfoPath form. Create a new group, as shown in Figure 7. We call this IFE, as that is how we named it in the query string.

Figure 7 – Creating the IFE group of fields for configuring the web handler

In this group we now need to set up a number of fields. These fields configure the handler; the values shown here are only set to match my server, so adjust them for your environment.

  • SiteURL – SharePoint site URL
  • FormLibraryURL – InfoPath form library URL
  • DocumentLibraryURL – SharePoint document library URL
  • DocumentLibraryFolder – the folder inside the document library where extracted files will go (I’ve concatenated two fields together from the Asset Tracking form)
  • InfoPathSaveAsFileName – the name that the form will be saved as (I’ve concatenated two fields together, and added ‘.xml’ to the end for it to be a valid InfoPath filename)
  • OverwriteFiles – whether the handler should overwrite existing files if they have the same name. This is a boolean string {“True” or “False”}.

You then set each value via the field’s ‘default value’ in the InfoPath form. Fields such as DocumentLibraryFolder and InfoPathSaveAsFileName should be dynamically driven by content on the InfoPath form, as shown above.

You should now have an InfoPath data source like this:
Figure 8 – The Asset Tracking form, with the configuration fields in place

You can now demo this setup by previewing the form.

Figure 9 – Preview the form

Enter a name and department (since these are now required to produce the folder and the InfoPath save filename). Add an asset with a file attachment, and hit the submit button, as shown in Figure 10.

Figure 10 – Form filled in

When submit is hit, you may get a security warning. Hit ‘Yes’ to continue if you do.

Figure 11 – Ignore the security warning

When the form is submitted, you should see a dialog like the one in Figure 12.

Figure 12 – Successful submission dialog

If you select the ‘Show Details’ button, you will see feedback messages from the web handler. As shown in Figure 13.

Figure 13 – Feedback window

Now let’s see if it worked!

The form library should have updated with the new form, and the document library should have a new folder, and an extracted file!

Figure 14 – Form library with our new form!

Figure 15 – Document library with new folder

Figure 16 – Extracted document now exists in the folder

In order for completed forms to be opened and updated from the form library, we now need to publish the form template to the Form Library in the standard way.

Figure 17 – Publish the form to the Form Library

We can now click on the form we created earlier to reveal the attachments have been replaced with URL files.

Figure 18 – Click on the form in the form library

Figure 19 – Document has been replaced with a link to the document in the Document Library!!!

The web handler handles updates to this form by ignoring the already converted attachments, meaning we have created a self-managing solution that stops the InfoPath forms from becoming an unmanageable size – and we don’t even have to train the users to write code behind their forms!
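
A simple way to detect an already converted attachment is to look at the decoded filename – for example (a hypothetical helper, shown here only to illustrate the idea; it pairs with the decoder sketch earlier in this article):

using System;

public static class AttachmentFilter
{
    // Hypothetical helper: true if a decoded attachment's file name suggests it is
    // one of the .url shortcuts created by a previous submission, so it can be skipped.
    public static bool IsAlreadyConverted(string decodedFileName)
    {
        return decodedFileName.EndsWith(".url", StringComparison.OrdinalIgnoreCase);
    }
}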

Let’s update the form with a few more attached files.

Figure 20 – Adding more assets to the asset tracking form means more opportunities to add file attachments

And the Document Library now looks like:

Figure 21 – More extracted files

OK, so you now want to give it a go for yourself:

Download the source code: here.

If you do use this in your production environment, or make improvements/modifications to it, I’d like to hear from you! Also, please credit my involvement if it has saved you time, money or effort :-)

I make it available free of charge, with no warranty provided or implied. If you can’t work out how to build/deploy it, please ask on the various forums around the internet – I am not a SharePoint complete beginner support service :-D

Enjoy.

SC



Massive SharePoint config database log file

It’s a common problem that many of us have. You set up your SharePoint farm and it’s been running sweet as a nut, until you get asked one difficult question by the server team… why is the SharePoint configuration database log file so large?

SharePoint 2007 creates its databases in FULL RECOVERY mode. This is presumably because the SharePoint central administration pages were going to offer a more comprehensive recovery option; however, I’ve only ever seen a full database restore. Even though the built-in GUI doesn’t make use of the FULL RECOVERY model, I don’t think we should discount it.

There are two (nice) ways of coping with the log growth issue:

  1. Use your own backup strategy [NB this means you can’t use the built-in restore option anymore], e.g. a scheduled SQL job to back up the databases and logs yourself. When you do this, you have the option to use WITH TRUNCATE_ONLY, or…
  2. Use the built-in SharePoint backup/restore service from Central Admin, and schedule log truncations at defined intervals (see the snippet below, shamelessly adapted from the Microsoft DBCC SHRINKFILE documentation)

-- Example: truncating the config db log file down to 70 MB
USE [SharePoint_Config]
GO
BACKUP LOG [SharePoint_Config] TO DISK = 'drive:\path\Yourbackupfile.bak'
GO
BACKUP LOG [SharePoint_Config] WITH TRUNCATE_ONLY
GO
-- Note: the 70 is the target log file size in MB
DBCC SHRINKFILE (N'SharePoint_Config_log', 70)
GO

There is an excellent article on Death By Patterns website about log file management. This gives you an overview of why logs grow, and how they work in practice.

It should be noted that in more than a few places on the net, people recommend changing the database recovery model from FULL to SIMPLE in order to perform a DBCC SHRINKFILE. Whilst this is a simple and effective solution, it takes away a crucial capability administrators have on a live SharePoint farm: if something goes wrong during the day, before a backup takes place, you still have the option of an in-place transaction rollback.

See Sherin George’s blog for an excellent overview of the recovery models available in SQL Server.

See Server Fault discussion on SharePoint recovery models.

SC



IT in 2010… is it going to be as we predicted it?

With the new year almost upon us, I’ve been trying to theorise what skills will be required of developers in the next four years.

It is important to take stock of what you’ve got at the end of each year to ensure you offer training and support to the development team to help ensure they are productive with new technologies and tools.

As a developer myself, do I need to worry about keeping my job… ?

Ellen Fanning from Computer World, back in 2006, predicted outsourcing and the need to be business savvy was a major threat to IT workers.

John Newton from the CIO weblog (also from his own blog), back in 2007, predicted that content management would be improved and delivered in more human friendly ways. Business computing would shift to Blackberry type devices. User Interface design would be improved and take ideas from the gaming market.

There are a number of existing/emerging technologies that will impact my organisation in the near future (… I’m well aware that we are behind the curve on most of these, but give us a chance, and we’ll try to keep up :-)). They are:

  1. SharePoint 2010
  2. .NET 4.0
  3. ASP.NET Model View Controller 2
  4. jQuery
  5. Silverlight
  6. Windows Presentation Foundation

I’m sure there are many others.

We’ll also be trying to maximise productivity with our existing tools, such as the K2 BlackPearl / BlackPoint workflow suites.

It has been a difficult time for many IT workers: when a business looks at what it can cut out of the budget, it usually means laying off staff or reducing investment in their IT systems. Hopefully we can take heart from an article by Judi Hasson, Fierce CIO, who writes that IT is the key to recession recovery. Let’s hope so!

Merry Christmas and a happy new year to all :-D

SC



Delivery of an important project, thanks to some clever shortcuts

This blog has been quiet for the last couple of weeks because I’ve been leading a development team in the final stages of delivery of a pretty large asset/stock tracking system.

It’s a long story as to why we decided to do it all ‘in house’ rather than buy a COTS product. Let’s just say we’d already tried buying off the shelf, and that’s the reason we started doing our own.

The budget was very challenging, and the development team grew quickly in the space of a few weeks when we realised that the estimated effort was going to take more than four full-time developers.

Communication

With a large development team in disparate locations around the UK, it is important to get the communication channels right. My organisation had rolled out a text-based communication tool similar to Live Messenger (Office Communicator). This proved the first clever shortcut: it allowed us to collaborate in real time, using virtual whiteboards.

WAN-based software configuration management. This was very important: it meant we could all see the same page, and nothing was stuck on Fred’s laptop when he went on leave :-)

Hudson – continuous integration testing ensured that we never dirtied our source code, so we didn’t hear any of the usual ‘why does it not build when I check it out?’ crying from the team. I really liked the fact you could schedule builds and push the successful build number to SharePoint using its RSS feeds. This little gem saved our bacon many a time… although until the developers got used to not treating our source code control system as a backup for their work in progress, the automatic emails telling everyone (including the PM) that the build had failed were a little annoying ;-)… I loved it, as there was no excuse for checking in ‘work in progress’ :-D

SharePoint 2007 for project documentation. For much the same reason why we had WAN based software configuration management. SharePoint helps anyone on the internal intranet see what we are up to … and the project manager can stick the GANTT and meeting minutes somewhere too :-)

Process

Those of you who have read my blog before will remember my ramblings about which software development methodology you should choose. We chose OpenUP, the open source version of the Rational Unified Process. I was a little sceptical about its adoption in the embryonic stages of the project, but it certainly helped us stay on course. It also ensured we followed a process that industry knows about. The iterative nature of these kinds of processes ensures that higher management get to see what us technical types are up to, which keeps the pressure off, because at the end of each iteration there is a defined output and test strategy.

Technology

We had to keep it all web based so it could work over the intranet… and we were limited to IE6, because my organisation hates upgrading and is very risk averse. I can hear you all groaning… but what about all the security patches etc. etc.… yeah, I know… I’ve sat across the table from the infrastructure guys and tried to explain that, but we are only developing for our internal intranet… what could go wrong with that ;-)… basically it is out of my pay bracket to argue with the people that make that kind of decision.

As my department is well versed in C# and ASP.NET (3.5) this was our chosen technology base… we paired this with SQL Server, as we are also well versed in that.

During the design phase it was glaringly obvious that the standard AJAX toolkit combined with the standard ASP.NET controls was not going to cut the mustard with the requirements that had been elicited… and we needed rapid application development, and slick results.

We turned to Telerik’s ASP.NET AJAX control library and it saved us a small fortune. I’m sceptical about most third party products… and I was very sceptical about using it in such a ‘flagship’ project, but my fears were unfounded. You can sell a product with these controls in it, you get all the source code for them, and you can modify that source code providing you let Telerik know what you did. They also provide an excellent forum for dealing with problems. If you use a search engine and look up ‘RadGrid’, the Telerik equivalent of a GridView or DataGrid, you still get hundreds of hits :-)

We used the windows style dialog manager (RadWindow and RadAlert), menu control (RadMenu), grid view (RadGrid), combo box (RadComboBox), numeric only text boxes with extras (RadNumericTextBox) and many of their date pickers to name a few. Not only did we use them, we also used them alongside existing AJAX and standard .NET controls with no issue. I think if we did not buy this toolset, we would have spent hundreds of hours doing what they have already done for us, and I bet ours wouldn’t have been as slick, or as tested as theirs. The development team and I weren’t sad that they’d taken the raw control building away from us (and I thought it would incite a riot by the coffee machine!). One of the most pleasing aspects about it is that the customer is happy that we have delivered a sophisticated interface that is user friendly. The window manager allows users to resize, drag and minimise functionality on our web pages much the same as they do on their windows desktop machines.

Our next cost saving came with the ReportViewer control. We’d been bitten badly in previous projects when using Reporting Services. Is it just me, or does it not feel like a finished product? Problems we had in the past were cross domain access / the classic double hop problem (impersonation) and unstructured website navigation (that one was probably more our fault). The ReportViewer control gives us a little more freedom: we can construct dynamic object data sources in our business layer and create reports on the fly, on the actual page, rather than firing off our requests to the Reporting Services engine running on another site collection in IIS. This eliminates authentication issues and allows you to put the generated report anywhere on your page. Our users like it a lot :-)
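
For anyone who hasn’t tried it, the local processing mode pattern looks roughly like the sketch below. The page, report path, dataset name and business object are made up for illustration; ReportViewer1 would normally be declared by the <rsweb:ReportViewer> control in the .aspx markup.

using System;
using System.Collections.Generic;
using Microsoft.Reporting.WebForms;

// Illustrative business object; the real project uses its own types.
public class AssetSummary
{
    public string Name { get; set; }
    public int Quantity { get; set; }
}

public partial class AssetReportPage : System.Web.UI.Page
{
    // Normally generated from the <rsweb:ReportViewer> control in the markup.
    protected ReportViewer ReportViewer1;

    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsPostBack) return;

        // In the real app this list comes from the business layer.
        List<AssetSummary> data = new List<AssetSummary>
        {
            new AssetSummary { Name = "Laptop", Quantity = 12 },
            new AssetSummary { Name = "Dock",   Quantity = 30 }
        };

        // Local processing mode: the .rdlc is rendered in-page, with no Reporting
        // Services site collection or double-hop authentication involved.
        ReportViewer1.ProcessingMode = ProcessingMode.Local;
        ReportViewer1.LocalReport.ReportPath = Server.MapPath("~/Reports/AssetSummary.rdlc");
        ReportViewer1.LocalReport.DataSources.Clear();
        // The name must match the dataset name defined inside the .rdlc.
        ReportViewer1.LocalReport.DataSources.Add(new ReportDataSource("AssetSummaryDataSet", data));
        ReportViewer1.LocalReport.Refresh();
    }
}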

SQL Server Integration Services (SSIS) is looking like it will save us money once we’ve worked out how to use it properly ;-). The COTS product that we’re replacing has a database backend that we can connect to using SSIS, and the customer is providing us data in Excel. This is a useful tool to help shape and cleanse the data on the way into the new system.

Final thoughts

User training is now underway, user acceptance testing is around the corner, and I think it will be a big success for the team.

SpittingCAML



SharePoint 2010 – get a sneak peek!

SC



SharePoint 2010 – 64bit support only

The preliminary system requirements have been released on the SharePoint Team Blog: here

The key points are:

  • SharePoint Server 2010 will be 64-bit only.
  • SharePoint Server 2010 will require 64-bit Windows Server 2008 or 64-bit Windows Server 2008 R2.
  • SharePoint Server 2010 will require 64-bit SQL Server 2008 or 64-bit SQL Server 2005.

So, what can you do today to get into the best shape for SharePoint Server 2010?

  1. Start by ensuring new hardware is 64-bit.  Deploying 64-bit is our current best practice recommendation for SharePoint 2007.
  2. Deploy Service Pack 2 and take a good look at the SharePoint 2010 Upgrade Checker that’s shipped as part of the update.  The Upgrade Checker will scan your SharePoint Server 2007 deployment for many issues that could affect a future upgrade to SharePoint 2010.
  3. Get to know Windows Server 2008 with SharePoint 2007, this post is a great starting point.
  4. Consider your desktop browser strategy if you have a large population of Internet Explorer 6 users.
  5. Continue to follow the Best Practices guidance for SharePoint Server 2007.
  6. Keep an eye on this blog for updates and more details in the coming months.

It might be an expensive migration for my organisation, as our server real estate is getting a little old now and I’m unsure whether it would support 64-bit Windows. Something to get an early grasp of!

SpittingCAML



InfoPath and SharePoint versus ASP.NET and a Traditional Database versus ASP.NET using SharePoint as a database technology

I was recently asked by a colleague

“I’ve got to build a new application to support x (an anonymous set of requirements that I cannot divulge here!), I’ve not got long to do it, and my developer resources are thin on the ground. I’ve heard you talk about SharePoint and InfoPath, and need to call on your experience, do you think I could develop my application using those two technologies? It requires a complex interface layer and needs to be able to provide neat looking reports.”

Okay, I said, I’ll give you my experiences in the form of some potential solutions with their pros and cons. I realise that by posting this I’m likely to anger the gods and provoke some really large debate… but that was my plan all along :-)

 

So your decision is basically between three development strategies/options:

  1. InfoPath and SharePoint 2007 (MOSS)
  2. ASP.NET and MOSS
  3. ASP.NET and SQL Server 2005

This means the first step is to consider the requirements for the interface layer (IL)… ask yourself: will the user want to do anything fancy on the front end? e.g. sorting data grids, combo boxes, interfacing with external systems. If the answer is yes, then you’ll probably want to consider an ASP.NET front end.

If the user really only requires a simple form, then InfoPath is a good choice for the IL… but to make the waters even more murky, you’ll need to consider the storage/reporting requirements, as InfoPath on its own will only offer XML based storage: on disk, in email, or in a SharePoint forms library. ASP.NET forms are more flexible and can enable you to store the data in a SharePoint list, a database or, if you really wanted, an XML file.

InfoPath pros and cons
Pros

  • Forms can be produced by pretty much anyone with no training
  • Simple to build prototypes (quick and cheap)
  • Easy for users to use and understand
  • Allows offline editing (by saving the form to local hard drive)
  • Doesn’t need to be designed in detail before development can be started

Cons

  • Which version of InfoPath does your corporate desktop/laptop build support? InfoPath 2003 is getting a little tired now (this means it’s old, won’t support newer controls, and will limit the ‘code behind’ that you can produce)
  • InfoPath does not allow you to build flexible, custom interfaces
  • Can’t reuse rules from other forms without having to recreate them
  • Rules are difficult to navigate/debug
  • Difficult to migrate (without reworking the forms)
  • If used in conjunction with a SharePoint form library, the coupling is very tight, so if you move or rename the site you might have to alter the form

ASP.NET pros and cons
Pros

  • Can do whatever you like (within reason) as you have access to .NET 3.5. [this includes things like sending email etc.]
  • Can produce flexible interfaces
  • Easy to debug using Visual Studio
  • Can reuse code and layouts using classes and master pages
  • Can interface with SharePoint, SQL Server, Oracle, XML and lots of other ODBC compliant technologies

Cons

  • Requires that the developers have ASP.NET training
  • Prototypes take longer to build than in InfoPath
  • Does not allow offline use, without extensive development of a side by side offline system
  • Users may require training if something is ‘specialised’
  • You need to design the pages (if you want a sensible solution)

You can also have a read of my blog: http://blog.mgallen.com/?p=206, where I’ve linked to Jason Apergis’ blog who explains the pros and cons in a workflow context, but he decides that InfoPath is better for his organisation.

Now you can compare traditional databases and SharePoint

SharePoint pros and cons
Pros

  • Easy to build sites and site collections (quick and cheapish)
  • Has a plethora of web parts that can be dragged and dropped by novice users to create dynamic content
  • Links well with InfoPath
  • List items can be produced via MOSS API and Web Services from other technologies such as ASP.NET
  • Sites can be generated through the MOSS API
  • Does rudimentary version control (albeit not in the best possible way… perhaps this isn’t a pro after all :-))
  • Can create production level sites/storage facilities without a detailed design

Cons

  • It should not be used like a traditional database (… and can’t really be used like one either, as it can’t do joins between lists – see the sketch after this list)
  • Difficult to report from MOSS lists and libraries; although you can use Reporting Services to query lists, it is generally more difficult compared to SQL queries
  • Uses lots of hard drive space (the MOSS database grows quite large)
  • It is not straightforward to migrate from a dev environment to a live environment
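
To illustrate the point in the pros list about producing list items via the MOSS API, and the note above about joins, here is a minimal sketch of working with a list from server-side code. The site URL, list name and field values are made up; the thing to notice is that you add to and query one list at a time with CAML, rather than joining tables as you would in SQL Server.

using Microsoft.SharePoint;

public static class MossListDemo
{
    public static void AddAndQuery()
    {
        // Made-up site and list names, for illustration only
        using (SPSite site = new SPSite("http://moss/ifetest"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList assets = web.Lists["Assets"];

            // Add an item through the object model
            SPListItem item = assets.Items.Add();
            item["Title"] = "Laptop 042";
            item.Update();

            // Query with CAML - this targets a single list; there is no JOIN
            // across lists the way there would be in SQL Server
            SPQuery query = new SPQuery();
            query.Query = "<Where><Eq><FieldRef Name='Title' />" +
                          "<Value Type='Text'>Laptop 042</Value></Eq></Where>";
            SPListItemCollection results = assets.GetItems(query);
        }
    }
}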

Traditional Database (e.g. SQL Server 2005)
Pros

  • Very flexible
  • Can use proper joins, sorts
  • Links very well with Reporting Services to produce powerful outputs
  • Links very well with ASP.NET and other .NET technologies

Cons

  • Requires a detailed design (or not… but don’t do that to yourself!)
  • Can’t be used directly with InfoPath
  • Requires a production and dev server in an ideal world

Okay, so if you read between the lines… I think you should go for options 2 or 3… preferably 3.

The perception is that it’s quick and cheap to use InfoPath and SharePoint… and that perception is right for 90% of the work. You’ll find that once you’ve done that 90%, the last 10% will take you an absolute age, and will probably consist of workarounds, squirming out of meeting requirements and swearing at the computer.

The decision is yours, so be pragmatic, and assess the requirements in front of you, and ask difficult questions to try to ascertain whether any potential requirements creep puts you in the ASP.NET frame or the InfoPath frame. If reporting is a major player, I would urge you to think about using SQL Server and Reporting Services.

I hope this has helped you a little bit anyway, good luck :-)

SpittingCAML



K2 [BlackPoint] - A simple meeting agenda review process in InfoPath… that can even store the completed InfoPath form

Well, I say simple… it is simple to me, as I’ve knocked it up using the Meeting Agenda template (free with InfoPath 2007) with the slight addition of a task action field, and I’ve then knocked together a simple approval and rework workflow.

Figure 1: The finished workflow (I’m not going to teach you guys how to suck eggs by going through the creation of it… unless you’d like me to… another time perhaps?)

Figure 1 shows the finished article… I’m not suggesting that this is a perfect example of an approval and rework workflow… it is simply here to demonstrate a point.

Figure 2: Destination Form Library for the creation of the InfoPath form that starts the workflow

I’m using a standard MOSS 2007 form library to store my InfoPath forms as shown in Figure 2. When ‘New’ is clicked, you see Figure 3.

Figure 3: The InfoPath form created when ‘New’ is clicked in Form Library (FormLibrary001)

Figure 4: The K2 Worklist (which just happens to be in the Process Portal (K2 Management)) showing the new task for destination user ‘Administrator’

As you can see in Figure 4, a task is added to the administrator user task list.

Figure 5: Clicking on Open will take the user back into InfoPath, or the Actions option will allow the user to action the task bypassing InfoPath – this might be useful if you want people to bulk action things. In this example I should really disallow this option. You can do this as part of the InfoPath client event outcomes.

Figure 6: Shows the opening of the InfoPath form from the K2 BlackPoint runtime services… I’ve configured this on port 8080 (just so you know). This gives me a clue that unlike K2 BlackPearl, InfoPath forms only exist in K2 Server, and not in the form library… which confused me somewhat… or perhaps I’ve got the wrong end of the stick on that.

Figure 7: User is able to select an outcome from the task action field as you’d expect

Figure 8: Shows the final resting place of the InfoPath form – this is because of the SharePoint document event I placed in the final activity. I assumed the form would be saved by default, but it doesn’t seem to work that way… but it certainly did in K2 BlackPearl.

So how did I get the InfoPath form to be stored in the library at the end? Well, the next screenshots and explanations will tell you how! I must admit, I thought it would do this automatically, but both the Beta 2 and RC versions of K2 [BlackPoint] behave in this way…

As you may have noticed in Figure 1, I added a SharePoint document event to the end of my workflow. This is also shown in Figure 9.

Figure 9: Shows the SharePoint document event on the final activity that saves the InfoPath form. I’m using the Upload Document Event Action.

Figure 10: Shows that I’m going to be creating this uploaded file from a K2 Field. This is very important, as K2 BlackPoint is now being told that the file stream is not from a disk, but from the database

Figure 11: When selecting the K2 Field, be sure to select the Root node of your InfoPath form. The Root node will be the name of your Template (in most cases). My Template XSN was called MeetingAgenda001.xsn before it was integrated into the workflow.

Figure 12: I am building the Filename of the output file using the meetingTitle field from my InfoPath template

Figure 13: To ensure a unique filename, I utilise the Process Instance ID

Figure 14: Be sure to space out your dynamic fields; I’ve used a hyphen. Also be sure to add the extension to the end. For InfoPath, the extension required is ‘.xml’ (well, of course it is!)… You then end up with files named like those shown in Figure 8

Of course, you may not have to do all this stuff… I might have an incorrectly set-up version of K2 [BlackPoint], but thinking about it, it does make sense to me.

  1. InfoPath forms are ‘protected’ from unauthorised modification as they are stored in the database
  2. InfoPath forms are only stored in the library when they are in a complete state (because you decided when they would be stored!)

This is definitely an improvement on how BlackPearl handles them, as there they are editable (through other means… e.g. a text editor) when they are mid-workflow, which was a concern of the customer.

Anyways, I hope this helps others understand how to do this sort of thing. Post a comment if you want more explanation on anything, but of course, check out K2 Underground first!

SpittingCAML



Microsoft Support Lifecycle - .Net Framework 1.1, 2, 3 and SharePoint 2007 etc.

Just thought I’d post one last time before the break, as I want to be as far away from a computer as possible until the new year after today!

Microsoft Support Lifecycle policy provides consistent and predictable guidelines for product support availability at the time of product release. Microsoft will offer a minimum of 10 years support (5 years of Mainstream support and 5 years of Extended) for Business and Developer products.

For more detailed information on the policy or on lifecycle of a specific product, please go to the following web site: http://www.microsoft.com/lifecycle.

Figure 1: The typical Microsoft application/product lifecycle

Microsoft publishes a document every quarter, along with an Excel spreadsheet.

The objective is to highlight the next main support deadlines (end of support and change of support phase) affecting the major products along with the updated MSL.xls file.

Key points:

Action you should have already taken:

.Net Framework 1.1 RTM and SP1 were retired from mainstream support on October 14th, 2008; however, they have extended support until October 8th, 2013.

Immediate action required:

Office 2007 RTM, SharePoint Portal Server 2007 RTM, Project Server 2007 RTM, Visio 2007 RTM will NO longer be supported from January 13, 2009. It is recommended to upgrade to Service Pack 1 as soon as possible before this date. I am not sure on the situation with WSS 3.0, if you know the score can you post it as a comment on this post :-)

.Net Framework 2.0 RTM will NO longer be supported from January 13, 2009. It is recommended to upgrade to Service Pack 1 as soon as possible before this date.

.Net Framework 3.0 RTM will NO longer be supported from January 13, 2009. It is recommended to upgrade to Service Pack 1 as soon as possible before this date.

Short term action required:

Windows Server 2003 SP1 and Windows Server 2003 R2 RTM (based on SP1) will NO longer be supported from April 14, 2009. It is recommended to upgrade to Service Pack 2 as soon as possible before this date.

All this information is delivered via a subscription that you can sign up for! As you can see, some of this information is really good to know to plan your future architectures and development strategies.

You can subscribe: here

There is also an excellent blog: here

Once again, Merry Christmas and a Happy New Year to you all.

SpittingCAML



Official best practice and design patterns for SharePoint

Microsoft have recently released some material for SharePoint on their Patterns and Practices site:

http://www.microsoft.com/downloads/details.aspx?FamilyId=C3722DBA-6EE7-4E0E-82B5-FDAF3C5EC927&displaylang=en

SpittingCAML



