Archive for the 'ASP.NET' Category

Report Viewer Control fails to render correctly in IE8

One of our more widely used applications uses the Report Viewer Control to render SQL Server Reporting Services 2005 reports.

We are migrating to IE8 in the near future, so it has been up to the various development teams to ensure compatibility. It should have been oh so easy :-)

Figure 1 shows the Report Viewer Control rendering correctly in IE6; the report produced has twenty pages. Scrolling to the bottom of the control works as expected.

Figure 1 - Application renders correctly in IE6

Figure 2 shows the same page and content in IE8. All looks good until you scroll to the bottom of the report (shown in Figure 3).

Figure 2 - All looks okay in IE8… however…

Figure 3 - Scroll bar not rendered correctly (circled in red); the page number footer is missing (it is visible in Figure 1 when rendered in IE6)

There is a workaround for this particular issue. It involves the DOCTYPE markup that you may have in your master page or each individual page.

You can read more information about DOCTYPES at Holly Bergevin’s page: DOCTYPE at Community MX

Removing it completely from your page will make the browser work in 'quirks' mode. You'll find that enabling 'quirks' mode will ensure the report is rendered correctly. See MSDN Social.

If, like me, you've got your DOCTYPE in your master page, and you'd really like to keep your XHTML-compliant markup in place for the majority of your pages, it is a little more tricky. Either code the DOCTYPE in each page (removing it from your master page! yuck!)… or find another workaround.
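One rough idea, and this is only a sketch rather than something I have put into production against the Report Viewer issue: move the DOCTYPE out of the master page markup into a server-side Literal, then hide that Literal from the code-behind of the pages that host the Report Viewer Control, so only those pages drop into quirks mode. The control and page names below are hypothetical.

// Code-behind of a page hosting the Report Viewer Control (hypothetical names).
// Assumes the master page emits its DOCTYPE through a Literal declared at the
// very top of the master markup instead of a hard-coded declaration, e.g.:
//   <asp:Literal ID="DocTypeLiteral" runat="server"
//       Text="&lt;!DOCTYPE html PUBLIC ...&gt;" />
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public partial class ReportPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Suppress the DOCTYPE for this page only, forcing quirks mode here
        // while every other page keeps its XHTML-compliant markup.
        Literal docType = Master.FindControl("DocTypeLiteral") as Literal;
        if (docType != null)
        {
            docType.Visible = false;
        }
    }
}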

Beyond that, I searched in vain for a proper solution. If you can help, please comment on this article. It seems it is a known bug!

SC



IT in 2010… is it going to be as we predicted?

With the new year almost upon us, I've been trying to theorise about what skills will be required of developers over the next four years.

It is important to take stock at the end of each year, so you can offer the development team the training and support they need to stay productive with new technologies and tools.

As a developer myself, do I need to worry about keeping my job… ?

Ellen Fanning of Computerworld, back in 2006, predicted that outsourcing, and the need to be business savvy, were major threats to IT workers.

John Newton on the CIO weblog (and also on his own blog), back in 2007, predicted that content management would be improved and delivered in more human-friendly ways, that business computing would shift to BlackBerry-type devices, and that user interface design would improve and take ideas from the gaming market.

There are a number of existing and emerging technologies that will impact my organisation in the near future (… I'm well aware that we are behind the curve on most of these, but give us a chance, and we'll try to keep up :-)). They are:

  1. SharePoint 2010
  2. .NET 4.0
  3. ASP.NET Model View Controller 2
  4. jQuery
  5. Silverlight
  6. Windows Presentation Foundation

I’m sure there are many others.

We’ll also be trying to maximise productivity with our existing tools, such as the K2 BlackPearl / BlackPoint workflow suites.

It has been a difficult time for many IT workers; when a business looks at what it can cut from the budget, it usually means laying off staff or reducing investment in IT systems. Hopefully we can take heart from an article by Judi Hasson of Fierce CIO, who writes that IT is the key to recession recovery. Let's hope so!

Merry Christmas and a happy new year to all :-D

SC



Why should the FQDN make a difference when using Integrated Security authentication?

We run a few internal applications that are addressed using an FQDN:

http://site.domain.com/our_app

The applications are also available through the server name:

http://site/our_app

We’ve had a strange issue with one of our applications that requires ‘Integrated Security’ authentication.

A few of our users, who run IE6 (because that’s what they are forced to use) get prompted for credentials.

You'd assume that since IE knows who the user is, it would simply pass those credentials to the application and access would be granted.

I've done a little digging and it would seem we are not the only people with this issue.

It would seem that this is a browser issue, rather than an application-related one.

The best explanation award goes to Windows IT Pro:

It would seem the only short-term solution is to provide the fix to the user community before we can update group policy (if that is even possible!).

SC



Obfuscation for IP protection

Recently we’ve had lots of interest in an internally developed product from external organisations. Obviously if they’ve got enough interest to want to pay for it… why not sell it to them?

There are a few issues to consider when software has been developed without a clear goal of turning it into a saleable product:

  1. The software was never developed to be a product for general sale – it is likely to have missing requirements as it was developed for a single purpose (i.e. not generic enough)
  2. Licensing (of the software) was not a requirement during development – the developers did not know it would become one, and knowing could well have altered the design strategy
  3. Database objects were not created ‘WITH ENCRYPTION’ – a simple issue to fix, but it's a PITA!
  4. Web application not written with obfuscation in mind – is it possible to reverse engineer our DLLs/Web Services?

The software we need to protect is a web application (ASP.NET 3.5, C#) with a SQL Server 2005 back end.

I have some experience of obfuscating .NET going back to version 1.1, and it seems a lot of the issues from back in 2001/2 have now gone as the .NET platform has moved on and become more optimised. There's an interesting thread of discussion on Stack Overflow that might interest you. It discusses tools, reasons for doing it, and the potential pitfalls.

I'm not too worried about our IP going missing, as we will put a non-disclosure agreement in place, and the potential buyer would lose significant reputation and business from my organisation if they were to attempt to extract our precious source code.

Having looked at a few obfuscators, I'm tending towards Eazfuscator.NET. As this software package is under maintenance (internally) I didn't want to make wholesale changes to the solution/project, so it seemed the obvious choice: simply run it over your web application DLL (outside of Visual Studio), and all is well.
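One thing worth knowing is that most .NET obfuscators, Eazfuscator.NET included, honour the standard System.Reflection.ObfuscationAttribute, which is handy when parts of a web application (web service classes, types referenced from markup) must keep their public names. The sketch below is only an illustration; the service and repository classes are made-up names, not part of our product.

using System.Reflection;
using System.Web.Services;

// Keep the web service's public surface readable so ASMX clients and
// markup references continue to work after obfuscation.
[Obfuscation(Exclude = true, ApplyToMembers = true)]
public class StockTrackingService : WebService
{
    [WebMethod]
    public int GetStockLevel(string assetTag)
    {
        // Internal types called from here remain fair game for renaming.
        return new StockRepository().CountItems(assetTag);
    }
}

// This helper stays eligible for renaming and other transformations.
internal class StockRepository
{
    public int CountItems(string assetTag)
    {
        // Hypothetical placeholder for the real data access code.
        return 0;
    }
}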

In terms of licensing, we will need to think about a pricing model if orders do come through the door, and then come up with a suitable licensing strategy.

Our main headache is going to be “… well, it kinda does what we want it to do but …” type questions. Our internal processes are almost certainly going to be different to those organisations wanting to use the package. Dealing with this, alongside the maintenance of an existing system is going to be challenging.

SpittingCAML



Delivery of an important project, thanks to some clever shortcuts

This blog has been quiet for the last couple of weeks because I’ve been leading a development team in the final stages of delivery of a pretty large asset/stock tracking system.

It's a long story as to why we decided to do it all 'in house' rather than buy a COTS product. Let's just say we'd already tried buying off the shelf, and that's the reason why we started doing our own.

The budget was very challenging, and the development team grew large in the space of a few weeks when we realised that the estimated man-hours were going to need more than four full-time developers.

Communication

With a large development team in disparate locations around the UK, it is important to get the communication channels right. My organisation had rolled out a text-based communication tool similar to Live Messenger (Office Communicator). This proved the first clever shortcut: it allowed us to collaborate in real time, using virtual whiteboards.

WAN-based software configuration management. This was very important: it meant we were all on the same page, and nothing was stuck on Fred's laptop when he went on leave :-)

Hudson – continuous integration testing. This ensured that we never dirtied our source code, so we didn't hear any of the usual 'why does it not build when I check it out?' crying from the team. I really liked the fact that you could schedule builds and push the successful build number out to SharePoint using its RSS feeds. This little gem saved our bacon many a time… although until the developers got used to not treating our source code control system as a backup for their work in progress, the automatic emails telling everyone (including the PM) that the build had failed were a little annoying ;-)… I loved it, as there was no excuse for checking in 'work in progress' :-D

SharePoint 2007 for project documentation. For much the same reasons as the WAN-based software configuration management: SharePoint lets anyone on the internal intranet see what we are up to… and the project manager can stick the Gantt chart and meeting minutes somewhere too :-)

Process

Those of you who have read my blog before will remember my ramblings about which software development methodology you should choose. We chose OpenUP, which is the open source version of the Rational Unified Process. I was a little sceptical about its adoption in the embryonic stages of the project, but it certainly helped us stay on course. It also ensured we followed a process that industry knows about. The iterative nature of these kinds of processes ensures that higher management get to see what us technical types are up to, which keeps the pressure off, because at the end of each iteration there is a defined output and testing strategy.

Technology

We had to keep it all web based, so it could work over the intranet… and we were limited to IE6 because my organisation hates upgrading and is very risk averse. I can hear you all groaning… but what about all the security patches etc. etc.… yeah, I know… I've sat across the table from the infrastructure guys and tried to explain that, but we are only developing for our internal intranet… what could go wrong with that ;-) … basically it is out of my pay bracket to argue with the people that make that kind of decision.

As my department is well versed in C# and ASP.NET (3.5) this was our chosen technology base… we paired this with SQL Server, as we are also well versed in that.

During the design phase it was glaringly obvious that the standard AJAX toolkit combined with the standard ASP.NET controls was not going to cut the mustard with the requirements that had been elicited… and we needed rapid application development, and slick results.

We turned to Telerik's ASP.NET AJAX control library and it saved us a small fortune. I'm sceptical about most third-party products… and I was very sceptical about using it in such a 'flagship' project, however my fears were unfounded. It seems you can sell a product with these controls in it, and you get all the source code for them… and you can modify that source code provided you let Telerik know what you did. They also provide an excellent forum for dealing with problems. If you use a search engine and look up 'RadGrid', the Telerik equivalent of a GridView or DataGrid, you get hundreds of hits :-)

We used the windows-style dialog manager (RadWindow and RadAlert), menu control (RadMenu), grid view (RadGrid), combo box (RadComboBox), numeric-only text boxes with extras (RadNumericTextBox) and many of their date pickers, to name a few. Not only did we use them, we also used them alongside existing AJAX and standard .NET controls with no issue. I think if we had not bought this toolset, we would have spent hundreds of hours doing what they have already done for us, and I bet ours wouldn't have been as slick, or as well tested, as theirs. The development team and I weren't sad that they'd taken the raw control building away from us (and I thought it would incite a riot by the coffee machine!). One of the most pleasing aspects is that the customer is happy that we have delivered a sophisticated interface that is user friendly. The window manager allows users to resize, drag and minimise functionality on our web pages much the same as they do on their Windows desktop machines.
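For what it's worth, the pattern we leaned on most with the grid was advanced data binding through the NeedDataSource event, which lets the grid rebind itself correctly for sorting, paging and grouping. This is only a sketch: the Asset and AssetRepository types are invented for illustration, and only the RadGrid event itself comes from the Telerik library.

using System;
using System.Collections.Generic;
using Telerik.Web.UI;

public partial class AssetListPage : System.Web.UI.Page
{
    // RadGrid1 is declared in the page markup with
    // OnNeedDataSource="RadGrid1_NeedDataSource".
    protected void RadGrid1_NeedDataSource(object sender, GridNeedDataSourceEventArgs e)
    {
        // The grid raises this whenever it needs data (initial load, sorting,
        // paging, grouping), so the binding logic lives in one place.
        RadGrid1.DataSource = AssetRepository.GetAssetsForCurrentUser();
    }
}

// Hypothetical business objects used above.
public class Asset
{
    public string Tag { get; set; }
    public string Location { get; set; }
}

public static class AssetRepository
{
    public static List<Asset> GetAssetsForCurrentUser()
    {
        return new List<Asset>
        {
            new Asset { Tag = "A-001", Location = "London" },
            new Asset { Tag = "A-002", Location = "Leeds" }
        };
    }
}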

Our next cost saving came with the ReportViewer control. We'd been bitten badly in previous projects when using Reporting Services. Is it just me, or does it not feel like a finished product? Problems we had in the past were cross-domain access / the classic double hop problem (impersonation) and unstructured website navigation (that one was probably more our fault). The ReportViewer control gives us a little more freedom: we can construct dynamic object data sources in our business layer and create reports on the fly, on the actual page, rather than firing off requests to the Reporting Services engine running on another site collection in IIS. This eliminates the authentication issues and allows you to put the generated report anywhere on your page. Our users like it a lot :-)
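In case it helps anyone, the pattern is roughly the one below: run the ReportViewer in local processing mode and feed it an object data source built in the business layer. This is only a sketch; the .rdlc path, the dataset name and the StockReportBuilder/StockLevel types are invented for illustration, and ReportViewer1 is assumed to be declared in the page markup.

using System;
using System.Collections.Generic;
using Microsoft.Reporting.WebForms;

public partial class StockReportPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Process the report on this page rather than calling out to a
            // Reporting Services instance hosted elsewhere in IIS.
            ReportViewer1.ProcessingMode = ProcessingMode.Local;
            ReportViewer1.LocalReport.ReportPath =
                Server.MapPath("~/Reports/StockLevels.rdlc");

            // Build the data in the business layer and hand it over as an
            // object data source; the name must match the dataset in the RDLC.
            List<StockLevel> data = StockReportBuilder.GetStockLevels();
            ReportViewer1.LocalReport.DataSources.Clear();
            ReportViewer1.LocalReport.DataSources.Add(
                new ReportDataSource("StockDataSet", data));

            ReportViewer1.LocalReport.Refresh();
        }
    }
}

// Hypothetical business layer types used above.
public class StockLevel
{
    public string AssetTag { get; set; }
    public int Quantity { get; set; }
}

public static class StockReportBuilder
{
    public static List<StockLevel> GetStockLevels()
    {
        return new List<StockLevel>
        {
            new StockLevel { AssetTag = "A-001", Quantity = 12 },
            new StockLevel { AssetTag = "A-002", Quantity = 3 }
        };
    }
}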

SQL Server Integration Services (SSIS) is looking like it will save us money once we've worked out how to use it properly ;-). The COTS product that we're replacing has a database back end that we can connect to using SSIS, and the customer is providing us with data in Excel. This is a useful tool to help shape and cleanse the data on its way into the new system.

Final thoughts

User training is now underway, user acceptance testing is around the corner, and I think it will be a big success for the team.

SpittingCAML



Hudson – Continuous Integration Testing

For a while now we’ve been planning on making use of Hudson to help us maintain working builds in our source code configuration system.

What Hudson does is build your software on a predefined schedule (like a SQL Server Agent job or a Windows scheduled task), or when requested by a user, and produce a dashboard showing the status of your software builds. It is highly configurable, and can link to many source code control systems to get the latest version of your code.

Even though this product was/is/seems to be (I’m not sure on this) aimed at the Java developers of this world, it can build pretty much any software package.

We intend to make use of it for building .NET applications by getting it to call MSBuild via DOS batch files (in my example); however, I will point out that you can also get Hudson to call MSBuild through an MSBuild plug-in, which is not covered in this post.

I want to use another plug-in, the 'Text Finder' plug-in, to parse the output of the batch file for StyleCop errors and build failures.

Please have a read of this: Meet Hudson, so you can get a proper introduction to it.

To be perfectly honest, I'm not a Java fan (as my colleagues will tell you ;-))… and I thought it was a little overkill to have to install a Java runtime, Apache Tomcat and Hudson just to perform automatic builds… however, it has been relatively painless… and as my organisation is not exactly keen to splash the cash at the moment, the fact that everything is free helps…

Those of us who are risk averse, like myself, have to put aside the fact that it's all open source… and the only support out there is through volunteers :-)… there's no metaphorical stick to hit someone with if the product doesn't 'just work' out of the open source box.

What I downloaded

  1. Java Runtime Environment (jre-6u14-windows-i586)
  2. Tomcat 5.5 (apache-tomcat-5.5.27)
  3. Hudson (hudson.war v1.314)
  4. Text Finder plug in (text-finder.hpi v1.7)

NB: If you are using IE8/7 to download hudson.war and text-finder.hpi you may find that the extension is changed to .zip. You will need to rename the files back to their original extension to get the installation to work correctly.

Setting up Hudson on Windows 2003/2008 Server alongside IIS

Obviously, since Apache Tomcat and Hudson are Java-based applications, if you don't have a Java Runtime Environment (JRE) you'll need to install that first. It is relatively painless: simply double-click the EXE installer and Bob's your uncle.

I had my doubts about getting Tomcat 5.5 working alongside IIS, but to my surprise (I haven't touched Tomcat since my university days back in 2001) the installer worked well, and defaulted to using port 8080, which is nice, since I don't want to be getting in the way of IIS on port 80. The installer *should* also detect whether you've got a JRE installed and set up the link to it automatically. If it doesn't, simply point the installer at the folder where you installed the JRE. You also need to set up an administrator account to access the Tomcat Manager.

The Tomcat application server should be alive (possibly following a reboot in certain circumstances) after the install, and you should be able to navigate to its front page.

Figure 1: The Tomcat front page

Figure 2: Entering the Tomcat Manager

Open the Tomcat manager to instigate the Hudson installation. It will prompt you for the username and password you set up during the Tomcat installation before you can access the page.

Scroll down to the Deploy section, in the ‘WAR file to deploy’ section click browse, and select the hudson.war file we downloaded earlier.

Figure 3: Deploying the Web Archive (WAR) for Hudson

Click the deploy button and the application should be deployed. If you put the WAR file in a part of the file system that gives a long file path, e.g. ‘C:\longfilepath\longfilepath\longfilepath\longfilepath\longfile\hudson.war’ you may have issues with the deployment. I certainly encountered this issue last week. The error message you will get isn’t the most useful, so it’s worth moving it to the root of a drive to see if that solves it.

To confirm successful deployment, look at the application list

Figure 4: The application list, showing Hudson

If you click on the hyperlink ‘/hudson’ it should take you to the front page of the Hudson application.

Figure 5: The Hudson 'dashboard'

You are now ready to go… as you might have noticed, I've already created a job, 'Test 001'. This is the build I'll use to explain things in the rest of this post.

As I’m using the ‘Text Finder’ plug-in, you’ll now need to install that if you want to follow my example.

Figure 6: Managing Hudson, and adding a plug-in

Click 'Manage Hudson', then 'Manage Plugins'. Click the 'Advanced' tab and scroll to the bottom of that page so you see the following:

Figure 7: Uploading a plug-in

Now click the upload button and, when it has finished, restart the Tomcat service. If you don't perform a restart, the plug-in won't be shown as installed.

Figure 8: List of installed plug-ins

Once installed, you should see it in the list of installed plug-ins.

We can now go about creating the job that will build the .NET application.

I’ve got a really simple .NET 3.5 website application (it does nothing other than to display default.aspx) that I’m using for this post.

Figure 9: Visual Studio 2008 web application, with the working folder on the E: drive and the batch file in the root of the web application folder

The batch file that Hudson will call is very simple, and I suspect it could be done better; however, here it is if you want to make use of it:

echo Change directory to the Visual Studio 2008 common tools folder
cd /d %VS90COMNTOOLS%
cd ..\..
cd VC

echo Set up the Visual Studio build environment variables
call vcvarsall.bat

echo Build Test001.csproj (looked for in the directory containing this batch file)
call msbuild %~dp0Test001.csproj

Navigate to the Hudson 'dashboard'/front page and click 'New Job'.

Provide Hudson with a name for your job, and select ‘Build a free-style software project’.

Figure 10: Free-style software project selection

Figure 11: Adding build steps

Leave everything else as standard for now, click 'Add build step' and select the 'Execute Windows batch command' option.

Enter the path to the batch file (as shown in Figure 12)

Figure 12: Entering the batch file details into the build step

The next step is to configure the 'Text Finder' plug-in to look for the token 'FAIL', since MSBuild produces messages containing the word 'FAIL' when a build does not succeed.

Figure 13: Configuring Hudson to look for the token 'FAIL' in the console output

Click the Save button, and your job has been created!

Navigate back to the Hudson dashboard, and click the ‘build’ icon next to Job ‘Test 001’ (as shown in Figure 14)

Figure 14: Instigate a build

If the build was successful, when you refresh the page, you should see this:

Figure 15: The sunny picture indicates a very stable build

To demonstrate how Hudson picks up on failed builds, I'm now going to rename the code-behind file for default.aspx from 'default.aspx.cs' to 'breakbuild.aspx.cs'.

Figure 16: Deliberately breaking the build

Using Hudson, run the job again.

Figure 17: The cloudy picture indicates a failure has occurred

The job has failed; the more the job fails, the worse the weather gets :-)

Run it a few more times to get more bleak weather (unless you like thunderstorms).

Figure 18: Thunderstorms indicate that the most recent builds have all failed

You can review the console output of all the builds that have taken place to help you diagnose failed builds.

Figure 19: Reviewing the console output of failed builds

As you can imagine, the Text Finder plug-in and the numerous others available make Hudson a very powerful tool.

I intend to set ours up so it will notify the development team when the latest version of a system checked into our source control system will not build, or contains StyleCop warnings.

SpittingCAML



InfoPath and SharePoint versus ASP.NET and a Traditional Database versus ASP.NET using SharePoint as a database technology

I was recently asked by a colleague

“I’ve got to build a new application to support x (an anonymous set of requirements that I cannot divulge here!), I’ve not got long to do it, and my developer resources are thin on the ground. I’ve heard you talk about SharePoint and InfoPath, and need to call on your experience, do you think I could develop my application using those two technologies? It requires a complex interface layer and needs to be able to provide neat looking reports.”

Okay, I said, I'll give you my experiences in the form of some potential solutions with their pros and cons. I realise that by posting this I'm likely to anger the gods and provoke some really large debate… but that was my plan all along :-)

 

So the decision is basically between three development strategies/options:

  1. InfoPath and SharePoint 2007 (MOSS)
  2. ASP.NET and MOSS
  3. ASP.NET and SQL Server 2005

This means the first step is to consider the requirements for the interface layer (IL)… ask yourself: will the user want to do anything fancy on the front end? e.g. sortable data grids, combo boxes, interfacing with external systems. If the answer is yes, then you'll probably want to consider an ASP.NET front end.

If the user really only requires a simple form, then InfoPath is a good choice for the IL… but to make the waters even more murky, you'll need to consider the storage/reporting requirements, as InfoPath on its own only offers XML-based storage, either on disk, in email or in a SharePoint forms library. ASP.NET forms are more flexible and enable you to store the data in a SharePoint list, a database or, if you really wanted, an XML file.
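To illustrate the ASP.NET-to-SharePoint option, here is roughly what saving a submitted form into a SharePoint list via the MOSS object model looks like. It is only a sketch: the site URL, list name and field names are made up, and it assumes the web application runs on the MOSS farm so that Microsoft.SharePoint.dll is available (otherwise you would go via the SharePoint web services instead).

using Microsoft.SharePoint;

public static class FormStore
{
    public static void SaveToSharePointList(string title, string comments)
    {
        // Hypothetical site URL and list name.
        using (SPSite site = new SPSite("http://site.domain.com/forms"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists["Submitted Forms"];
            SPListItem item = list.Items.Add();

            item["Title"] = title;
            item["Comments"] = comments;

            // Persist the new item back to the MOSS content database.
            item.Update();
        }
    }
}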

InfoPath pros and cons
Pros

  • Forms can be produced by pretty much anyone with no training
  • Simple to build prototypes (quick and cheap)
  • Easy for users to use and understand
  • Allows offline editing (by saving the form to local hard drive)
  • Doesn’t need to be designed in detail before development can be started

Cons

  • Which version of InfoPath does your corporate desktop/laptop build support? InfoPath 2003 is getting a little tired now (this means it's old, won't support newer controls, and will limit the 'code behind' that you can produce)
  • InfoPath does not allow you to build flexible, custom interfaces
  • Can’t reuse rules from other forms without having to recreate them
  • Rules are difficult to navigate/debug
  • Difficult to migrate (without reworking the forms)
  • If used in conjunction with a SharePoint form library, the coupling is very tight, so if you move or rename the site you might have to alter the form

ASP.NET pros and cons
Pros

  • Can do whatever you like (within reason) as you have access to .NET 3.5. [this includes things like sending email etc.]
  • Can produce flexible interfaces
  • Easy to debug using Visual Studio
  • Can reuse code and layouts using classes and master pages
  • Can interface with SharePoint, SQL Server, Oracle, XML and lots of other ODBC compliant technologies

Cons

  • Requires that the developers have ASP.NET training
  • Prototypes take longer to build than in InfoPath
  • Does not allow offline use, without extensive development of a side by side offline system
  • Users may require training if something is ’specialised’
  • You need to design the pages (if you want a sensible solution)

You can also have a read of my blog: http://blog.mgallen.com/?p=206, where I've linked to Jason Apergis' blog; he explains the pros and cons in a workflow context, and decides that InfoPath is better for his organisation.

Now you can compare traditional databases and SharePoint

SharePoint pros and cons
Pros

  • Easy to build sites and site collections (quick and cheapish)
  • Has a plethora of web parts that can be dragged and dropped by novice users to create dynamic content
  • Links well with InfoPath
  • List items can be produced via MOSS API and Web Services from other technologies such as ASP.NET
  • Sites can be generated through the MOSS API
  • Does rudimentary version control (albeit not in the best possible way… perhaps this isn’t a pro after all :-))
  • Can create production level sites/storage facilities without a detailed design

Cons

  • It should not be used like a traditional database (… and can’t really be used like one either as it can’t do joins between lists)
  • Difficult to report from MOSS lists and libraries; although you can use Reporting Services to query lists, it is generally more difficult than writing SQL queries
  • Uses lots of hard drive space (the MOSS database grows quite large)
  • It is not straightforward to migrate from a dev environment to a live environment

Traditional Database (e.g. SQL Server 2005)
Pros

  • Very flexible
  • Can use proper joins, sorts
  • Links very well with Reporting Services to produce powerful outputs
  • Links very well with ASP.NET and other .NET technologies

Cons

  • Requires a detailed design (or not… but don’t do that to yourself!)
  • Can’t be used directly with InfoPath
  • Requires a production and dev server in an ideal world

Okay, so if you read between the lines… I think you should go for options 2 or 3… preferably 3.

The perception is that it's quick and cheap to use InfoPath and SharePoint… and that perception is right 90% of the way. You'll find that once you've done that 90%, the last 10% will take you an absolute age, and will probably consist of workarounds, squirming out of meeting requirements and swearing at the computer.

The decision is yours, so be pragmatic, and assess the requirements in front of you, and ask difficult questions to try to ascertain whether any potential requirements creep puts you in the ASP.NET frame or the InfoPath frame. If reporting is a major player, I would urge you to think about using SQL Server and Reporting Services.

I hope this has helped you a little bit anyway, good luck :-)

SpittingCAML



Microsoft Support Lifecycle - .NET Framework 1.1, 2, 3 and SharePoint 2007 etc.

Just thought I'd post one last time before the break, as after today I want to be as far away from a computer as possible until the new year!

Microsoft Support Lifecycle policy provides consistent and predictable guidelines for product support availability at the time of product release. Microsoft will offer a minimum of 10 years support (5 years of Mainstream support and 5 years of Extended) for Business and Developer products.

For more detailed information on the policy or on lifecycle of a specific product, please go to the following web site: http://www.microsoft.com/lifecycle.

Figure 1: The typical Microsoft application/product lifecycle

Microsoft publishes a document every quarter, along with an Excel spreadsheet.

The objective is to highlight the next main support deadlines (end of support and change of support phase) affecting the major products along with the updated MSL.xls file.

Key points:

Action you should have already taken:

.NET Framework 1.1 RTM and SP1 officially left mainstream support on October 14th, 2008; however, extended support continues until October 8th, 2013.

Immediate action required:

Office 2007 RTM, SharePoint Portal Server 2007 RTM, Project Server 2007 RTM and Visio 2007 RTM will NO longer be supported from January 13, 2009. It is recommended to upgrade to Service Pack 1 as soon as possible before this date. I am not sure of the situation with WSS 3.0; if you know the score, can you post it as a comment on this post :-)

.NET Framework 2.0 RTM will NO longer be supported from January 13, 2009. It is recommended to upgrade to Service Pack 1 as soon as possible before this date.

.NET Framework 3.0 RTM will NO longer be supported from January 13, 2009. It is recommended to upgrade to Service Pack 1 as soon as possible before this date.

Short term action required:

Windows Server 2003 SP1 and Windows Server 2003 R2 RTM (based on SP1) will NO longer be supported from April 14, 2009. It is recommended to upgrade to Service Pack 2 as soon as possible before this date.

All this information is delivered via a subscription that you can sign up for! As you can see, some of this information is really useful for planning your future architectures and development strategies.

You can subscribe: here

There is also an excellent blog: here

Once again, Merry Christmas and a Happy New Year to you all.

SpittingCAML



Application Pool Manager Version 2 - Now Available

You can find the new version of Spencer Harbar’s tool: here

I blogged about the previous version in July when discussing the alternatives to IIS Reset.

You can still read my old post: here

SpittingCAML



Dead SQL Server 2005 box… could it be down to a cursor?

Today was an odd one… a colleague and I were doing some eXtreme Programming (XP)… i.e. we needed to solve an issue with one of our many ASP.NET applications quickly!

We were looking at a stored procedure that contained a cursor. We fiddled a bit and commented lots of it out to aid our debugging… In our excitement to solve the problem, we made the schoolboy error of not un-commenting the CLOSE and DEALLOCATE statements when we reinstated the cursor.

The web page linked to the stored procedure was loading in the browser… it took an age to appear… alarm bells were ringing! What had we done… was there a cursor stuck in an infinite loop?

I logged into SQL Server Management Studio to run a diagnostic query, very similar to the excellent one Glenn Berry has published. It checks for obvious issues such as:

  1. High CPU load operations
  2. Blocking queries
  3. Transaction log size
  4. Network and I/O issues

I was not surprised to see that the transaction log on tempdb was full. Not being a SQL expert in the slightest, I called the operations manager over for assistance. Remote Desktop to the machine was also non-functional… what had we done?! Could a messed-up cursor really cause all these issues?

Well, the investigations continue. If you experience similar issues, definitely look at Glenn Berry's SQL script and the following Microsoft support articles:

  1. Causes of Transaction log expansion
  2. Unexpected transaction log growth

Enjoy!

SpittingCAML



