Friday, 27 May 2011

Installing Windows 7 from USB Key

I’ve got a new laptop, a Toshiba Portege R700, which as usual comes with quite a lot of bundled software pre-installed.  The version of Windows 7 installed is also the 32-bit version.

I decided to put a clean install of Windows 7 x64 on there, which meant downloading all the device drivers from the Toshiba site ready to install after Windows.

To install Windows I followed this guide to installing from a USB key rather than DVD.  The Windows install I had was an ISO image anyway, so rather than burn it to disc just for the install, using the USB key seemed like a better option.  After all, optical drives aren’t up to much - they’re slow, noisy and unreliable (well, they have been for me anyway).  The USB key worked a treat - the install seemed quicker too, and the machine didn’t sound like it was about to take off into space.

System.Messaging.MessageQueueException: Insufficient resources to perform operation.

I came across this error today in one of our web service applications:-

System.Messaging.MessageQueueException: Insufficient resources to perform operation.
   at System.Messaging.MessageQueue.SendInternal(Object obj, MessageQueueTransaction internalTransaction, MessageQueueTransactionType transactionType)
   at System.Messaging.MessageQueue.Send(Object obj, String label, MessageQueueTransaction transaction, MessageQueueTransactionType transactionType)
   at System.Messaging.MessageQueue.Send(Object obj, String label)

The service causing the error is a data import service.  It collects requests received during the day into an MSMQ queue and then processes them overnight.

The queue of requests only had about 10 messages in it, so it was unlikely to be causing any resource issues.

However, when the data import service fails to process a message overnight, it sends it to a poison queue, and that queue had lots of messages in it.  At the moment the poison queue is left to fill up; we only use it when we’re fixing a specific problem with the import service.  Neither queue had a value set for “Limit message storage”.
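
For illustration, the import service’s receive-and-process loop follows a fairly standard pattern - something like this rough sketch, where the queue paths and message type are made up rather than taken from the real service:-

using System;
using System.Messaging;

public class ImportProcessor
{
    // hypothetical queue paths - the real service uses its own private queues
    private readonly MessageQueue requestQueue = new MessageQueue(@".\private$\importRequests");
    private readonly MessageQueue poisonQueue = new MessageQueue(@".\private$\importRequests_poison");

    public void ProcessPendingRequests()
    {
        requestQueue.Formatter = new XmlMessageFormatter(new Type[] { typeof(string) });

        while (true)
        {
            Message message;
            try
            {
                // wait briefly for the next request; stop when the queue is empty
                message = requestQueue.Receive(TimeSpan.FromSeconds(5));
            }
            catch (MessageQueueException ex)
            {
                if (ex.MessageQueueErrorCode == MessageQueueErrorCode.IOTimeout)
                    break;
                throw;
            }

            try
            {
                Import((string)message.Body);
            }
            catch (Exception)
            {
                // failed imports are parked on the poison queue for later investigation
                poisonQueue.Send(message, "Failed import");
            }
        }
    }

    private void Import(string request)
    {
        // the actual import work happens here
    }
}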

It turns out that there’s a machine setting for limiting the amount of disk space given to MSMQ for storing messages.  It defaults to 1GB, and sure enough the C:\WINDOWS\system32\msmq\storage folder had 1GB of data in it and no more.

To solve it I backed up the queues using mqbkup and then purged the poison queue.  I also upped the limit to 4GB so the problem occurs less often, but that’s not a real solution.  Really I’d like some monitoring to tell me when the storage is approaching full so that I can do something about it.

Having a dig around I found a performance counter called MSMQ Service\Total bytes in all queues.  I should be able to monitor that and raise an alert when it approaches the limit.
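
Reading that counter from code should be straightforward with System.Diagnostics - here’s a rough sketch of the kind of check I have in mind (the limit and the alert threshold are just placeholders):-

using System;
using System.Diagnostics;

public static class MsmqStorageCheck
{
    public static void CheckStorageUsage()
    {
        // the machine-wide MSMQ storage limit described above (placeholder value of 4GB)
        const long limitBytes = 4L * 1024 * 1024 * 1024;

        using (PerformanceCounter counter =
            new PerformanceCounter("MSMQ Service", "Total bytes in all queues"))
        {
            long usedBytes = (long)counter.NextValue();

            // raise an alert when usage gets close to the limit (80% is an arbitrary threshold)
            if (usedBytes > limitBytes * 0.8)
            {
                Console.WriteLine("MSMQ storage is at {0:N0} bytes of a {1:N0} byte limit - time to purge.",
                    usedBytes, limitBytes);
            }
        }
    }
}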

Thanks to Ayende’s post on this one.  There’s also a more complete list of possible causes in this post (mine is number 7).

Sunday, 2 August 2009

Solid State Hard Drive for my developer machine

The solid state memory that makes up USB flash drives has quite recently become a real option for a system hard drive. Solid state disks typically come in sizes of 30GB, 60GB, 120GB and 250GB, with the cost spiralling the larger you go.

I’m working with a number of large .NET solutions that I have to open, build and frequently chop and change between during the day. It gets quite frustrating waiting for the HD to spin up and load all those source files before I can do some work. Also, all the code is controlled using SVN (with TortoiseSVN as a client), which creates loads of small hidden files that are read a lot when moving around the code.

I read the blog posts by Scott Hanselman and Joel Spolsky, two well respected developers, on introducing SSDs for their development machines. Further reviews and benchmarks of SSD drives suggested they would be a promising option for saving time on disk access during the day. I hesitate to say an “affordable” option, because a 120GB disk costs around £260.

I found a 120GB OCZ Summit series drive on overclockers.co.uk for £260 and got one in. The Summit is in the “professional” range on the OCZ website, compared to the lower performing Agility series (which costs around £30 less).

The drive I’m upgrading is a 74GB Western Digital Raptor, a really good drive in its day. I’ve got another 250GB drive in my machine which has a partition with the MBR and Windows XP. The Raptor drive runs Vista Ultimate.

I installed the new drive onto a spare SATA channel and booted up Vista. I picked Acronis Migrate Easy 7.0 to copy the contents of the Raptor over to a new 74GB partition on the SSD. That would leave around 50GB free for my frequently used working files, like source code.

After a couple of reboots, Acronis started up and cloned the contents of the Vista drive over to the new SSD. After that I powered down and put the SSD drive on the channel where the Raptor used to live.

This caused me to get an error message when trying to boot into Vista, which was solved by popping in the Vista DVD and doing a repair. I figure it repaired something in the MBR that had gone slightly awry in the clone.

I rebooted and the system was up and running perfectly with the new drive in the old one’s place. My performance benchmarks are fairly rudimentary - timing a few long, tiresome disk-heavy jobs with a stopwatch.

Boot, login and get IE to load the home page:-
Before: 2m38s, After: 1m10s

Multiple .NET solution compilation:-
Before: 5m02s, After: 2m22s

Visual Studio load of a solution:-
Before: 1m46s, After: 58s

From the results you can see that disk intensive things took about half as much time. I think that’s a good return on the investment - it should save plenty of time over the coming weeks.

Wednesday, 10 September 2008

First play with NBehave

I've been reading about Behaviour Driven Development and the tools around it, namely NBehave.

Behaviour Driven Development (OK, I'm going to call it BDD from here on ...) looks to be a great way of making developers keep their focus on what their objectives are for the code they're about to write. Test Driven Development (TDD) was the start of that, but BDD takes it a step further and relates tests to the business value they're trying to achieve. I think every developer is guilty of jumping into code and forgetting where they came from or why they're there at some point.

I like code that reads close to English; the sentence style of NBehave makes the code easy to read and flow through, and gives it structure.

Dan North's article What's in a story? is a great introduction to the structure of user stories, and is worth a read even if BDD isn't something you're considering.

You can download NBehave from CodePlex and I'm using MbUnit as my testing framework, but I guess any other would work just as well.

Some Code...
Here's the source code for the quick example I wrote:-

The Test Code
Here's the testing code:-
    [Theme("Calculator"), TestFixture]
public class
Class1
{

[Story, Test]
public void Should_Add_Up_Positive_Numbers()
{
//create the instance being used in testing
Calc c = new Calc();

//create vars for test input
int a = 0;
int b = 0;

//create var that will have result filled
int result = 0;

//create the user story being tested
Story s = new Story("Should add up positive numbers");
s.AsA("Administrator")
.IWant("to have numbers add up")
.SoThat("I can get the total");

//setup scenario conditions and post conditions
s.WithScenario("Both numbers are positive")
.Given("First Number is positive", delegate { a = 5; })
.And("Second Number is positive", delegate { b = 6; })
.When("Numbers are added up", delegate { result = c.Add(a, b); })
.Then("Total is correct", delegate { Assert.AreEqual(11, result); });

}

}
The Production Code
Here's the production code to make the unit test pass:-
/// <summary>
/// Sample calculator class
/// </summary>
public class Calc
{
    /// <summary>
    /// Adds the specified a to b and returns the result.
    /// </summary>
    public int Add(int a, int b)
    {
        return a + b;
    }
}
Test Output
I like the way the test code reads just like a user story definition. It prints out in the test output, which could be useful for creating test reports:-
------ Test started: Assembly: NBehaveTesting.dll ------

Starting the MbUnit Test Execution
Exploring NBehaveTesting, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null
MbUnit 1.0.2700.29885 Addin
Found 1 tests
Story: Should add up positive numbers

Narrative:
As a Administrator
I want to have numbers add up
So that I can get the total

Scenario 1: Both numbers are positive
Given First Number is positive
And Second Number is positive
When Numbers are added up
Then Total is correct
[success] Class1.Should_Add_Up_Positive_Numbers
[reports] generating HTML report
TestResults: file:///C:/Users/peter.beams/AppData/Roaming/MbUnit/Reports/NBehaveTesting.Tests.html

1 passed, 0 failed, 0 skipped, took 2.60 seconds.

Summary
I hope this first example is useful to someone taking a look at NBehave. I'll be looking at BDD and NBehave in more detail soon, so more posts to follow. I'm sure there are some war stories of using NBehave in large projects, so I'm hunting for them!

Monday, 17 December 2007

File size limitation problem with XslCompiledTransform

I recently ran into a problem with a reporting feature on a product we're building that was very close to release. The reports were generated by an IHttpHandler in the ASP.NET 2.0 site that would transform an XML data document to RTF using an XSLT transform. We had around 20 reports that were working fine and generating without any problems, but one XSLT transform was causing major problems that even led to crashing the application pool in IIS 6.0 when it ran.
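
For context, the handler is along these lines - a simplified sketch rather than the real code, with made-up file names and paths:-

using System.Web;
using System.Xml.Xsl;

public class ReportHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // illustrative paths - the real handler picks the XML data and XSLT per report
        string dataPath = context.Server.MapPath("~/App_Data/ReportData.xml");
        string xsltPath = context.Server.MapPath("~/Reports/Report.xslt");

        XslCompiledTransform transform = new XslCompiledTransform();
        transform.Load(xsltPath);

        context.Response.ContentType = "application/rtf";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=report.rtf");

        // write the transformed RTF straight to the response
        transform.Transform(dataPath, null, context.Response.Output);
    }
}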

The handler that was supposed to be returning the report as an attachment in the response was displaying 'Page cannot be displayed', and the System event log had warnings logged whenever this happened:-

A process serving application pool 'DefaultAppPool' suffered a fatal communication error with the World Wide Web Publishing Service. The process id was '6244'. The data field contains the error number.
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
0000: 8007006d
We couldn't recreate the problem locally in the Visual Studio web development server; it only seemed to happen for this one report when running on Windows 2003 and IIS 6.0.

The XSLT engine being used was the standard XslCompiledTransform class in the .NET Framework; the previous implementation, XslTransform, had been marked as obsolete, so throughout development the team had been working with XslCompiledTransform. The only difference we could see between the broken report and the working ones (after a while of checking and double-checking other probable causes) was that the XSL file for that report was larger (~480KB).

The migration guide on MSDN makes no mention of a file size limitation when using XslCompiledTransform, but we were able to find this forum post that explains the issue we were having. The problem is due to the JIT compiler for the XSLT transformation running out of space for declaring locals during IL compilation. The suggestion is to split the XSL into smaller transforms, but we took an alternative route and switched back to using XslTransform in the code. The older class handles the large report transformation - it might be obsolete, but it works!
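
The switch itself is small - roughly this, with the compiler's obsolete warning suppressed (the file paths are made up for illustration):-

using System.Xml.Xsl;

public static class ReportTransformer
{
    public static void TransformLargeReport(string xmlPath, string xsltPath, string outputPath)
    {
#pragma warning disable 618 // XslTransform is marked obsolete, but it copes with the large stylesheet
        XslTransform transform = new XslTransform();
        transform.Load(xsltPath);
        transform.Transform(xmlPath, outputPath);
#pragma warning restore 618
    }
}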

Building reports this way isn't the most elegant solution, but we hit this problem so late in the development cycle that it was far too late to consider switching to a different method. It's a little disappointing that MS aren't more explicit about that limitation in the migration guide.

Thursday, 5 April 2007

Testing ASP.NET SOAP Web Services

I've been developing some ASP.NET SOAP services for exposing some application logic to an associated application being developed by a colleague using Flash and Flash Media Server.

To test the SOAP calls before deploying the services ready to be consumed by Flash, I needed to test the output from the services under certain conditions.

When running the ASP.NET site from Visual Studio 2005, you can invoke the SOAP call from the browser test page. This is fine for testing service operations that don't take any parameters, or that take primitive types (like strings and integers). If the operation takes a complex type, like an instance of a class you've written or a complex .NET type like System.Guid, then you can't invoke it this way.
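
For example, a web method along these lines (a made-up operation, not one of the real services) can't be exercised from the browser test page because of the Guid parameter:-

using System;
using System.Web.Services;

[WebService(Namespace = "http://example.com/demo/")]
public class OrderService : WebService
{
    // takes a System.Guid, so the ASP.NET browser test page can't build a request for it
    [WebMethod]
    public string GetOrderStatus(Guid orderId)
    {
        // placeholder implementation
        return "Status for order " + orderId.ToString();
    }
}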

You could write a consumer application for web methods that can't be executed from the browser, but that means there's more code to write - and I'd rather see the service working outside of .NET to assure me that it will do the job for the Flash and Flash Comms Server parts of the application.

soapUI is a free and open source Java-based SOAP tester that allows you to prod and poke a web service with relative ease. It's available as a standalone Java app, or as a plug-in to popular tools like Eclipse. I downloaded and installed the Java installer from the soapUI SourceForge project.

To test the service you just need to point soapUI to the WSDL description for the service you want to test. ASP.NET generates that document for you so the URL looks something like http://myServer/myService.asmx?wsdl. soapUI then uses the WSDL to give you a template for a SOAP request where you can fill in the data values you want to test with.

After that's done, hit the Submit option and the response from the service is displayed in the right hand pane. It's complete with the SOAP headers and the data that's been returned. If there's an exception thrown by the service call, then the exception will be displayed instead.

Later on I'll need to set up some unit tests and regression tests for the services, but soapUI is a great tool for doing the ad-hoc testing I need at the moment.

Wednesday, 28 March 2007

Wiki Log book

Last week I downloaded and set up MediaWiki on my local development machine. As the MediaWiki site will tell you, it runs using PHP extensions for IIS and is powered by a MySQL database - both tools that can run alongside my .NET development environment without causing conflicts. My original reason for downloading and installing it was to evaluate how hard it would be to deploy and adapt for one of our clients that might require a Wiki site later in the year. I have, however, found having a Wiki running locally to be a valuable tool while I develop.

One of the core challenges of software development has always been keeping useful and current documentation. It always seems to be one of the first parts of a project that is dropped when a deadline gets tight, if it was even included in the first place. With the Wiki running, I can keep it open as I write code and add to my high-level documentation as I go. Writing documentation as you go is bound to make it more accurate and useful to other developers who might have to pick up your work.

My advice... throw away all those scraps of paper on your desk, and even that notebook (if it's like mine it hasn't been touched for a few weeks anyway - who uses pens these days?)... and install MediaWiki today!