Tuesday, 22 February 2011

Improving “Boiler Plate” Data-Reader Code – Part 2

In Part 1 of this series we started with a basic data-reader / SQL connection/command pattern and showed how the parsing of the data reader can be abstracted into a standalone object that can be fully unit tested in isolation from the calling code. In part two of the series we highlight a very simple optimisation that can be made to the “DataReader” convertor, along with the updates required to the tests to capture and verify the change. In this revision the original “CustomerDRConvertor” has been updated to include extremely basic caching: for the lifetime of the object, only the first call needs to use the “GetOrdinal(…)” method to find the index of each desired column. Subsequent calls can then use this “cached” index to reference each column by position rather than by name.

namespace DataAccess.Example
{
    using System.Data;
    using System.Data.BoilerPlate;

    public class CustomerDRConvertorPart2 : IConvertDataReader<Customer>
    {
        private int idIndex = -1;
        private int firstNameIndex = -1;
        private int surnameIndex = -1;

        public Customer Parse(IDataReader dataReader)
        {
            if (idIndex == -1)
            {
                idIndex = dataReader.GetOrdinal("Id");
                firstNameIndex = dataReader.GetOrdinal("FirstName");
                surnameIndex = dataReader.GetOrdinal("Surname");
            }

            return new Customer
            {
                Id = dataReader.GetGuid(idIndex),
                FirstName = dataReader.GetString(firstNameIndex),
                Surname = dataReader.GetString(surnameIndex)
            };
        }
    }
}

In traditional ASP applications (back in the day) the above caching pattern used to result in reasonable performance gains. I’ve not measured the benefit within a modern-day .NET application, and in some instances it could be classed as premature optimisation. For the purposes of this example, though, it provides a perfect illustration of how abstracting the data-reader parsing from the connection/command code pays off: updated objects can be developed and tested in complete isolation from the existing code and then plugged into the code base with only minimal changes.
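
As a rough illustration (reusing the “Get()” method from part one of this series), swapping the optimised convertor in is a one-line change:

public IEnumerable<Customer> Get()
{
    using (var connection = new SqlConnection(this.config.ConnectionString))
    {
        using (var command = connection.CreateCommand())
        {
            command.CommandText = "SELECT id, Firstname, Surname FROM Customer";
            connection.Open();

            using (var dataReader = command.ExecuteReader())
            {
                // The only change from part one: the caching convertor replaces the original.
                var convertor = new CustomerDRConvertorPart2();
                while (dataReader.Read())
                {
                    yield return convertor.Parse(dataReader);
                }
            }
        }
    }
}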

This updated code can be verified using the unit test below. In the test the “Parse(…)” method is called once and the mocked objects are verified to confirm they were called correctly. The “Parse(…)” method is then called again and the mocks are verified a second time to make sure that the second call only resulted in additional calls to the “GetGuid(…)” and “GetString(…)” methods. Thanks to the very basic caching, the second call has no need to make any further “GetOrdinal(…)” calls, which the verification of the mocked objects confirms. The tests verify the expected behaviour, not the inner workings of any particular IDataReader implementation.

namespace DataAccess.Example.Tests
{
    using System;
    using System.Data;
    using Microsoft.VisualStudio.TestTools.UnitTesting;
    using NUnit.Framework;
    using Moq;
    using Assert = NUnit.Framework.Assert;

    [TestClass]
    public class CustomerDRConvertorPart2Tests
    {
        [TestMethod]
        public void CustomerDRConvertor_GoodCall()
        {
            var dataReader = new Mock<IDataReader>();

            dataReader.Setup(dr => dr.GetOrdinal("Id")).Returns(1);
            dataReader.Setup(dr => dr.GetOrdinal("FirstName")).Returns(2);
            dataReader.Setup(dr => dr.GetOrdinal("Surname")).Returns(3);

            var id = Guid.NewGuid();
            const string firstName = "John";
            const string surname = "Doe";

            dataReader.Setup(dr => dr.GetGuid(1)).Returns(id);
            dataReader.Setup(dr => dr.GetString(2)).Returns(firstName);
            dataReader.Setup(dr => dr.GetString(3)).Returns(surname);

            var convertor = new CustomerDRConvertorPart2();

            var customer = convertor.Parse(dataReader.Object);

            Assert.That(customer.Id, Is.EqualTo(id));
            Assert.That(customer.FirstName, Is.EqualTo(firstName));
            Assert.That(customer.Surname, Is.EqualTo(surname));

            dataReader.Verify(dr => dr.GetOrdinal(It.IsAny<string>()), Times.Exactly(3));
            dataReader.Verify(dr => dr.GetOrdinal("Id"), Times.Once());
            dataReader.Verify(dr => dr.GetOrdinal("FirstName"), Times.Once());
            dataReader.Verify(dr => dr.GetOrdinal("Surname"), Times.Once());

            dataReader.Verify(dr => dr.GetGuid(It.IsAny<int>()), Times.Once());
            dataReader.Verify(dr => dr.GetGuid(1), Times.Once());

            dataReader.Verify(dr => dr.GetString(It.IsAny<int>()), Times.Exactly(2));
            dataReader.Verify(dr => dr.GetString(2), Times.Once());
            dataReader.Verify(dr => dr.GetString(3), Times.Once());

            convertor.Parse(dataReader.Object);

            dataReader.Verify(dr => dr.GetOrdinal(It.IsAny<string>()), Times.Exactly(3));
            dataReader.Verify(dr => dr.GetOrdinal("Id"), Times.Once());
            dataReader.Verify(dr => dr.GetOrdinal("FirstName"), Times.Once());
            dataReader.Verify(dr => dr.GetOrdinal("Surname"), Times.Once());

            dataReader.Verify(dr => dr.GetGuid(It.IsAny<int>()), Times.Exactly(2));
            dataReader.Verify(dr => dr.GetGuid(1), Times.Exactly(2));

            dataReader.Verify(dr => dr.GetString(It.IsAny<int>()), Times.Exactly(4));
            dataReader.Verify(dr => dr.GetString(2), Times.Exactly(2));
            dataReader.Verify(dr => dr.GetString(3), Times.Exactly(2));
        }
    }
}

In part three of this series I will cover how the above code can be moved into an abstract base class for data access that all inheriting classes can utilise through interfaces and generics.

Part 3 builds the code developed in parts 1 and 2 into a usable solution.
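
To give a rough feel for that direction, such a base class might expose a protected method that runs a query and converts each row through a supplied convertor. The code below is only a sketch under assumed names (the real implementation is the subject of part three):

namespace System.Data.BoilerPlate
{
    using System.Collections.Generic;
    using System.Data.SqlClient;

    // Sketch only: the class and method names here are assumptions, not the final part-three code.
    public abstract class DataAccessBase<T>
    {
        private readonly string connectionString;

        protected DataAccessBase(string connectionString)
        {
            this.connectionString = connectionString;
        }

        // Runs the supplied query and converts each row using the supplied convertor.
        protected IEnumerable<T> ExecuteReader(string commandText, IConvertDataReader<T> convertor)
        {
            using (var connection = new SqlConnection(this.connectionString))
            using (var command = connection.CreateCommand())
            {
                command.CommandText = commandText;
                connection.Open();

                using (var dataReader = command.ExecuteReader())
                {
                    while (dataReader.Read())
                    {
                        yield return convertor.Parse(dataReader);
                    }
                }
            }
        }
    }
}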

Monday, 21 February 2011

Improving “Boiler Plate” Data-Reader Code – Part 1

Recently we’ve been looking at improving our unit test code coverage and reducing the amount of duplicated code around our bespoke data access layer. Where possible we have moved over to NHibernate, but certain parts of the data access must still be written using the standard ADO.NET connection/command pattern. Typically hidden right in the middle of this bespoke code is a while loop that pivots a data reader into a POCO, which is impossible to unit test repeatably in a stable environment unless you set up a dedicated data repository for testing or try to wrap up / mock the connection and command objects. Neither of these options is really desirable, as we aren’t actually interested in testing or mocking the .NET provider data access objects.

To get around this issue we looked into how we could generate some boiler-plate code that we could roll out across our code base. This code will be introduced step by step during this series, with the first step covering the abstraction of the data reader processing into a standalone object that can be tested in isolation from the data access code. The example code that we are looking to migrate is shown below: a typical unrestricted “Get()” call (in this example we don’t have many customers!).

public IEnumerable<Customer> Get()
{
    using (var connection = new SqlConnection(this.config.ConnectionString))
    {
        using (var command = connection.CreateCommand())
        {
            command.CommandText = "SELECT id, Firstname, Surname FROM Customer";
            connection.Open();

            using (var dataReader = command.ExecuteReader())
            {
                while (dataReader.Read())
                {
                    yield return new Customer
                    {
                        Id = dataReader.GetGuid(dataReader.GetOrdinal("Id")),
                        FirstName = dataReader.GetString(dataReader.GetOrdinal("FirstName")),
                        Surname = dataReader.GetString(dataReader.GetOrdinal("Surname"))
                    };
                }
            }
        }
    }
}

Hidden below (and behind) a concrete implementation of a SqlConnection and a SqlCommand is the code that we are interested in right now:

yield return new Customer
{
    Id = dataReader.GetGuid(dataReader.GetOrdinal("Id")),
    FirstName = dataReader.GetString(dataReader.GetOrdinal("FirstName")),
    Surname = dataReader.GetString(dataReader.GetOrdinal("Surname"))
};

As previously stated, unless we set up a test repository or find a way to wrap / mock these concrete instances, we are unable to test the creation of the customer POCO from the data reader. To rectify this we start with an interface defining how we would like the calling code to convert the data reader passed in:

namespace System.Data.BoilerPlate
{
    public interface IConvertDataReader<out T>
    {
        T Parse(IDataReader dataReader);
    }
}

Taking a very basic customer POCO object:

namespace DataAccess.Example
{
    using System;

    public class Customer
    {
        public Guid Id { get; set; }
        public string FirstName { get; set; }
        public string Surname { get; set; }
    }
}

An implementation of the IConvertDataReader interface can be created quite simply using the code below. The validation / error checking around each data reader value conversion could be as simple or as complex as you need (a more defensive variant is sketched after the basic implementation). Part two of this series will cover a basic optimisation that can be made to this code to potentially speed up access in repeated calls such as a “while (dataReader.Read())” loop.

namespace DataAccess.Example
{
    using System.Data;
    using System.Data.BoilerPlate;

    public class CustomerDRConvertor : IConvertDataReader<Customer>
    {
        public Customer Parse(IDataReader dataReader)
        {
            return new Customer
            {
                Id = dataReader.GetGuid(dataReader.GetOrdinal("Id")),
                FirstName = dataReader.GetString(dataReader.GetOrdinal("FirstName")),
                Surname = dataReader.GetString(dataReader.GetOrdinal("Surname"))
            };
        }
    }
}
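
If a column can legitimately contain NULLs, the same pattern stretches to cover it. The variant below is only a sketch to show the idea; the null-handling policy and the class name are assumptions for illustration, not part of the example schema:

namespace DataAccess.Example
{
    using System.Data;
    using System.Data.BoilerPlate;

    // Sketch of a more defensive convertor; the null handling shown is illustrative only.
    public class DefensiveCustomerDRConvertor : IConvertDataReader<Customer>
    {
        public Customer Parse(IDataReader dataReader)
        {
            var firstNameIndex = dataReader.GetOrdinal("FirstName");
            var surnameIndex = dataReader.GetOrdinal("Surname");

            return new Customer
            {
                Id = dataReader.GetGuid(dataReader.GetOrdinal("Id")),
                FirstName = dataReader.IsDBNull(firstNameIndex) ? string.Empty : dataReader.GetString(firstNameIndex),
                Surname = dataReader.IsDBNull(surnameIndex) ? string.Empty : dataReader.GetString(surnameIndex)
            };
        }
    }
}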

The “CustomerDRConvertor” can now be unit tested in isolation from the code that will use it, using a combination of NUnit and Moq to both assert the actual returned results and verify the expected behaviour. It’s probably worth highlighting at this stage that we are using a combination of MsTest and NUnit. Doing things this way brings the best of both worlds: because the tests reference MsTest, TFS runs them automatically as part of the CI build, and by adding an alias to NUnit’s Assert we get access to NUnit’s fluent API (which we prefer).

namespace DataAccess.Example.Tests
{
    using System;
    using System.Data;
    using Microsoft.VisualStudio.TestTools.UnitTesting;
    using NUnit.Framework;
    using Moq;
    using Assert = NUnit.Framework.Assert;

    [TestClass]
    public class CustomerDRConvertorTests
    {
        [TestMethod]
        public void CustomerDRConvertor_GoodCall()
        {
            var dataReader = new Mock<IDataReader>();
            dataReader.Setup(dr => dr.GetOrdinal("Id")).Returns(1);
            dataReader.Setup(dr => dr.GetOrdinal("FirstName")).Returns(2);
            dataReader.Setup(dr => dr.GetOrdinal("Surname")).Returns(3);

            var id = Guid.NewGuid();
            const string firstName = "John";
            const string surname = "Doe";

            dataReader.Setup(dr => dr.GetGuid(1)).Returns(id);
            dataReader.Setup(dr => dr.GetString(2)).Returns(firstName);
            dataReader.Setup(dr => dr.GetString(3)).Returns(surname);

            var convertor = new CustomerDRConvertor();
            var customer = convertor.Parse(dataReader.Object);

            Assert.That(customer.Id, Is.EqualTo(id));
            Assert.That(customer.FirstName, Is.EqualTo(firstName));
            Assert.That(customer.Surname, Is.EqualTo(surname));

            dataReader.Verify(dr => dr.GetOrdinal(It.IsAny<string>()), Times.Exactly(3));
            dataReader.Verify(dr => dr.GetOrdinal("Id"), Times.Once());
            dataReader.Verify(dr => dr.GetOrdinal("FirstName"), Times.Once());
            dataReader.Verify(dr => dr.GetOrdinal("Surname"), Times.Once());
            dataReader.Verify(dr => dr.GetGuid(It.IsAny<int>()), Times.Once());
            dataReader.Verify(dr => dr.GetGuid(1), Times.Once());

            dataReader.Verify(dr => dr.GetString(It.IsAny<int>()), Times.Exactly(2));
            dataReader.Verify(dr => dr.GetString(2), Times.Once());
            dataReader.Verify(dr => dr.GetString(3), Times.Once());
        }
    }
}

Obviously the above is a very basic test, but the approach scales to cover any additional or complex processing that may take place in an implementation of the Parse(IDataReader dataReader) method. In fact, in part two of this series we change the implementation and verify the new expected behaviour whilst confirming that the returned results have not changed.

Finally, to wrap up part one of this series, here is the original code updated to reflect the changes discussed above. We’ve not really changed a lot, but we are now able to unit test code that was previously difficult to reach.


public IEnumerable<Customer> Get()
{
    using (var connection = new SqlConnection(this.config.ConnectionString))
    {
        using (var command = connection.CreateCommand())
        {
            command.CommandText = "SELECT id, Firstname, Surname FROM Customer";
            connection.Open();

            using (var dataReader = command.ExecuteReader())
            {
                var convertor = new CustomerDRConvertor();
                while (dataReader.Read())
                {
                    yield return convertor.Parse(dataReader);
                }
            }
        }
    }
}

Part 2 of this series shows how the code covered in this post can easily be optimised to increase performance, with tests updated to reflect the change.

Saturday, 5 February 2011

MVC3.0 Installation Hangs

I've just installed MVC3.0 on a fresh PC using the new web installer application and was surprised at how long it seemed to be taking. Digging around a bit deeper, I remembered that in the options I'd selected to use IIS rather than IIS Express, and a quick look at the services panel showed that IIS was currently stopped. I started IIS and the MVC3.0 installation finished in seconds!

So if you're having problems installing MVC3.0 then just take a moment to check that IIS is started.
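
For reference, the state of IIS can also be checked and started from an elevated command prompt (assuming the default name of the World Wide Web Publishing Service):

sc query W3SVC
net start W3SVC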

Friday, 4 February 2011

VS2010: How to change a "Class Library" project into a "Test Project"

Whilst working with VS2010 projects it can be really frustrating if you accidentally create your unit testing projects as class libraries (or migrate an existing class library into a unit testing library), mainly because the context-sensitive "Add New" menu no longer contains the "New Test" option. This can be easily fixed by directly amending the project file, adding the following element to the main <PropertyGroup>:

<ProjectTypeGuids>{3AC096D0-A1C2-E12C-1390-A8335801FDAB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
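
In a typical C# test project the fragment ends up looking something like this (a sketch only; the other properties will vary from project to project):

<PropertyGroup>
  <!-- Existing properties such as OutputType and RootNamespace are left untouched. -->
  <ProjectTypeGuids>{3AC096D0-A1C2-E12C-1390-A8335801FDAB};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
</PropertyGroup>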

Reload the project in VS2010 and your class library becomes a testing library, with the "Add New Test" option(s) back again.