Monday, December 12, 2011

Metrics in Brownfield Applications

Brownfield applications are everywhere, and as we all know, in most cases their codebases are quite messy.

During the last few weeks, I have been reading a really nice book called "Brownfield Application Development in .Net" (by Kyle Baley and Donald Belcham). If you are currently working on a brownfield project and feel that you could improve your career by finding another job, I recommend reading this book first. Then, if after reading it you still can't improve the application, you can go and find a better job. As the authors say, "change the environment, or change the environment" :).

Working on a brownfield project means that you have a version of the application running in production, probably a large number of bugs to fix, some new features to add, and a lot of places where the code could be improved.

So now the question is: how can we convince our manager to let us spend some of our time improving the existing code instead of adding new features?

Certainly that is not an easy task, but here are some good reasons to do it:
  • Experienced developers don't want to touch messy codebases, so your manager will end up with a lot of junior developers and high staff turnover. 
  • Changing existing features or adding new ones is easier when the codebase is clean (and probably takes less time). 
  • When developers are proud of the codebase, they are more committed to its quality and don't let anyone mess it up. 
So, how can metrics help achieve this? Well, managers love numbers: if they see numbers, they can measure progress, show graphs to their bosses, and understand (or at least try to understand) how the application is improving.

Here is a list of metrics that can be useful in a brownfield application (for more details, see chapter 5 of the book):
  1. Code Coverage: the percentage of the code that is covered by tests. 
  2. Cyclomatic Complexity: how complex the code is, measured by the number of independent paths through it (ifs, loops, etc.). 
  3. Class Coupling: how dependent classes are on other classes (there are two types: afferent and efferent). 
  4. Cohesion: how strongly related the functions and responsibilities of a class are. 
  5. Distance from the main sequence: uses abstractness and instability to determine which classes are in the zone of pain and which are in the zone of uselessness. 
  6. Depth of inheritance: how many levels of inheritance the classes have. 
  7. Maintainability Index: uses complexity and lines of code to determine how maintainable the code is. 
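To make cyclomatic complexity more concrete, here is a minimal sketch (the class, method and thresholds are illustrative, not taken from the book). Each decision point adds an independent path, so this method has a cyclomatic complexity of 4: the method body (1) plus two ifs and one ternary.

```csharp
using System;

public static class ShippingCalculator
{
    // Cyclomatic complexity = 4: the method body (1) plus
    // three decision points (two ifs and one ternary operator).
    public static decimal ShippingCost(decimal orderTotal, bool isGoldCustomer)
    {
        if (orderTotal < 0)
            throw new ArgumentOutOfRangeException("orderTotal");

        if (isGoldCustomer)
            return 0m;

        return orderTotal > 100m ? 5m : 10m;
    }
}
```

Tools like Visual Studio Code Metrics or NDepend compute this number for you; the point is that every extra path is one more case you need to understand (and test).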


Selecting and measuring some of these metrics will help you understand where the pain points are and will allow you to show your manager the improvements the team has made.

For more information about metrics, you can check this post by Scott Hanselman and this document about NDepend metrics.

As a final thought, just a warning: metrics are just that, metrics. They should be applied with common sense. Be careful managing your manager's expectations and make sure the entire team agrees with them.

Wednesday, July 20, 2011

MsmqException (0xC00E0051) 60 seconds after a WCF Service is executed

I know that this is the type of post that only a few people will find interesting.
But googling a weird exception message and finding that another developer had written about it has saved me hours of work so many times that now, after spending several days solving this issue, I feel obliged to post about it.


Symptoms


When using NetMsmqBinding with a transactional queue, and a Service is exposed through more than one Endpoint (two or more .svc files with the same service type defined in the markup), an MsmqException is sometimes thrown exactly 60 seconds after the message is processed.

MyService1.svc
<%@ServiceHost language="c#" Debug="true" Service="Service" %>

MyService2.svc
<%@ServiceHost language="c#" Debug="true" Service="Service" %>

Web.config
<service name="Service">
  <endpoint contract="IService1" binding="netMsmqBinding" 
    address="net.msmq://localhost/private/MsmqService/Service1.svc" />
  <endpoint contract="IService2" binding="netMsmqBinding" 
    address="net.msmq://localhost/private/MsmqService/Service2.svc" />
</service>

Code
public class Service : IService1, IService2 {...}

Exception
System.ServiceModel.MsmqException (0xC00E0051): An error occurred while receiving a message from the queue: Unrecognized error -1072824239 (0xc00e0051). Ensure that MSMQ is installed and running. Make sure the queue is available to receive from.
   at System.ServiceModel.Channels.MsmqInputChannelBase.TryReceive(TimeSpan timeout, Message& message)
   at System.ServiceModel.Dispatcher.InputChannelBinder.TryReceive(TimeSpan timeout, RequestContext& requestContext)
   at System.ServiceModel.Dispatcher.ErrorHandlingReceiver.TryReceive(TimeSpan timeout, RequestContext& requestContext)

Note: You won't see the exception unless you enable the WCF Trace (system.diagnostics) or add an IErrorHandler.
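Since the note above mentions IErrorHandler, here is a minimal sketch of one (the class name and the Console logging are my own; wiring it into the ChannelDispatchers via an IServiceBehavior is omitted for brevity):

```csharp
using System;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;

public class LoggingErrorHandler : IErrorHandler
{
    // Called for every unhandled exception in the service, including
    // receive-side errors like the MsmqException described above,
    // which otherwise never reach user code.
    public bool HandleError(Exception error)
    {
        Console.WriteLine("Service error: {0}", error);
        return true; // true = treat the exception as handled
    }

    // One-way MSMQ operations send no reply, so no fault message is built.
    public void ProvideFault(Exception error, MessageVersion version, ref Message fault)
    {
    }
}
```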


Cause


This issue occurs because SMSvcHost.exe activates two ServiceHost instances for the *same* endpoint when a new message arrives.


While the first instance receives and processes the message, committing the corresponding COM+ transaction, the second instance creates another COM+ transaction and waits for another incoming message.

After 60 seconds, the second transaction is aborted (COM+ timeout error 0xC00E0051) and the exception is thrown.


Resolution


Instead of creating one Service with two Endpoints, you need to create two separate Services with just one Endpoint each.

MyService1.svc
<%@ServiceHost language="c#" Debug="true" Service="Service1"%>

MyService2.svc
<%@ServiceHost language="c#" Debug="true" Service="Service2"%>

Web.config
<service name="Service1">
  <endpoint contract="IService1" binding="netMsmqBinding" 
    address="net.msmq://localhost/private/MsmqService/Service1.svc" />
</service>
<service name="Service2">
  <endpoint contract="IService2" binding="netMsmqBinding" 
    address="net.msmq://localhost/private/MsmqService/Service2.svc" />
</service>

Code
public class Service1 : IService1 {...}
public class Service2 : IService2 {...}

After doing that, only one ServiceHost instance is created, so the COM+ timeout exception is no longer thrown.

I have filed a bug on connect.microsoft.com: http://connect.microsoft.com/wcf/feedback/details/680020/msmqexception-0xc00e0051-60-seconds-after-a-wcf-service-is-executed


Additional Information


When the exception is thrown, sometimes the ServiceHost faults and the SMSvcHost.exe process doesn't activate it again.

The following fix seems to solve this issue: 2504602 (download).

Another way to solve this is to attach a handler to the ServiceHost.Faulted event and re-create the ServiceHost every time this happens (for more details see this link).
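As a rough sketch of that last workaround (the helper name is mine, and the host's addresses and endpoints are assumed to come from config), the idea is to abort the faulted host and open a fresh one:

```csharp
using System;
using System.ServiceModel;

public static class SelfHealingHost
{
    // Opens a ServiceHost that replaces itself whenever it faults
    // (e.g. after the MSMQ receive error described above).
    public static ServiceHost StartHost(Type serviceType)
    {
        var host = new ServiceHost(serviceType);

        host.Faulted += (sender, e) =>
        {
            // A faulted host cannot be reopened; abort it and start over.
            ((ServiceHost)sender).Abort();
            StartHost(serviceType);
        };

        host.Open();
        return host;
    }
}
```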


Monday, June 20, 2011

Converting any object to dynamic

There are some scenarios in which it is useful to convert an already-created object to a dynamic one, so we can take advantage of the Dynamic Language Runtime introduced in .Net 4.0.


Two common scenarios where this conversion is useful are:
  • Passing an anonymous type to another method/object.
  • Adding a new property to an existing object.
The following code converts any object to an ExpandoObject reading all the properties of the former and adding them to the latter.

using System.Collections.Generic;
using System.ComponentModel;
using System.Dynamic;

public static class DynamicExtensions
{
    public static dynamic ToDynamic(this object value)
    {
        IDictionary<string, object> expando = new ExpandoObject();

        // Copy every public property of the source object into the expando
        foreach (PropertyDescriptor property in TypeDescriptor.GetProperties(value.GetType()))
            expando.Add(property.Name, property.GetValue(value));

        return expando as ExpandoObject;
    }
}

Using ToDynamic() to pass an anonymous type to another method

IEnumerable<dynamic> goldCustomersWithZipCode = Customers
  .Where(c => c.Type == CustomerType.Gold)
  .Select(c => new { c.Id, c.Name, c.Address.ZipCode }.ToDynamic());

ListCustomersByZipCode(goldCustomersWithZipCode);

Using ToDynamic() to add a new property

Customer customer = Customers.First(c => c.Id == 1);
dynamic customerWithLastPurchase = customer.ToDynamic();
customerWithLastPurchase.LastPurchase = customer.Purchases.Last();

DisplayCustomerSummary(customerWithLastPurchase);

Thursday, April 21, 2011

Entity Framework POCO Proxies in Asp.Net MVC

As many of you already know, the ASP.NET MVC framework can create a model instance each time an HTTP POST is executed. The component that performs that task is the ModelBinder.


It is also true that Entity Framework implements lazy loading on POCOs by replacing the real objects with proxies. So when the context returns an entity from the database, a proxy is returned instead of the real entity (for more information about EF proxies, just follow this link).


But, what if the entity is not created by the context? How can the ModelBinder create a proxy instead of the real entity?

In order to do that, we need to call the Create method of the EF DbSet, which will create the proxy for us. The following snippet shows how:

var productProxy = modelContext.Products.Create();

Now let's put this behavior inside our own ModelBinder:

public class EntityModelBinder : DefaultModelBinder
{
    protected override object CreateModel(ControllerContext controllerContext,
                                          ModelBindingContext bindingContext, 
                                          Type modelType)
    {
        // Get the EF context using an IoC container
        var modelContext = DependencyResolver.Current.GetService<IModelContext>();

        var set = modelContext.Set(modelType);

        if (set != null)
            return set.Create(modelType);

        return base.CreateModel(controllerContext, bindingContext, modelType);
    }
}


Finally, we need to configure our new ModelBinder in the Application_Start (Global.asax).

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        ...
        ModelBinders.Binders.DefaultBinder = new EntityModelBinder();
    }
}

So now, we can use the lazy load feature inside our controllers:

public class ProductController : Controller
{
    [HttpPost]
    public ActionResult Edit(Product product)
    {
        modelContext.Entry(product).State = EntityState.Modified;

        // Use "lazy load" to get the Category object from the DB
        var threshold = product.Category.PriceThreshold;

        ...
    }
}

The source code of this sample can be downloaded from this link (MVC v3 + EF v4.1).

 

Friday, February 18, 2011

N-Tiers using POCOs and Entity Framework - Part Six: Source Code

I have finally managed to find some time to publish the source code of the post series I wrote some months ago (about how to build an n-tier application using Entity Framework and POCOs).

As this is a complete solution integrated with an IoC container (in this case, MEF), some of the already published code has changed. I have updated the previous posts so that both the source code and the snippets are synchronized.

You can download the source code from here: 


Hope this helps.

Posts in this series:

  1. Architecture
  2. Model and Entities
  3. Presentation Layer
  4. Business Layer
  5. DataAccess Layer
  6. Source Code

Wednesday, January 12, 2011

OData validation using DataAnnotations

WCF Data Services (OData) is becoming more popular every day. If you need to expose data between different applications, or even between different tiers of the same application, it is definitely a really good option.

Unfortunately, one of the missing features of the current version (v4.0) is validation using DataAnnotations (if you want that feature implemented in the next version, just vote for it here). In this post I am going to show how to implement this validation using a ChangeInterceptor. I hope it helps you.

Update [24 Feb 2011]: The OData team is working on a better solution for this scenario; for more details click here.



Here we have a Customer object (POCO) that is mapped in the Entity Framework model:

public class Customer
{
    public int Id { get; set; }

    [Required]
    public string Name { get; set; }
}

As you can see, the Name property is decorated with the Required attribute (DataAnnotations).

In the following code snippet we can see the WCF DataService that is exposing the Customers EntitySet:

public class WcfDataService : DataService<DatabaseEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Customers", EntitySetRights.All);
        config.DataServiceBehavior.MaxProtocolVersion =
                                          DataServiceProtocolVersion.V2;
    }
}

So far nothing new. The first thing we need to do in order to add the validation logic is to create a ChangeInterceptor method:

public class WcfDataService : DataService<DatabaseEntities>
{
    ...

    [ChangeInterceptor("Customers")]
    public void ValidateCustomers(Customer customer, UpdateOperations operation)
    {
       // Validation logic
    }
}

After that, we just need to add the following validation logic:

[ChangeInterceptor("Customers")]
public void ValidateCustomers(Customer customer, UpdateOperations operation)
{
    // Only validates on inserts and updates
    if (operation != UpdateOperations.Add && 
        operation != UpdateOperations.Change)
        return;

    // Validation
    var validationContext = new ValidationContext(customer, null, null);
    var result = new List<ValidationResult>();
    Validator.TryValidateObject(customer, validationContext, result, true);

    if(result.Any())
        throw new DataServiceException(
            result
            .Select(r => r.ErrorMessage)
            .Aggregate((m1, m2) => String.Concat(m1, Environment.NewLine, m2)));
}

As you can see, I am using the Validator class (from the System.ComponentModel.DataAnnotations namespace) and throwing a DataServiceException in case the validator finds any errors.

Edit: The last parameter of the TryValidateObject method (validateAllProperties) should be set to true, otherwise it will only validate the [Required] attribute. More about this issue here.

If you are using the MetadataType attribute instead of having the DataAnnotation attributes in your POCO, you need to add some additional lines of code in order to support that. Check the attached source code to see how to do it.

Finally, on the client we are going to receive a DataServiceRequestException that contains the error we sent from the server.

try
{
    var serviceUri = new Uri("http://localhost:4799/WcfDataService.svc");

    var context = new DatabaseEntities(serviceUri);

    var customer = new Customer();

    context.AddToCustomers(customer);

    Console.WriteLine("Calling data service...");

    context.SaveChanges();

    Console.WriteLine("Insert successful");

}
catch (DataServiceRequestException ex)
{
    if(ex.InnerException != null && ex.InnerException.Message != null)
        Console.WriteLine(ex.InnerException.Message);
    else
        Console.WriteLine(ex.ToString());
}



As you can see, our original error message is now wrapped inside an XML document (the serialized original exception). If you need to get the original message out of it, you can either read the XML or use the code shown by Phani Raj in this post.

To download the source code, just click here.

Sunday, December 12, 2010

Soft-Delete and Entity Framework

It is very common to find enterprise applications where entities shouldn't be removed from the database, but just "marked" as deleted. If you are using Entity Framework, you can use the following approach. I hope this post helps you!

First, we need to identify which entities should support the soft-delete behavior. In order to do that, we can use the following interface:

public interface ISoftDeleteEntity
{
    bool Deleted { get; set; }
}
We can use partial classes to implement the interface, so there is no need to modify the auto-generated classes.


public partial class Customer : ISoftDeleteEntity
{
}
Finally, we need to override the SaveChanges method of the ObjectContext in order to get the deleted entities and run the soft-delete logic.

public partial class DatabaseEntities
{
    public override int SaveChanges(SaveOptions options)
    {
        var deletedEntities = GetDeletedEntities();

        SoftDelete(deletedEntities);

        return base.SaveChanges(options);
    }

    private List<ISoftDeleteEntity> GetDeletedEntities()
    {
        return ObjectStateManager
            .GetObjectStateEntries(EntityState.Deleted)
            .Select(entry => entry.Entity)
            .OfType<ISoftDeleteEntity>()
            .ToList();
    }

    private void SoftDelete(List<ISoftDeleteEntity> deletedEntities)
    {
        deletedEntities.ForEach(e =>
        {
            ObjectStateManager.ChangeObjectState(e, EntityState.Modified);
            e.Deleted = true;
        });
    }
}
As you can see in the code snippet, we first obtain all the entities that are going to be deleted and that implement the ISoftDeleteEntity interface. After that, we change their state to Modified and set the Deleted property.

Note: If the changes have not been detected yet (the WCF Data Services scenario), you should call the DetectChanges method before obtaining the deleted entities. The attached source code shows how to do it.

You can download the sample code from here.