How do you know if you can take your open-source project to the next level?

693 days ago (wow), just shy of two years, I typed git push upstream master into Git Bash and pushed out the first public code of Nancy. Since then, and thanks to the fantastic response we have been given by the .NET community, it’s been what I’ve spent the majority of my personal coding time on.
For the last six months or so I’ve given more and more thought to how I could take Nancy to the next step. I mean, I just turned 32 years old, and perhaps I could spend my time on something more productive, like a service or application with a price tag attached to it?

Nancy, though, is my passion. I truly believe there is enough room in the .NET market for ASP.NET alternatives, but could I make a living out of it? People seem to enjoy working with it, so why not? To some people that would be enough to quit their job and set off to work full-time on their project. That’s not me though. I’ve always been a bit cautious about life-changing decisions, and even more so since I got married and had kids.

My main worry is that there is actually no real way to know if Nancy has reached the critical mass that would be required to sustain a living off it. The obvious business possibilities would be to offer training and consulting services, try to get some talks accepted at conferences, and perhaps offer some sort of support.

Nik and Anthony, of Glimpse fame, recently struck what has to be seen as the open-source jackpot when they were hired by Red Gate to work full-time on Glimpse and the community it has built up. I’ve known Nik and Anthony for a while now; they’re awesome and it was a well-deserved opportunity. Glimpse has helped out a lot of people and organizations, and having a strong company like Red Gate supporting them is definitely going to make sure it helps even more.

But something like that is the hole-in-one of open-source: every golfer wants to make one, a lot claim they have, and very few actually have.

That said, I know there are a lot of people out there who have successfully transitioned their project from an evening activity into a thriving business. Maybe you’re one of them? Maybe you’re one of the people who also wonder how to move forward? It would be awesome to hear from all of you what you think can be done.

The logical thing would be to ease into it. Sense the local market and see if there is someone in need of the services. But how would you reach out to them? Maybe I should kickstart it 😉

Blogging about this is probably the same thing as writing a self-fulfilling prophecy, because the people that read my blog are probably also people who in one way or another are (or have been) using Nancy on a fairly regular basis.

And yeah, pretty sure I’m having a bit of the open-source blues. The two year mark of an open-source project is the midlife crisis of open-source!

Nancy 0.12 released!

I am really happy to announce that we’ve released Nancy 0.12 into the “public”. To be fair, all of our work is always in the public, but now we’ve actually slapped a new version number on it and put it up on NuGet for easy access!

It took a bit longer than I would have liked, but it’s amazing how a little bit of summer and a month off from work will make time pass incredibly fast! There is an old Swedish proverb that says “He who waits for something good never waits too long,” which basically means “something good is worth waiting for,” and I’d like to think it very much applies to this release.

It’s a massive release, with 20 authors squeezing in 237 commits and helping close 64 work items. Thank you, so much, to all that have contributed! You can find the full commit log here and the work items here. There is no way I can walk you through all 64 work items that make up this release, so I’ll give you a short highlight reel.

Content Negotiation – the star of the release

Getting support for content negotiation into Nancy was the big thing of this release for me and @Grumpydev, and we went into it with a clean slate and quite a lot of questions, mostly about how to preserve the Nancy syntax and keep it really Super-Duper-Happy-Path.

I am happy to say that the syntax stayed exactly the same; it just got more powerful! The key difference is that you are no longer restricted to returning a Response object (or any of the types that are implicitly cast to one), but can return whatever you want. Booya!

If you return anything other than a Response object, the return value will take part in content negotiation with the client that sent the request. We provide a syntax to help you control the behavior, both at a per-route level (should you need to) and through application-wide conventions.
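
To illustrate (the User model here is made up for the example), a route that takes part in content negotiation is simply a route that returns a plain object:

```csharp
using Nancy;

public class UsersModule : NancyModule
{
    public UsersModule()
    {
        // Returning a model instead of a Response means the result is
        // negotiated against the client's Accept header (JSON, XML, a view, ...)
        Get["/user"] = _ => new User { Name = "Nancy" };
    }
}

// Hypothetical model, used for illustration only
public class User
{
    public string Name { get; set; }
}
```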

Content Negotiation is built around the new IResponseProcessor interface, which is what you implement to add support for a new media range or model type, and Nancy will automatically detect it and wire it up for you!
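
As a rough sketch of what implementing one could look like (member signatures here are from memory of the 0.12 API, so treat this as an outline rather than a definitive reference):

```csharp
using System;
using System.Collections.Generic;
using Nancy;
using Nancy.Responses.Negotiation;

public class PlainTextProcessor : IResponseProcessor
{
    // File extensions this processor can be mapped to, e.g. /user.txt
    public IEnumerable<Tuple<string, MediaRange>> ExtensionMappings
    {
        get { yield return Tuple.Create("txt", MediaRange.FromString("text/plain")); }
    }

    // Tell Nancy how good a fit this processor is for the request
    public ProcessorMatch CanProcess(MediaRange requestedMediaRange, dynamic model, NancyContext context)
    {
        return new ProcessorMatch
        {
            ModelResult = MatchResult.DontCare,
            RequestedContentTypeResult = requestedMediaRange.Matches("text/plain")
                ? MatchResult.ExactMatch
                : MatchResult.NoMatch
        };
    }

    // Turn the model into an actual Response
    public Response Process(MediaRange requestedMediaRange, dynamic model, NancyContext context)
    {
        string body = model.ToString();
        return new Response
        {
            ContentType = "text/plain",
            Contents = stream =>
            {
                var bytes = System.Text.Encoding.UTF8.GetBytes(body);
                stream.Write(bytes, 0, bytes.Length);
            }
        };
    }
}
```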

Keep an eye on our Wiki for more detailed information on content negotiation, which should be coming up very soon. If you want to jump right in, you can have a look at the Nancy.Demo.Hosting.Aspnet project in our solution, and more specifically the /negotiated route.

You can also watch the aspConf talk by Christian Horsdal, “The Lightweight Approach to Building Web Based APIs with .Net”, where he shows off the content negotiation bits. The source code for his sample can be found at

Stateless authentication

A new, out-of-the-box authentication method has been added by Byron Sommardahl and can be installed with the Nancy.Authentication.Stateless NuGet package. For a demo, have a look at the Nancy.Demo.Authentication.Stateless project in our main solution.
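
Wiring it up looks roughly like this; the header name and the user lookup are made up for the example, and the exact signatures may differ slightly between versions:

```csharp
using System.Linq;
using Nancy;
using Nancy.Authentication.Stateless;
using Nancy.Bootstrapper;
using Nancy.TinyIoc;

public class Bootstrapper : DefaultNancyBootstrapper
{
    protected override void ApplicationStartup(TinyIoCContainer container, IPipelines pipelines)
    {
        base.ApplicationStartup(container, pipelines);

        // Resolve the current user from something carried by the request
        // itself (an API key header in this sketch) on every request
        var configuration = new StatelessAuthenticationConfiguration(context =>
        {
            var apiKey = context.Request.Headers["X-Api-Key"].FirstOrDefault();
            return ApiKeyUsers.Lookup(apiKey); // hypothetical user store
        });

        StatelessAuthentication.Enable(pipelines, configuration);
    }
}
```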

JSON.NET serializer

Emil Cardell also contributed ISerializer and IBodyDeserializer implementations for JSON.NET, which can be installed using the Nancy.Serialization.JsonNet NuGet package. Once installed, it will automatically be picked up by Nancy.

No more NDjango out of the box

In 0.11 we removed NDjango from the Mono build configurations but still had it building for the Windows configurations. In 0.12 we made the decision to no longer support NDjango, for a couple of reasons.

Development on the NDjango engine has been stale for quite a while now. The fact that it’s built on F# has also always been a bit of a pain for our Mono support, and the release of .NET 4.5 and Windows 8 added even more headaches, which Grumpydev blogged about. If you wish to use NDjango, you can still do that.

It’s just an implementation of our IViewEngine interface (which is really lightweight), so you can dig the source code out of our repository and stick it in your own project. You can even create and maintain your own NuGet for it; all we’re saying is that we won’t be the ones maintaining an NDjango engine anymore.

Lots of love for Nancy.Testing

For this release we gave the Nancy.Testing package some needed love. The whole list of changes that were introduced can be found here, but I wanted to highlight a couple of things.

Convention based loading – Because of the way that .NET loads (or rather doesn’t load) assemblies that aren’t explicitly used into the AppDomain, Nancy.Testing sometimes could not find certain types. For instance, if your test assembly A referenced application assembly B, which in turn referenced helper assembly C, then C would not be loaded into the application domain of the tests unless one or more types from C were explicitly referenced from test assembly A. This is not a Nancy thing, but a .NET thing.

What we did was have the ConfigurableBootstrapper help you get around this, by taking the name of the test assembly, attempting to remove “test”, “tests”, “specifications”, “specs” and “unittests” from the end of the name, and making sure that the resulting assembly, and all assemblies it references, are loaded into the application domain.

This is not going to cover all scenarios, but it should cover the majority of them. As always, the naming conventions can be modified, by setting the static TestAssemblySuffixes property on the ConfigurableBootstrapper.
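
For instance, if your test assemblies end in a suffix the defaults don't know about, you could do something like this in a one-time test setup (the property name is taken from the release notes above; its exact type is from memory):

```csharp
// Teach the ConfigurableBootstrapper about a custom test assembly suffix
ConfigurableBootstrapper.TestAssemblySuffixes = new[] { "test", "tests", "acceptance" };
```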

Removed XUnit dependency – Before, some of our assertions were using xUnit assertions internally, forcing a dependency on xUnit even if your tests were written with a different test framework. We’ve now removed this completely by adding our own assertions.

Now using CsQuery instead of HtmlAgilityPack – Our assertion helpers, which let you assert on the HTML that is returned by a view, were previously using HtmlAgilityPack under the hood. We decided to swap over to CsQuery, which is a C# implementation of the jQuery selector syntax, mostly because of some issues we were having, plus CsQuery has a cooler syntax. This should not have an impact on your tests.

Auto-registration is now disabled by default – If you were using the ConfigurableBootstrapper before 0.12, it would use auto-registration, by default, to scan and register types into the container. You had the option to disable it to gain a (potentially huge) performance boost and make the tests more explicit. Starting with this release we flipped the behavior, so that auto-registration is always turned off and you have to explicitly enable it with the EnableAutoRegistration option. We hope that this will help make the intent of the tests even clearer and at the same time make sure they are executed as fast as possible.
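
In practice that means a post-0.12 test either registers its fakes explicitly or opts back into scanning; a sketch (the module and repository types are made up for the example):

```csharp
// Explicit registrations: fast, and the intent of the test is clear
var bootstrapper = new ConfigurableBootstrapper(with =>
{
    with.Module<UsersModule>();
    with.Dependency<IUserRepository>(new FakeUserRepository());
});

// Or opt back into the old scanning behavior when you really want it
var scanning = new ConfigurableBootstrapper(with =>
{
    with.EnableAutoRegistration();
});
```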

IStartup gets a new identity

You might have used the IStartup interface to perform an action when the application started for the first time, or to register some dependencies in the container. Looking at it, we thought it was a violation of the Single Responsibility Principle and decided to split the two actions into separate interfaces.

We ended up with IApplicationStartup for startup actions and IApplicationRegistrations for dependency registrations. If you prefer the old shape, you can always implement your own IStartup interface that simply inherits from both IApplicationStartup and IApplicationRegistrations, and the rest of your code can be left unchanged.
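
A startup task then becomes a small class of its own; a sketch (the Initialize signature is from memory, and the hook itself is hypothetical):

```csharp
using Nancy.Bootstrapper;

// Discovered and run automatically when the application starts
public class PipelineHooks : IApplicationStartup
{
    public void Initialize(IPipelines pipelines)
    {
        pipelines.AfterRequest += context =>
        {
            // Hypothetical hook: tag every response
            context.Response.Headers["X-Served-By"] = "Nancy";
        };
    }
}
```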

Razor error page

If you’ve ever encountered the Razor error page, the one you get when there is an error in your view, then you know it does not offer much assistance in resolving the problem. We pimped the page so it’s now damn useful and should help you pinpoint your issue in seconds. Have a look yourself!


Now that’s what I call awesometastic!

Breaking changes

Yep, unfortunately, there are some of those too. As always, we try to keep them as few as possible, and hopefully most of them won’t have an impact on your application. For a full list of the breaking changes, take a look at the list.

Things to come

It looks like there is a huge list of cool things coming down the pipe. For the next release (which should happen within the month) we will be going over the 30+ open pull requests and pulling in the ones we find awesome and suitable.

Looking a bit further down the pipe, we have async routes and improved session and cache support, which will let you hook up any provider you want (how about memcache, redis or a database?) very easily!

.NET 4.5 : Operation could destabilize the runtime (yikes!)

Update: Microsoft has since posted an update to .NET Framework 4.5 which resolves this issue. For more information, please visit

The other week I got around to installing the Visual Studio 2012 and .NET Framework 4.5 RTM bits on my machine – thought it was about time that I made the switch. Though after the installation I started getting a failing test in the Nancy test suite that said

Now I don’t know about you, but “Operation might destabilize the runtime” doesn’t sound like the best of things that could happen.

Taking the debugger for a ride through the code, I ended up at a line that performed an Activator.CreateInstance call on a type that inherited from AbstractValidator&lt;T&gt; from the (awesome) FluentValidation library. Code that hadn’t been changed in quite a while.

I was certain that this test didn’t fail prior to the upgrade, and we had not updated our FluentValidation reference to a newer version. The test was failing in both Visual Studio 2010 and 2012, using both the TestDriven.NET and ReSharper test runners, so it had to be related to the .NET 4.5 upgrade.

Even though the code was targeting the .NET Framework 4.0, it was still happening. Why is that? Well, remember that 4.5 does not install side-by-side with 4.0; instead it’s a drop-in replacement that is supposed to be backwards compatible (there are some kinks to be worked out still). So even though you target 4.0, you are still running on the 4.5 runtime.

Luckily I was connected to Lee Coward on the CLR team, and he was able to shed some light on the issue:

The problem happens whenever the IL that the C# compiler emits for a derived class constructor contains one or more branches that appear before the call to the base class constructor.

So the problem exists at the IL level and is only detected when the CLR verifier is executed on the code. The verifier makes sure that the IL is type safe before it’s sent to the JIT compiler, and if it detects an issue (like this one) it will bark at you. Furthermore, the derived class has to be generic for all of this to happen. When these criteria are met, there are quite a few scenarios that can cause the bug to be invoked.

Instead of trying to explain all of these, and possibly getting some of the details wrong, I will share the sample that Lee Coward gave me.
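
Lee's actual sample isn't reproduced here, but my own reconstruction of the pattern he describes looks something like this: a generic derived class where a conditional in the base-call argument makes the compiler emit a branch before the call to the base constructor:

```csharp
public abstract class BaseValidator<T>
{
    protected BaseValidator(string ruleSet) { }
}

public class CustomerValidator<T> : BaseValidator<T>
{
    // The conditional expression below is evaluated before base(...) runs,
    // so the emitted IL contains a branch ahead of the base constructor call,
    // which is what trips the .NET 4.5 verifier for generic derived classes
    public CustomerValidator(bool strict)
        : base(strict ? "strict" : "default")
    {
    }
}
```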

I’ve been told that they are working on a proper fix for this, and also on guidance about which C# code patterns will cause the verification error.

In the meantime I’ve submitted a pull request to FluentValidation that prevents it from causing the verification errors, and Jeremy Skinner has already put out a new version on NuGet. So if you are experiencing issues with .NET 4.5 and FluentValidation, all you have to do is update to the newest version.

Codeplex hates open-source

Did that get your attention? Good. I took my open-source baby steps with Codeplex, and after about a year or two I discovered GitHub and I’ve not looked back since. So today I set out to send a pull request for a project that is hosted on Codeplex, and I was amazed at how alienated I felt. I’ve not spent a lot of time on Codeplex since it became all fancy with Git, Mercurial and a new “user-interface style formerly known as Metro” design.

So I went to the project page and was presented with this


And I started looking for a way to fork the project. Can you see it? Look really carefully! Still can’t see it? Me neither. That’s because there is no way to fork from the project page. Hmm, odd.

Oh, as you can see, I am sending the pull request to the awesome Jeremy Skinner of FluentValidation.

Since I couldn’t find a way to fork on the project page, I headed over to the Source page and saw this.


At first I thought I’d run into another dead end, but after some more scanning I found this


Not the most obvious thing, is it? Why would they make something so important so obscure? Not only is it hidden away in the sub-pages of the project, but on the page they decided to put it on, they made it so small and stuck it in the corner… squeeeezing it in between the top menu and the side menu!?

In the end I noticed that FluentValidation was also maintained on GitHub, so I ended up just forking that repository instead.

GitHub 1 – Codeplex 0

Nancy : 0.11, more than the sum of its parts

With our eleventh release just out the door, I was struggling to come up with a descriptive post title that captured the essence of the new version. Looking back at the 190 commits by 25 authors that make up the release, “more than the sum of its parts” is an appropriate description.
A total of 78 contributors, with more lining up with pending pull requests for future versions, have helped make Nancy the awesome framework we have the pleasure of placing at your disposal.

A glimpse of what’s new

It would take up far too much space to write about the 62 work items that have gone into this release, so instead I will just cherry-pick some of them and present them here. You should visit the milestone at GitHub and check out everything that’s gone into 0.11.

  1. Favicon override by just dropping a favicon.ico/png anywhere in your application path. You can still override the FavIcon property in the bootstrapper if you want custom logic or use an embedded icon
  2. Updated the FluentValidation NuGet dependency; it turns out the FluentValidation NuGet contains a strong-named assembly
  3. Added AddFile to StaticContentConventionBuilder, enables you to map individual files
  4. Allow Nancy to run on machines with FIPS Compliance enabled
  5. Added our own ViewBag concept and exposed it on several of the view engines
  6. @helper functions are now supported by the Razor view engine
  7. HtmlHelpers/UrlHelpers got a face lift and now expose a rich set of properties, such as access to the model, the render context (and thus the actual NancyContext itself) and the engine itself. This opens up the door to writing pretty much any kind of helper extension you can envision
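
A couple of these are easiest to show in code. A sketch of overriding the FavIcon property (item 1) and mapping an individual file (item 3), with file paths made up for the example:

```csharp
using System.IO;
using Nancy;
using Nancy.Conventions;

public class Bootstrapper : DefaultNancyBootstrapper
{
    private byte[] favicon;

    // Item 1: custom favicon logic via the FavIcon property
    protected override byte[] FavIcon
    {
        get { return this.favicon ?? (this.favicon = File.ReadAllBytes("assets/icon.png")); }
    }

    // Item 3: map an individual file with AddFile
    protected override void ConfigureConventions(NancyConventions conventions)
    {
        base.ConfigureConventions(conventions);

        conventions.StaticContentsConventions.Add(
            StaticContentConventionBuilder.AddFile("/robots.txt", "content/robots.txt"));
    }
}
```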

This is just a small selection of all the new features and improvements that have been added for this release, make sure to read the full change log.

Breaking changes

We really do try our best to limit the number of breaking changes that we introduce with each release. Some APIs are still being explored, and as feedback comes in from users we sometimes feel the need to make breaking changes in order to make the path forward easier on everyone, contributor or consumer. In this release we have 2 breaking changes (the milestone says 3, but 2 of them relate to the same change, just different areas in the code base).

  1. Removed Nancy-Version header. In hindsight this was nothing more than a framework vanity plate, put in by yours truly. I do agree with the idea that the less information you expose, the smaller the attack surface of your application will be, and with this in mind we decided to pull the version header out starting with this release. You can add it back in with an application pipeline hook if you have a need for it. If you don’t know how to, just drop us a line at the user group and we’ll get you sorted in no time.
  2. Added NancyContext everywhere. This is probably only going to affect you if you have been extending any of Nancy’s sub-systems (error handling, authentication, model binding and so on). There were a couple of sub-systems where the context was not available, which sometimes made it difficult, if not impossible, to get the full potential out of your code. Hopefully this change will improve the experience quite a lot. If you spot a place where the context would do some good, let us know.

What’s up next

There are quite a lot of pending pull requests, with all sorts of interesting new features and improvements, waiting to be included in the next release. The main focus for me and Steven is going to be adding support for async routes and content negotiation. We have created a post, titled Planning 0.12 : Content Negotiation, to capture your ideas for how it should look and work in Nancy. We will soon be creating a similar post for async discussions.

We also have a goal of making more frequent releases; there is no point in holding back a release just because we want to get certain features into it, as they can always come in the next iteration. We feel that it would be far more productive if we had a shorter release cycle (maybe every 3-4 weeks) so we get all the awesome stuff that’s contributed by the community into your hands faster.

That’s all for this release!

Nancy : Now with Mono builds on every commit

We’ve always tried to make sure that Nancy is able to run on Mono for every new release we put out. Since neither I nor @Grumpydev use Mono or MonoDevelop as our primary development environment, nor do a lot of our contributors, we’ve always found ourselves playing “mono catch-up” at the end of each milestone.

Having that extra step in the release process is a bit of an impediment when all we want to do is ship the new bits as fast as possible, so we can put them in the hands of our community. We clearly needed a change in our process, and the obvious move was to incorporate a Mono build into our CI process.

This is actually not a new idea. For probably the better part of a year I have been trying to figure out how to make this happen. We are proud users of the CodeBetter TeamCity server and it’s always worked well for us, with great support, so my goal has always been to get a Mono build agent wired into that.

For one reason or another it’s always fallen short, that is, until now. About a month ago I decided to take another swing at this, so I contacted the people at CodeBetter to check on the likelihood of having a Mono agent added. I also asked the awesome Dale Ragan (of Moncai, Monkey Square and Monospace) if he’d be willing to share a bit of his time and knowledge on the subject, which he was more than willing to do.

A couple of e-mails later the plans had been made, and Dale also informed us that Monkey Square would like to sponsor the build agent and cover the cost of the EC2 instance that it would be running on. How cool is that of them!?

Today we ran the first successful, automated Nancy Mono build using the new build agent! This is going to make it so much easier for us to make sure that everything is strawberries when it comes to Mono by the time we are ready to push out a new official version. We’ll get near-instant feedback on each of our commits and every time we accept a pull request. This means we can act immediately to sort issues out, instead of risking putting them all on a pile (usually a very small pile… more like a bump really, but still a list of things that need to be sorted).

Nancy builds for .NET and Mono running on the Mono agent at the CodeBetter TeamCity server

(Yes, fewer tests on the Mono build because things like our WCF host is obviously not supported on Mono)

I would also like to make a shout-out to the awesome people at CodeBetter, especially James Kovacs and Kyle Baley, for their help in making this happen. A special thanks also goes out to Hadi Hariri, of JetBrains, for always helping out with my TeamCity questions and for encouraging me to push hard enough to make this a reality.

The open-source maturity model

The discussion about what constitutes OSS has been going on on Twitter and in the .NET blogosphere for a couple of weeks now. The root of it all has been whether or not Microsoft should call their work on ASP.NET MVC open-source. Reading what has been said, and taking part in the discussion myself, I feel that quite often the discussion is clouded by our individual thoughts on what constitutes open-source, rather than what the actual definition states.

So what does the definition say? Well, we can look at the Open Source Definition, by the Open Source Initiative, and the Free Software Definition, by the Free Software Foundation, for guidance on that. I won’t recite either source, because both have very clear definitions on their websites.

What’s missing from this picture?

There are a couple of things missing from both of them. Things like: when and how often do you need to make source code publicly available? Do you need to develop in the open, with public roadmaps and feature discussions? Do you have to accept code contributions or not?

For most of us (?) those are no-brainers; you should put code out in the public as quickly as you can, engage in discussions with your community, and accept contributions with open arms as long as they are true to your vision.

However these are all values that we, as a community, layer on top of the definition of open-source and open-source software. These are things we have seen help increase the transparency in our projects, help improve quality and add more value to our work.

You can take all of that away and still do open-source, but you are selling yourself short (if you ask “us”) if you do.

What can we do?

You tell me! One idea I had tonight, while arguing about this on Twitter, was that maybe we need a way to measure the maturity of open-source participation of a company/product.

If you’ve ever read anything about REST you may have come across the Richardson Maturity Model for services on the web. Basically it’s a measuring stick for how far you’ve come in your REST adoption. Check out the link to read about the 4 levels of the model.

What if we could apply the same idea for open-source? What if we had something like this

  • Level 3. Accepts patches
  • Level 2. Make code available on a regular basis
  • Level 1. Develops in the open
  • Level 0. Compliant with the OSI / FSF definition of open-source

These maturity levels aren’t something I’ve been philosophizing about for a long time, in fact they popped into my head about 30 minutes ago while I was engaged in the Twitter discussion.

Just to be crystal clear: the number of levels and the definition of each level are not something I would consider set in stone at the time of writing.

Instead I hope they can inspire to some interesting discussion and perhaps even a consensus on what such a model should look like.

Maybe I’m just talking out of my ass here, or maybe I am onto something. Either way, let me know in the comments. I personally think something like this could help out when we, the community, talk about open-source and open-source software.

Nancy v0.10.0 – The next step in awesome


A couple of days ago we (finally) managed to get v0.10.0 out the door, and it’s packed with goodies! Diagnostics, Razor improvements, bug fixes, weighted request headers, model validation and lots of tweaks to existing features are only some of the things that went into this release.

The community continues to bless us with their awesome work, and this release had 22 authors totaling 250 commits, which is just jaw-dropping! That puts us at 65 contributors to the Nancy source code. This release consisted of 23 updated (some new) NuGets and a total of 45 work items! For a complete list of things that are part of this release (including breaking changes!), head over and check out the milestone at GitHub.

So let’s have a closer look at some of the stuff in v0.10.0!

Validating models just got easy

Craig Wilson contributed some absolutely beautiful code to help us get model validation into Nancy. Like all other features in Nancy, the validation stuff is shipped as a set of NuGets, with the added ability of hooking in your own validation methods. In this release we ship NuGets to use either DataAnnotations or FluentValidation to validate your models. Combine this with our model binder and you are definitely on the super-duper-happy-path!

The contribution from Craig also contained the foundation for generating client-side validation based on your model rules. This release does not contain the code to enable that, but it’s definitely something we’ll be looking into in upcoming releases (maybe it will be your contribution that brings this to the community?!)

The Nancy.Demo.Validation project, in the main solution, shows you how to use this feature, using both DataAnnotations and FluentValidation. Remember, it’s quite easy to plug in other validation frameworks too!

Making sense of what the client sends you

We gave the RequestHeaders class some love by adding access to all available header names and values through properties. We also made the class implement IEnumerable&lt;&gt; to make it easier to work with the incoming headers.

One of the biggest changes is the presence of weighted headers. The headers that can be weighted now return IEnumerable&lt;Tuple&lt;string, decimal&gt;&gt;, instead of IEnumerable&lt;string&gt;, and will be sorted in descending order according to their weight. Not only will this enable you to manage weighted headers in a correct way, but it’s also an important piece of the puzzle for the planned content negotiation support that we will ship in a later release.
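
As a small sketch, consuming the weighted Accept header inside a route (the route itself is made up) could look like this:

```csharp
Get["/formats"] = _ =>
{
    // Accept now yields (media range, weight) tuples,
    // already sorted by weight in descending order
    var accepted = this.Request.Headers.Accept
        .Select(header => string.Format("{0} (q={1})", header.Item1, header.Item2));

    return string.Join(", ", accepted);
};
```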

Sharpening the Razor

The improvements made to the Razor view engine in this release really deserve a blog post of their own to do them justice. Despite that, I will give you a mile-high overview of the changes and improvements that we’ve made.

One of the things that we (me and Grumpydev) really wanted to sort out for this release was adding Intellisense for our Razor views, and I am very happy to say that we’ve managed to do that. It includes a couple of custom build providers that are installed, into the bin folder of your project, with the Razor NuGet, which adds a couple of post-build events to make sure they stay there. I would be lying if I said that getting the build providers to play ball was easy. It’s one of those dark corners of Razor customization that nobody speaks of, and it involves quite a bit of Visual Studio magic. So to enable Intellisense, all you have to do is build your project after installing the NuGet, and the Nancy.ViewEngines.Razor.BuildProviders.dll assembly will be copied into your bin folder. The NuGet will also wire the providers up for you in the web.config; moar super-duper-happy-path for you!

While doing some refactoring work I also noticed how easy it would be to add support for .vbhtml files, so I did. This should be really handy to help you port that legacy MVC application over to Nancy 😉 The included build providers contain one for these views, so you should be able to enjoy Intellisense support if you ever find yourself needing to use Visual Basic views with Razor.

Oh, speaking of dark corners of Razor, did you know that ASP.NET MVC installs a global handler for .cshtml and .vbhtml files? We didn’t. At least not until we got a bug report that Nancy wasn’t rendering views with the passed-in model. It turns out that if you have a view with the same name as one of the routes, the request would never be passed to Nancy. Instead the global handler would suck it in and render the view. How nice, right? WRONG! After speaking to the Razor team, it turns out that you can disable this behavior with the, not so obvious, webPages:Enabled configuration key. We updated our NuGet to stick that into your config file when you install it.
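
The setting ends up in the appSettings section of your web.config, roughly like this:

```xml
<configuration>
  <appSettings>
    <!-- Stop the global ASP.NET WebPages handler from hijacking .cshtml/.vbhtml requests -->
    <add key="webPages:Enabled" value="false" />
  </appSettings>
</configuration>
```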

Craig Wilson didn’t only chip in with awesome model validation code; he also added support for the @model and @ModelType directives, as well as making the Razor engine smart enough to automatically reference the model’s assembly, or as Craig described it in this pull request:

The model’s assembly is now referenced automatically regardless of whether an assembly reference exists in configuration. This makes the "it just works" statement a little more complete. In addition, if specified in configuration, the namespace of the model is automatically included in the razor file.

That’s a very awesome little feature to have around!

We also dropped our dependency on System.Web in the engine, which means it will run on the Client Profile. It also means that if you took for granted that it was around in your views, then you are going to have to reference the assembly and namespace in the Razor configuration. We think it’s a small price to pay to get rid of that dependency!

All your Url are belong to us

Two small, but oh so useful, changes were made to the Url type. You can now call ToString() and it will give you a correct string representation of the url. The second thing is that the type will now implicitly cast to a Uri instance; how awesome is that!?
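
In other words (property names here are from memory, so double-check against the source):

```csharp
var url = new Url
{
    Scheme = "http",
    HostName = "nancyfx.org",
    Port = 80,
    Path = "/docs"
};

string text = url.ToString(); // correct string representation of the url
Uri uri = url;                // implicit cast to System.Uri
```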

No more White Screen of Death (WSOD)

Have you ever noticed that if you tried to render a view that Nancy couldn’t find, you would end up with a blank page? Yeah, not really useful, is it? In our defense, that was never our intention, but we somehow forgot to make it more awesome. That is, until now. As of this release, you will get a view that tells you that Nancy was unable to locate the requested view, and it will also tell you which file extensions, based on the available view engines, can be used. Also expect a future release to contain information about the locations that were inspected for the view.

Doctor, what’s wrong with me?!

Do you remember the first time we talked about adding Diagnostics to Nancy? To be honest I can’t say for sure, but I know it was somewhere around v0.3/0.4, and that wasn’t exactly yesterday. It’s been a long time coming, but I am very pleased to tell you that we’ve included diagnostics in this release! It is important to be aware that this release only contains a couple of diagnostics tools, but we’ve put a lot of work into the diagnostics foundation, and it will be really easy for us (and that “us” definitely includes you too! We want your contributions!) to add more diagnostics tools with each new release we make.

The diagnostics contain both request tracing and interactive diagnostics. The interactive diagnostics are where things get really interesting. They enable you to interactively (duh!) query Nancy, at runtime, to figure out what the heck is going on. Before you get too excited, this is not a query language like SQL where you can just fire off queries, even though it would probably be totally possible to implement (hmmm, any takers?). We are introducing the IDiagnosticsProvider interface, and anything that implements it can provide diagnostic capabilities.

These are, quite literally, normal methods that are exposed on our diagnostics dashboard. They enable you to pass in primitive values and have rich result sets returned and presented automatically. Anything that implements the interface will automatically be discovered and wired up on the dashboard. Not only that, but you can take full advantage of the first-class dependency injection support in Nancy and take whatever dependencies you want into your providers, making it super easy to do some pretty advanced stuff. Of course you are not restricted to taking dependencies on Nancy types; there is nothing stopping you from exposing a nice interface over your logs, etc.
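
A provider is little more than a named object whose public methods get surfaced on the dashboard; a rough sketch (member names from memory, and the error store is hypothetical):

```csharp
using System.Collections.Generic;
using Nancy.Diagnostics;

public class ErrorLogDiagnosticsProvider : IDiagnosticsProvider
{
    public string Name
    {
        get { return "Error log"; }
    }

    public string Description
    {
        get { return "Query the application's recent errors."; }
    }

    // The public methods of this object become interactive
    // diagnostics on the dashboard
    public object DiagnosticObject
    {
        get { return new ErrorLogDiagnostics(); }
    }
}

public class ErrorLogDiagnostics
{
    public IEnumerable<string> RecentErrors(int count)
    {
        return ErrorStore.GetRecent(count); // hypothetical log store
    }
}
```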

The providers can, nicely, be dropped into a NuGet and reused across any Nancy project out there!

Wanna know something really neat? The entire diagnostics dashboard is built using Nancy and backbone.js, and is embedded into Nancy itself (woah! A Nancy application running inside of Nancy?! Yep!)

It’s extremely fast and easy to use. So how do you get access to the diagnostics goodies? Glad you asked! All you have to do is run in debug mode (although it’s possible to use it in release mode too; you just need to turn it on) and then browse to /_Nancy/ in your application. It will give you some (very) easy instructions (really, it teaches you how to configure a password) to get started.

We’ve only scratched the surface of this!

Moar stuff!

Really, I can’t list em all here so head over to the milestone at GitHub.

So wuzz next?

Well, first of all, you won’t have to wait as long for the next release. We are going to try to get it out in about 3 weeks’ time, and plan on focusing on trimming down the pull request queue and fixing reported bugs. We have no major features planned for the next release, but that does not mean we’re not working on any. Quite the opposite: we are working on getting in support for async routes, content negotiation and proper caching support. We just don’t want to keep other stuff away from you while we work on these things, and want to keep doing more frequent releases.

Of course there is always the chance that we manage to finish one of them in time for the next release, or that someone contributes something really awesome that will be included!

Thank you for making v0.10.0 possible!