How NuGet is Helping Us Realize a New Level of Continuous Integration

As I was working on cleaning up build scripts and Jenkins build projects this week, I was considering a blog post on how we use NuGet and the benefit it brings for us. I hadn’t seen anything around the internet that resembled our approach. Then this morning, I noticed Douglas Rathbone explaining why NuGet Package Restore is not for him. So, I’ll take the opposite position and explain why NuGet Package Restore (and then some) is for us.

We Used to Check In Libraries

My context may be different from Douglas’s. I mainly work with many independent, open-source products that represent tools and integrations built to work with the main VersionOne application. Historically, these projects were checked into Subversion and the dependencies were managed by having a special directory structure for external libraries. Groups of projects could depend on a peer directory that held shared libraries. In this model, we could update a dependency in one place and all the projects would get the new library.

We have recently begun moving these projects to GitHub, which changes a number of things. First on my mind is that we want to avoid checking in libraries. With Git, this helps keep repositories small and fast without extra maintenance. It also helps us reduce the risk that we break the terms of an open-source or commercial license for a library we consume. In Git, the equivalent feature for managing shared libraries is the submodule. While submodules could recreate the relationships we had, we felt they were a little too obscure and could be a barrier to external participants who might be new to Git.

NuGet Package Restore

It isn’t perfect, but NuGet Package Restore is hardly as bad as Douglas claims. Douglas suggests build speed suffers due to NuGet restore. That might be true when you have a fast local network for retrieving files from version control but slow Internet bandwidth and many large NuGet packages. It doesn’t fit for us: since both GitHub and NuGet are hosted, libraries arrive about as fast from either. Moreover, when binary deltas are involved, Git will actually lose the speed contest. Douglas also overlooks NuGet caching and the option to run your own internal NuGet feed, both of which address performance and reliability.
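For illustration, here is roughly what an internal feed looks like in a NuGet.config file. The feed name and share path below are placeholders, not our actual setup; even a plain file share is enough to act as a package source.

<configuration>
  <packageSources>
    <!-- the official feed -->
    <add key="nuget.org" value="https://www.nuget.org/api/v2/" />
    <!-- placeholder internal feed; a simple file share works -->
    <add key="Internal" value="\\buildserver\NuGetPackages" />
  </packageSources>
</configuration>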

In our experience, the pain comes from other places. We wanted to keep binaries out of the Git repo, but Package Restore wants to put NuGet.exe inside it. We have tried some other approaches here, but nothing has been quite satisfying. Also, as NuGet moved through versions, changes in command-line switches and the API meant some reconfiguration. Most of these changes came with 2.0, which is to be expected on a major version change.
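To make the trade-off concrete, the kind of alternative we mean is to have the build fetch nuget.exe and restore from packages.config instead of committing the executable. The sketch below is illustrative only; the paths, project name, and download command are not our exact script.

rem Illustrative pre-build step: fetch nuget.exe rather than check it in
powershell -Command "Invoke-WebRequest https://nuget.org/nuget.exe -OutFile .nuget\NuGet.exe"
rem Restore everything listed in packages.config into the packages folder
.nuget\NuGet.exe install MyProject\packages.config -OutputDirectory packages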

Controlling Change vs Embracing Change

Douglas also claims that NuGet reduces control over dependency versioning. His logic is hard to refute. After all, how can you prove a copy is really a copy? However, I see no reason to trust that your version control system gives you a better copy than NuGet. The dependencies in the NuGet packages.config file are quite explicit and that file should be under version control. The packages.config file also remembers the specific version that is required. Package restore does not automatically update dependency versions all on its own.

We like explicit, versioned control over our dependencies. We also want to know as soon as possible when a change in an external library may break our code. The XP description of continuous integration tells us:

Continuous integration avoids or detects compatibility problems early. Integration is a “pay me now or pay me more later” kind of activity. That is, if you integrate throughout the project in small amounts you will not find yourself trying to integrate the system for weeks at the project’s end while the deadline slips by. Always work in the context of the latest version of the system.

We take that to mean integration with all of the components, regardless of whether they are internal or external. NuGet gives us a means to automatically integrate new versions of libraries with the command-line nuget.exe update command. In addition to restoring packages, we have Jenkins update them automatically so that we keep up with changes. The only pain around this is that we manually edit packages.config to include allowedVersions. Following the rules of semantic versioning, we expect to work with the current version up to, but not including, the next major version (i.e., the next breaking change). For example, for NUnit, NuGet writes the entry with just the pinned version:

<package id="NUnit" version="2.6.2"
         targetFramework="net40" />

and we edit it to add the allowed range:

<package id="NUnit" version="2.6.2"
         targetFramework="net40"
         allowedVersions="[2.6.2,3.0)" />
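On the Jenkins side, the update step amounts to little more than running nuget.exe update against the solution. The sketch below is not our exact job configuration, and the solution name is a placeholder.

rem Hypothetical Jenkins build step: update to the newest package versions
rem that the allowedVersions ranges in packages.config permit
nuget.exe update MySolution.sln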

Let the Consumers Decide Between Stability and New Features

We don’t just consume libraries, we also publish them. While we are committed to accepting changes from external libraries, we realize that not everyone is prepared to do so. For ourselves and anyone who wants to try out the latest features, we publish continuous integration builds to our own MyGet feed. For more stable builds, we do additional testing on specific builds and then use Jenkins build promotion to publish to the official NuGet feed. Since the NuGet.org feed is already available in Visual Studio, our stable feed is the default. The explicit action of adding a new NuGet source helps make sure consumers know what they are getting into when they point to our more bleeding-edge MyGet feed.
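As a sketch of what that explicit action looks like from the command line (it can also be done through Visual Studio’s package source settings), the feed name and URL below are placeholders, not our actual MyGet address:

rem Illustrative only: register the CI feed as an additional package source
nuget.exe sources Add -Name "VersionOne CI" -Source https://www.myget.org/F/your-feed-name/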

Learn Your NuGet!

While Douglas is right that NuGet causes problems that didn’t exist before, there are benefits to be had that manual management of libraries through version control check-ins can’t match. NuGet isn’t magic; it’s a tool, so there is an investment in learning how to use it. For us, automatic updates (one less thing we have to think about) and early warning about breaking changes have already paid for that investment.


2 Responses to How NuGet is Helping Us Realize a New Level of Continuous Integration

  1. Andy Alm says:

    Great writeup. I am always very interested in how people are managing their dependencies and doing CI. At my company, we have a somewhat similar setup to yours. I have to ask, though: how much noise is being added to your source control history by running ‘nuget.exe update’ all of the time? Since the version of the package is in both the packages.config file and the hint path of the assembly reference, it feels like half of our commits are just changing those values, which I find highly annoying. Do you notice that happening in your context?

  2. Ian Buchanan says:

    @Andy, although we check what the latest libraries do to our build, we don’t automatically commit package updates, so we don’t see that kind of noise. Our thinking is that package versions should still be updated manually. For example, if an update to NUnit 2.6.3 broke a build because we depended on a fixed defect in some way, then we need to fix our code, update the dependency, and bump our own semantic version. In contrast, if 2.6.3 does not break a build, then we can leave the dependency on 2.6.2 since the later version is indistinguishable for our purposes. In short, it’s just early warning for us, not blind updates. I’m really glad you pointed out your “side effect” because I was considering the more aggressive blind updates.
