2

Had an interesting discussion with our architect about replacing a plain DLL reference with a NuGet package. His worry was: "If a single NuGet package can add multiple DLL references, then the package authors can decide to add a new DLL to the package. So when we update the NuGet package a new DLL is added, but our installer won't know about it, so it won't include it in the installation. And this problem will only be revealed when testers get to test the installed build of the software." In his view this would be a reason not to use NuGet, since plain binary references make it obvious when a new DLL is added.

My stance is that the chance of this happening is too small to bother with: NuGet package authors would consider this a breaking change and only make it in a major release. And the way to mitigate this risk is not to avoid NuGet, but to create test automation that exercises our installed software.

The question is: what is the risk of the above happening, i.e. of a NuGet package adding a new DLL reference in a non-major release?

Euphoric
  • 38,149

2 Answers

2

Whenever you include 3rd party components in your own software system, updates for those components can break your system, whether those updates include a new DLL or not.

This happens even if the authors of those components act in good faith and try to be backwards compatible within one version line. It is not a hypothetical situation - software has bugs, and sometimes those bugs only reveal themselves in your specific environment.

So I would strongly recommend against configuring your build process to always automatically download the "latest and greatest" version of 3rd party components from somewhere on the internet and let them add updates to your system in an uncontrolled manner. If you want to use NuGet in a safe way, you have to make sure it always downloads a specific version of the 3rd party component.
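For illustration, here is a minimal sketch of what pinning looks like with the PackageReference format in a .csproj file (the package name and version are placeholders, not a recommendation); an exact version is stated instead of a floating one, so a restore never silently resolves to a newer release:

    <ItemGroup>
      <!-- Exact version: restore always resolves to 13.0.3, never "whatever is newest". -->
      <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />

      <!-- By contrast, a floating version like this would silently pick up new releases: -->
      <!-- <PackageReference Include="Newtonsoft.Json" Version="13.*" /> -->
    </ItemGroup>

NuGet can additionally emit a packages.lock.json (by setting the RestorePackagesWithLockFile property to true) so that transitive versions are also fixed until someone deliberately updates them.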

If you want to update to a newer version, someone on your team

  • actively and explicitly sets which version this will be

  • initiates a suitable QA process (ideally with lots of test automation, of course, but also some reviews, and some information to everyone who might be affected by the update)

I am not an expert on NuGet, but AFAIK the safest way of doing this is to set up your own private NuGet server and provide exactly the versions of the 3rd party components you are going to use for your system, nothing else.
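As a sketch of that setup (the feed URL and key name below are placeholders, not a real server), a NuGet.config at the solution root can restrict restore to such an internal feed by clearing the default sources:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <!-- Remove the public nuget.org feed so only the curated internal feed is used. -->
        <clear />
        <add key="internal" value="https://nuget.example.internal/v3/index.json" />
      </packageSources>
    </configuration>

With this in place, only packages that someone has deliberately published to the internal feed can ever enter a build.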

To some degree you have to rely on 3rd party software and its updates, like the .NET Framework from Microsoft itself (especially since Microsoft started to deploy in-place updates to the 4.x version line directly on end users' systems). But I would not conclude that this is also a good approach for arbitrary 3rd party components.

Hence, whenever one is able to keep component updates under one's own control, I would recommend doing so.

Doc Brown
  • 218,378
0

What is the risk of the above happening?

The probability is certainly not zero.

However, the problem also goes somewhat deeper: any of the dependencies in the dependency hierarchy may add references/packages of its own. And even when this does happen, it may not require a major release if the added functionality is not a breaking change.
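If you are on a recent .NET SDK (this is an assumption; the command does not apply to old packages.config projects), you can at least make the whole dependency tree visible before and after an update:

    dotnet list package --include-transitive

Diffing that output between two versions of a package shows whether the update pulled in new packages that your installer would need to pick up.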

The best approach, as you have stated, is to mitigate the risk of missed assemblies by performing some sanity checks on the installed/deployed software.
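As a rough sketch of such a check (the paths and the list of expected assemblies are assumptions for illustration, not part of the answer), a post-install smoke test can simply compare the deployed files against the build output:

    // Minimal post-install smoke test: verify that every assembly produced by the
    // build also ended up in the installation directory. Paths are placeholders.
    using System;
    using System.IO;
    using System.Linq;

    public static class InstallSanityCheck
    {
        public static int Main()
        {
            var buildOutput = @"C:\build\MyProduct\bin\Release";   // assumed build output
            var installDir  = @"C:\Program Files\MyProduct";       // assumed install path

            var expected = Directory.GetFiles(buildOutput, "*.dll")
                                    .Select(Path.GetFileName);

            var missing = expected
                .Where(name => !File.Exists(Path.Combine(installDir, name)))
                .ToList();

            foreach (var name in missing)
                Console.Error.WriteLine($"Missing from installation: {name}");

            return missing.Count == 0 ? 0 : 1;  // non-zero exit fails the pipeline
        }
    }

Running something like this as part of the installer test pass surfaces a newly added DLL immediately, rather than when testers get to the installed build.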

Eben Roux
  • 226