How to set up Visual Studio 2013 projects for building nuget packages targeting multiple .Net versions? - visual-studio

We have several different teams building C# applications with Visual Studio. When we want to share libraries across teams, we create nuget packages for the libraries and add them to a local nuget feed.
The process we use to package our libraries is very simple: we create a .nuspec for the library, and then run nuget on the project .csproj to create the package.
This results in a package that is specific to the .Net version (4.0, 4.5, 4.5.1) selected in the project's .csproj. We've pretty much standardized on 4.5 to deal with this.
Many publicly available nuget packages provide simultaneous support for different framework versions, and we'd like our packages to do the same to make it easy for each of our teams to select the .Net versions appropriate for them. I know in principle how to build a package this way-- but it involves moving files around to different folders and invoking the nuget packager at a lower level. I don't know of a way to automate this in a way that could be picked up easily across our teams.
So my question is: is there an easy/standard way of setting up a library project in Visual Studio so it produces a cross-version-compatible Nuget package?
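For reference, a cross-version package is essentially a .nuspec whose lib folder contains one subfolder per target framework. A minimal sketch (the id, paths, and file names here are hypothetical, not from the question):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyLib</id>
    <version>1.0.0</version>
    <authors>OurTeam</authors>
    <description>Shared library packaged for several .Net versions.</description>
  </metadata>
  <files>
    <!-- one lib/<tfm> folder per supported framework; NuGet picks the
         best match for the consuming project's target framework -->
    <file src="bin\net40\MyLib.dll" target="lib\net40" />
    <file src="bin\net45\MyLib.dll" target="lib\net45" />
    <file src="bin\net451\MyLib.dll" target="lib\net451" />
  </files>
</package>
```

Running nuget pack on a .nuspec like this produces a single package serving all three framework versions.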

Not a direct answer to your question, I know this, but...
Why target multiple versions of the framework at all?
When NuGet installs a package that has multiple assembly versions, it tries to match the framework name of the assembly with the target framework of the project. If a match is not found, NuGet copies the assembly that's for the highest version that is less than or equal to the project's target framework.
Matching assembly version to the target framework of a project
What we've done is figure out what our lowest common denominator is, both for our libraries and our organization. In our case, we have some older apps still running on the 3.5 framework, so any NuGet packages we have that need to be available to any/all projects also target the 3.5 framework. However, we also have a few libraries that are only needed by some newer apps, these target 4.5 (both the projects and the libraries). This lets us leverage newer features.
If we find ourselves in a situation where an older app needs to reference a package that must reference the newest version to work, we bite the bullet and upgrade the project. However, for our libraries/packages, we always target the oldest version possible. Basically, the split is where we want/need to leverage Async/Await.
TL;DR: Don't target multiple frameworks. Target the lowest common denominator. It provides motivation to upgrade those apps lagging behind on 3.5 or 4. (Or, God forbid, 2.0...)

If you really need to support different framework versions, you will first have to create a configuration for each framework version (in your .csproj files) using the Configuration Manager. Similar to the standard "Debug" and "Release" configurations (or "x86_Debug", etc.), you can add your own configurations, such as Release_Fw_40, Release_Fw_45, and Release_Fw_451.
When packaging with Nuget, you can use the parameter
-Prop Configuration=Release_Fw_40
to choose which configuration you want to build, as described in the NuGet docs. There are also some hints there on how to automate the package builds, including support for different configurations.
Note that this will impose some additional effort for your library maintainers to manage those many configurations. It should be obvious that even if you provide a "Fw 4.5.1" version of your lib, you can only use Fw 4.0 features in the source code as long as you want to support a Fw 4.0 configuration. So make sure what you are trying is really worth the hassle.
I know in principle how to build a package this way-- but it involves moving files around to different folders and invoking the nuget packager at a lower level. I don't know of a way to automate this in a way that could be picked up easily across our teams.
I am not sure what you meant by this sentence, maybe I am telling you only things you already knew. But "moving files around to different folders" and "invoking the nuget packager at a lower-level" are things which can be very easily scripted. For such tasks, you can use simple Windows shell scripting or Powershell scripting, whatever you prefer.
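As a sketch of such a script (POSIX shell here for brevity; the MyLib name, the Release_Fw_* configuration names, and the paths are all placeholders, and the final nuget invocation is echoed rather than run so the staging logic can be shown on its own):

```shell
#!/bin/sh
# Stage per-framework build outputs into the lib/<tfm> layout NuGet expects,
# then pack once. Dummy build outputs are created here only so the staging
# logic is demonstrable; in practice they come from your Release_Fw_* builds.
set -e

for cfg in Release_Fw_40 Release_Fw_45 Release_Fw_451; do
    mkdir -p "bin/$cfg" && : > "bin/$cfg/MyLib.dll"
done

rm -rf package/lib
for pair in net40=Release_Fw_40 net45=Release_Fw_45 net451=Release_Fw_451; do
    tfm=${pair%%=*}          # framework folder name inside the package
    cfg=${pair#*=}           # build configuration that produced the dll
    mkdir -p "package/lib/$tfm"
    cp "bin/$cfg/MyLib.dll" "package/lib/$tfm/"
done

# With nuget.exe on PATH, run this for real instead of echoing it:
echo nuget pack MyLib.nuspec -BasePath package -OutputDirectory dist
```

A script like this can live next to the .nuspec in source control, so every team packages the library the same way.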

Related

How do you manage dependencies with inversion of control?

I am a small software developer who builds a plugin-based app. Each plugin is its own assembly and references the main assembly like so:
Plugin 1 references nuget packages and Application Core.dll
Plugin 2 references nuget packages and Application Core.dll
At run time, the main assembly loads all the assemblies in its current folder and enumerates them for plugins. Right now, all my projects build to the same folder. I like doing that because I can just press F5 to run my solution in Visual Studio. This is also nice because I can just disable a project in the solution if I don't want to run that plugin.
Some plugins now require a specific version of a nuget package. For example:
Plugin 1 requires System.Net.Http 4.0.0 or above
Plugin 2 requires System.Net.Http 4.3.4 or above
When I build my solution, sometimes Plugin 2 builds first, which means Plugin 1 then overwrites System.Net.Http 4.3.4 with version 4.0.0.
What is the best way for me to avoid overwriting DLLs with older versions but still have the ease-of-development that I'm used to?
One solution, which you may not like, is to stop using dynamic loading. If your app uses Plugin 1 and Plugin 2, make them NuGet or project references and let the build system figure out minimum versions of all common packages for you. You will also no longer need to change all the build output paths to get an easy development experience. Just use feature-flags/configuration to allow customers to choose the features they want, rather than adding/deleting dlls.
Another option is to keep each plugin in its own folder with all of its own dependencies, and have the main program load plugins into different AppDomains. This only works on .NET Framework, not .NET Core, but it has the advantage that a plugin that crashes has less chance of bringing down the whole app; the program may be able to detect a crashed plugin/AppDomain and unload it. But AppDomains are almost like using separate processes, and you get inter-domain communication issues (everything must be serializable).

What is the impact of not having synchronized Nuget packages between Xamarin Forms PCL + iOS + Android

Xamarin.Forms allows for a shared PCL library that can also have NuGet packages applied to it. What is the impact of having non-synchronized packages across iOS, the PCL, and Android?
At build time which package takes priority?
How can I update a single Nuget package across all projects, without having to click them all. (Update all packages is a no-go for me, I require some older libraries)
The majority of the information you're seeking is in the NuGet/Microsoft docs:
https://learn.microsoft.com/en-us/nuget/consume-packages/dependency-resolution
As for non-synchronized packages, they can be quite troublesome, as mixing versions is usually a bad idea: for example, a PCL API may end up invoking the wrong project-specific API. In my experience, do not mix and match versions of anything. This includes Xamarin.Forms, Google Play Services, and the Support Libraries. Make sure they are all consistent across the board or else you'll have a bad time.
Updating the same NuGet package across all projects can be as simple as using the NuGet Package Manager at the .sln level (right-click a solution that includes all the projects and choose Manage NuGet Packages). Be aware that dependencies need updating as well, so ensure you are updating the parent package so that it can handle the dependency resolution.

Is there a way to support snapshots with NuGet and native libraries?

We use Visual Studio to write and maintain native Windows apps. We are looking into using NuGet to handle our dependencies, which consist of native static libs.
After some research, I've managed to use NuGet, packages.config and the CoApp PowerShell scripts to create and consume NuGet packages with native libs in them. The issue we're facing right now is that we need to have snapshot support.
The prerelease rollover mechanism (using * in the version) that NuGet 3 and onwards supports looks great; however, it seems to only work with project.json and not with packages.config. project.json, however, doesn't seem to work with native packages, as they don't get installed into the local solution folder, so the build can't find the headers and libs.
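The * rollover referred to above looks roughly like this in a project.json dependencies node (the package id is hypothetical); NuGet resolves it to the latest available prerelease matching the prefix:

```json
{
  "dependencies": {
    "Our.Native.Lib": "1.2.0-*"
  }
}
```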
The question boils down to:
Is there a way to use project.json and NuGet 3 with native static libs?
If not, then, what alternatives are out there to support this use case? (The use case being build-time dependency distribution for native, unmanaged Windows static libraries).
EDIT:
At the end, we decided to use Maven for dependency management since NuGet doesn't seem to support our use case. I filed an issue about two weeks ago but it hasn't received any response. However, if we had decided to force NuGet into our use case, the solution proposed by Wendy would probably be the way to go, so I'm accepting it.
There are two ways to add content files into a project that uses a project.json file: the "contentFiles" node and the "files" node in the .nuspec file. For detailed steps, please refer to:
http://blog.nuget.org/20160126/nuget-contentFiles-demystified.html
But please note that these approaches only support UWP and Portable Class Libraries:
This feature is only for packages that will be installed into projects that are managed using a project.json file. Currently only two project types are managed by project.json:
1. UWP apps
2. Portable Class Libraries
The contentFiles option is not available for other project types.
If you are using a .NET Core application or another project type that uses project.json, adding content files from a NuGet package into the project is not currently supported.
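For completeness, a minimal sketch of the contentFiles approach in a .nuspec, for a project type where it is supported (the id and file names are hypothetical):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>Our.ContentPackage</id>
    <version>1.0.0</version>
    <authors>OurTeam</authors>
    <description>Demonstrates the contentFiles node.</description>
    <contentFiles>
      <!-- how the files behave in consuming project.json-managed projects -->
      <files include="any/any/settings.default.json"
             buildAction="Content" copyToOutput="true" />
    </contentFiles>
  </metadata>
  <files>
    <!-- the physical files packed under contentFiles/ in the package -->
    <file src="contentFiles\any\any\settings.default.json"
          target="contentFiles\any\any" />
  </files>
</package>
```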

Maintaining VC++ projects across several Visual Studio Versions?

We have a Windows Desktop only product suite that consists of several .exe applications and (obviously) quite a bunch of shared libraries between these apps. There's about 20 exe apps and maybe also about 20 shared libraries that are each used by several of these apps (some libs are very specific, some are just your good(?) old FooUtils.dll) Most code is C++, some C#.
Currently, all these reside in a single Visual Studio 2005 solution and are all built and released together. That works quite fine and each developer always can edit / see / debug any code he needs to. About 15 devs (mixed C++ / C#) on that product suite.
Now the problem
Due to migration pressure (language features, new third-party components) it is becoming increasingly urgent to migrate some of the apps (read: .exe projects) to newer Visual Studio versions: some to VS2010 maybe (third-party dependencies), some to VS2015. And some we simply don't have the resources to migrate from VS2005 yet (third-party constraints, as well as time/budget constraints).
This means that several of the shared C++ libraries will have to exist for several visual studio versions. (As opposed to the exe projects - these would just be built/maintained for one chosen VS version.)
Now the question
Given a set of (internal) shared libraries that need to be built for multiple different Visual C++ versions, and that should be easily editable and maintainable for all devs, how do we keep the Utils.vcproj (2005), the Utils.vcxproj (2010) and the Utils.vcxproj (2015) in sync?
Mostly to avoid manually having to maintain all files contained in the projects, but also regarding project settings / .[vs]props settings for these projects.
Ideas we had so far:
Just accept annoying triple maintenance of 3 project files in 3 different solutions (ugh.)
Use one of the vc.. project files as the master project and automatically (how??!) generate the other vc..files from it.
Use tools like CMake, Premake, ... ?? to generate these shared-shared library projects. (That would mean introducing a foreign configuration tool for our devs)
If you want to avoid manually updating your project files in separate versions of Visual Studio, you'll have to have a master configuration for the project of one sort or another. The two options you listed are basically your options:
Use one project version as the 'master'. In this case, the master must be the oldest version (VS2005 in your case?). Visual Studio has a project upgrade feature, to convert older projects to newer versions. When you load up an older project in a newer version, it prompts you to upgrade. This process can be automated with some simple scripting. Using devenv.exe <project/solution file> /upgrade you can upgrade a project from the command line.
Use CMake/Premake/etc. These add a little bit of overhead, but they make supporting new platforms and configurations a lot less painful. If adding new dev tools is cumbersome for your process, and you're only supporting Visual Studio, the first option might be more suitable.
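The upgrade step of the first option can be sketched as a script (shell here for illustration; the devenv path and project names are placeholders, and the commands are collected and printed rather than executed, since they only work on a machine with the matching Visual Studio installed):

```shell
#!/bin/sh
# For each shared library: copy the VS2005 master project, then the printed
# devenv /upgrade command would convert the copy in place for the newer VS.
set -e
DEVENV_2015='C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\devenv.exe'

upgrades=""
for proj in Utils FooUtils NetworkLib; do
    # in a real run: copy "$proj/$proj.vcproj" to "$proj/$proj.vs2015.vcproj"
    upgrades="$upgrades\"$DEVENV_2015\" $proj/$proj.vs2015.vcproj /upgrade
"
done
printf '%s' "$upgrades"
```

The same loop repeated with the VS2010 devenv path would maintain the 2010 copies; scheduling this on the build server keeps the generated projects from drifting out of sync with the master.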

Why use NuGet over installing libraries directly on my machine

I don't get it - can someone please explain to me why I should use NuGet rather than installing a bunch of libraries via a setup.exe or MSI? What advantage is there?
For example is it better to install Entity Framework 4.3 via NuGet rather than downloading the setup? Also, if I install entity framework via NuGet then is it available to any new solutions or projects that I create (bit confused here).
Basically, what does NuGet do that a normal install doesn't do (or vice versa)?
Besides making it simple to add a package to your project, I think NuGet's biggest advantage is dependency management.
NuGet allows project owners to package their libraries as packages. Before, if they depended on other libraries like log4net, they would include those assemblies in their setup/zip file and upload to their web site.
With NuGet, they simply add a reference to these external packages in the .nuspec file. When NuGet installs the package, it will see that there are dependencies and will automatically download and install those packages as well. It also supports conflict management, so that if two packages depend on different versions, it will figure out the correct one to install.
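As an illustration, such dependency declarations live in the .nuspec roughly like this (the ids and version numbers are examples, not a real package):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>Our.Library</id>
    <version>2.1.0</version>
    <authors>OurTeam</authors>
    <description>Library that depends on external packages.</description>
    <dependencies>
      <!-- installed automatically alongside this package -->
      <dependency id="log4net" version="2.0.3" />
      <dependency id="Newtonsoft.Json" version="6.0.1" />
    </dependencies>
  </metadata>
</package>
```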
I think the best way to determine if NuGet will work for you is to actually try using it. I'm sure that once you do, you'll realize that it has many benefits.
Nuget provides several additional benefits:
it automatically configures your projects by adding references to the necessary assemblies, creating and adding project files (e.g. configuration), etc.
it provides package updates
it does all of this very conveniently
What advantage is there?
NuGet simplifies incorporating third-party libraries: with a single command (Install-Package EntityFramework) you make the package available to your project, instead of googling for the package, downloading it, running a setup, and referencing the package in your project...
Auto-update is not mandatory; the NuGet configuration file lets you specify the version, or the range of versions, that your application is compatible with.
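The range constraint mentioned here uses NuGet's interval notation. For example, in packages.config the allowedVersions attribute pins updates to a range (the values are hypothetical):

```xml
<!-- packages.config: keep EntityFramework within 4.x when updating -->
<packages>
  <package id="EntityFramework" version="4.3.1"
           targetFramework="net45" allowedVersions="[4.0,5.0)" />
</packages>
```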
Also, if I install entity framework via Nuget then is it available to any new solutions or projects that I create
Once you have installed a package, the DLLs are copied into a directory at the solution level, and you can then reference them from there in the other projects of your solution.
For each new solution, re-installing the packages is the better approach. As this is very easy with NuGet, it won't be a problem.
NuGet contributes to creating DLL hell and makes the solution go out of control very quickly, especially when different versions of the so-called "packages" come into play. Apart from assembly versioning, there are now NuGet package versions as well. NuGet is just adding another wrapper over DLLs and does nothing that would make developers' lives easier.